Commit 3eed749

Authored by samuel100, Copilot, and baijumeswani
Samuel100/update readmes (#573)
Update README to better align with the product offering. Updated sample READMEs.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: Baiju Meswani <baijumeswani@gmail.com>
1 parent 23477c8 commit 3eed749

File tree

55 files changed: +394, -1380 lines

README.md

Lines changed: 93 additions & 222 deletions
Large diffs are not rendered by default.

samples/README.md

Lines changed: 14 additions & 0 deletions
````diff
@@ -0,0 +1,14 @@
+# Foundry Local Samples
+
+Explore complete working examples that demonstrate how to use Foundry Local — an end-to-end local AI solution that runs entirely on-device. These samples cover chat completions, audio transcription, tool calling, LangChain integration, and more.
+
+> **New to Foundry Local?** Check out the [main README](../README.md) for an overview and quickstart, or visit the [Foundry Local documentation](https://learn.microsoft.com/azure/foundry-local/) on Microsoft Learn.
+
+## Samples by Language
+
+| Language | Samples | Description |
+|----------|---------|-------------|
+| [**C#**](cs/) | 12 | .NET SDK samples including native chat, audio transcription, tool calling, model management, web server, and tutorials. Uses WinML on Windows for hardware acceleration. |
+| [**JavaScript**](js/) | 12 | Node.js SDK samples including native chat, audio transcription, Electron desktop app, Copilot SDK integration, LangChain, tool calling, web server, and tutorials. |
+| [**Python**](python/) | 9 | Python samples using the OpenAI-compatible API, including chat, audio transcription, LangChain integration, tool calling, web server, and tutorials. |
+| [**Rust**](rust/) | 8 | Rust SDK samples including native chat, audio transcription, tool calling, web server, and tutorials. |
````

samples/cs/Directory.Packages.props

Lines changed: 4 additions & 5 deletions
````diff
@@ -1,13 +1,12 @@
 <Project>
   <PropertyGroup>
     <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
-    <OnnxRuntimeGenAIVersion>0.13.0-dev-20260319-1131106-439ca0d51</OnnxRuntimeGenAIVersion>
-    <OnnxRuntimeVersion>1.23.2</OnnxRuntimeVersion>
+    <CentralPackageFloatingVersionsEnabled>true</CentralPackageFloatingVersionsEnabled>
   </PropertyGroup>
   <ItemGroup>
-    <PackageVersion Include="Microsoft.AI.Foundry.Local" Version="0.9.0-dev" />
-    <PackageVersion Include="Microsoft.AI.Foundry.Local.WinML" Version="0.9.0-dev-20260324" />
-    <PackageVersion Include="Betalgo.Ranul.OpenAI" Version="9.1.1" />
+    <PackageVersion Include="Microsoft.AI.Foundry.Local" Version="*-*" />
+    <PackageVersion Include="Microsoft.AI.Foundry.Local.WinML" Version="*-*" />
+    <PackageVersion Include="Betalgo.Ranul.OpenAI" Version="9.2.0" />
     <PackageVersion Include="Microsoft.Extensions.Logging" Version="9.0.10" />
     <PackageVersion Include="Microsoft.Extensions.Logging.Console" Version="9.0.10" />
     <PackageVersion Include="NAudio" Version="2.2.1" />
````
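For context on this change: with `CentralPackageFloatingVersionsEnabled`, the NuGet floating pattern `*-*` resolves at restore time to the highest available version, prereleases included, so the samples always track the latest SDK builds. If reproducible restores are ever needed, the entries can be pinned again; a hypothetical sketch (the version numbers below are placeholders, not real releases):

```xml
<!-- Hypothetical re-pin for reproducible restores; the version numbers
     are placeholders, not current Foundry Local releases. -->
<ItemGroup>
  <PackageVersion Include="Microsoft.AI.Foundry.Local" Version="1.0.0" />
  <PackageVersion Include="Microsoft.AI.Foundry.Local.WinML" Version="1.0.0" />
</ItemGroup>
```

Alternatively, enabling NuGet's lock file (`RestorePackagesWithLockFile`) keeps the floating versions while recording exactly what each restore resolved.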

samples/cs/README.md

Lines changed: 1 addition & 5 deletions
````diff
@@ -22,6 +22,7 @@ Both packages provide the same APIs, so the same source code works on all platforms.
 | [tutorial-tool-calling](tutorial-tool-calling/) | Create a tool-calling assistant (tutorial). |
 | [tutorial-voice-to-text](tutorial-voice-to-text/) | Transcribe and summarize audio (tutorial). |
 
+
 ## Running a sample
 
 1. Clone the repository:
@@ -36,8 +37,3 @@ Both packages provide the same APIs, so the same source code works on all platforms.
    dotnet run
    ```
 
-The unified project file automatically selects the correct SDK package for your platform.
-
-> [!TIP]
-> On Windows, we recommend using the WinML package (selected automatically) for optimal performance. Your users benefit from a wider range of hardware acceleration options and a smaller application package size.
-
````

samples/cs/live-audio-transcription-example/LiveAudioTranscriptionExample.csproj

Lines changed: 0 additions & 55 deletions
This file was deleted.

samples/cs/live-audio-transcription-example/LiveAudioTranscriptionExample.sln

Lines changed: 0 additions & 34 deletions
This file was deleted.

samples/cs/live-audio-transcription-example/Program.cs

Lines changed: 0 additions & 106 deletions
This file was deleted.

samples/cs/nuget.config

Lines changed: 0 additions & 5 deletions
````diff
@@ -4,7 +4,6 @@
     <clear />
     <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
     <add key="ORT-Nightly" value="https://pkgs.dev.azure.com/aiinfra/PublicPackages/_packaging/ORT-Nightly/nuget/v3/index.json" />
-    <add key="local-sdk" value="../../local-packages" />
   </packageSources>
   <packageSourceMapping>
     <packageSource key="nuget.org">
@@ -14,9 +13,5 @@
       <package pattern="Microsoft.AI.Foundry.Local*" />
       <package pattern="Microsoft.ML.OnnxRuntime*" />
     </packageSource>
-    <packageSource key="local-sdk">
-      <package pattern="Microsoft.AI.Foundry.Local" />
-      <package pattern="Microsoft.AI.Foundry.Local.WinML" />
-    </packageSource>
   </packageSourceMapping>
 </configuration>
````

samples/js/README.md

Lines changed: 49 additions & 0 deletions
````diff
@@ -0,0 +1,49 @@
+# 🚀 Foundry Local JavaScript Samples
+
+These samples demonstrate how to use the Foundry Local JavaScript SDK (`foundry-local-sdk`) with Node.js.
+
+## Prerequisites
+
+- [Node.js](https://nodejs.org/) (v18 or later recommended)
+
+## Samples
+
+| Sample | Description |
+|--------|-------------|
+| [native-chat-completions](native-chat-completions/) | Initialize the SDK, download a model, and run non-streaming and streaming chat completions. |
+| [audio-transcription-example](audio-transcription-example/) | Transcribe audio files using the Whisper model with streaming output. |
+| [chat-and-audio-foundry-local](chat-and-audio-foundry-local/) | Unified sample demonstrating both chat and audio transcription in one application. |
+| [electron-chat-application](electron-chat-application/) | Full-featured Electron desktop chat app with voice transcription and model management. |
+| [copilot-sdk-foundry-local](copilot-sdk-foundry-local/) | GitHub Copilot SDK integration with Foundry Local for agentic AI workflows. |
+| [langchain-integration-example](langchain-integration-example/) | LangChain.js integration for building text generation chains. |
+| [tool-calling-foundry-local](tool-calling-foundry-local/) | Tool calling with custom function definitions and streaming responses. |
+| [web-server-example](web-server-example/) | Start a local OpenAI-compatible web server and call it with the OpenAI SDK. |
+| [tutorial-chat-assistant](tutorial-chat-assistant/) | Build an interactive multi-turn chat assistant (tutorial). |
+| [tutorial-document-summarizer](tutorial-document-summarizer/) | Summarize documents with AI (tutorial). |
+| [tutorial-tool-calling](tutorial-tool-calling/) | Create a tool-calling assistant (tutorial). |
+| [tutorial-voice-to-text](tutorial-voice-to-text/) | Transcribe and summarize audio (tutorial). |
+
+## Running a Sample
+
+1. Clone the repository:
+
+   ```bash
+   git clone https://github.com/microsoft/Foundry-Local.git
+   cd Foundry-Local/samples/js
+   ```
+
+1. Navigate to a sample and install dependencies:
+
+   ```bash
+   cd native-chat-completions
+   npm install
+   ```
+
+1. Run the sample:
+
+   ```bash
+   npm start
+   ```
+
+> [!TIP]
+> Each sample's `package.json` includes `foundry-local-sdk` as a dependency and `foundry-local-sdk-winml` as an optional dependency. On **Windows**, the WinML variant installs automatically for broader hardware acceleration. On **macOS and Linux**, the standard SDK is used. Just run `npm install` — platform detection is handled for you.
````
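The platform split described in that tip can be pictured with a small, hypothetical helper. The function name below is invented for illustration; real samples need no runtime branching because npm's `optionalDependencies` mechanism installs the right package at `npm install` time:

```javascript
// Hypothetical illustration of the WinML-vs-standard SDK selection.
// The function name is invented; actual samples rely on npm's
// optionalDependencies to install the correct package per platform.
function foundrySdkPackageName(platform = process.platform) {
  // The WinML variant is Windows-only; every other platform
  // gets the cross-platform SDK.
  return platform === "win32" ? "foundry-local-sdk-winml" : "foundry-local-sdk";
}

console.log(foundrySdkPackageName("win32"));  // foundry-local-sdk-winml
console.log(foundrySdkPackageName("darwin")); // foundry-local-sdk
```

The same decision happens declaratively in `package.json`: the optional WinML dependency simply fails to install on non-Windows platforms, and the fallback dependency is always present.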

samples/js/audio-transcription-example/README.md

Lines changed: 0 additions & 38 deletions
This file was deleted.

0 commit comments