Commit 442c3cd

samuel100 and Copilot authored
rust sdk v2 init (#500)
# Add Rust SDK v2

Introduces the Foundry Local Rust SDK (`sdk_v2/rust`), updated Rust samples (`samples/rust`), and CI pipeline support.

## SDK (`sdk_v2/rust`)

A safe, async Rust wrapper over the Foundry Local Core native library, providing:

- **`FoundryLocalManager`** — singleton entry point that initializes the native engine, exposes the model catalog, and manages the local web service lifecycle.
- **`Catalog`** — async model discovery with a 6-hour TTL cache. Lookup by alias or variant ID, with helpful error messages listing available alternatives on a miss.
- **`Model` / `ModelVariant`** — full model lifecycle: download (with streaming progress), load, unload, cache inspection, and removal.
- **`ChatClient`** — OpenAI-compatible chat completions (non-streaming and streaming via `futures_core::Stream`). Supports temperature, top-p/top-k, response format (text, JSON, JSON schema, Lark grammar), and tool calling.
- **`AudioClient`** — OpenAI-compatible audio transcription (non-streaming and streaming). Accepts `impl AsRef<Path>` for file paths.
- **Error handling** — `thiserror`-based `FoundryLocalError` enum with discriminated variants (`LibraryLoad`, `CommandExecution`, `ModelOperation`, `Validation`, etc.) and a crate-wide `Result<T>` alias.
- **FFI bridge** (`detail/core_interop.rs`) — dynamically loads the native `.dll`/`.so`/`.dylib` via `libloading`, with platform-aware memory deallocation and a trampoline pattern for streaming callbacks. Async methods use `tokio::task::spawn_blocking` to keep the runtime unblocked.
- **Build script** (`build.rs`) — automatically downloads NuGet packages (Core, ORT, GenAI) at compile time, extracts platform-specific native libraries, and caches them across builds. Supports `winml` and `nightly` feature flags.

Re-exports `async-openai` request/response types for convenience, so callers don't need a direct dependency.
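The trampoline pattern mentioned for the FFI bridge can be sketched in plain, self-contained Rust. All names below are illustrative, not the SDK's actual internals: a C-style library only accepts a bare function pointer plus an opaque `user_data` pointer, so a monomorphized `extern "C"` shim recovers the Rust closure from that pointer and invokes it.

```rust
use std::ffi::c_void;

// Hypothetical native-side callback type: a plain C function pointer plus an
// opaque user-data pointer, as a C library would expect.
type NativeProgressCallback = extern "C" fn(progress: u32, user_data: *mut c_void);

// The trampoline: recovers the Rust closure from the opaque pointer and calls it.
extern "C" fn trampoline<F: FnMut(u32)>(progress: u32, user_data: *mut c_void) {
    let closure = unsafe { &mut *user_data.cast::<F>() };
    closure(progress);
}

// Stand-in for the real FFI entry point: invokes the callback a few times the
// way a native download routine would report progress.
fn fake_native_download(cb: NativeProgressCallback, user_data: *mut c_void) {
    for p in [25, 50, 100] {
        cb(p, user_data);
    }
}

// Safe wrapper: pairs the closure with its matching trampoline instantiation.
fn download_with_progress<F: FnMut(u32)>(mut on_progress: F) {
    fake_native_download(trampoline::<F>, (&mut on_progress as *mut F).cast());
}

fn main() {
    let mut seen = Vec::new();
    download_with_progress(|p| seen.push(p));
    println!("progress values: {seen:?}");
}
```

The key detail is that `trampoline::<F>` is instantiated per closure type, so the function pointer handed across the FFI boundary always matches the concrete closure behind `user_data`.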
## Samples (`samples/rust`)

Replaces the old `hello-foundry-local` sample with four focused examples:

- **`native-chat-completions`** — basic sync + streaming chat completion
- **`tool-calling-foundry-local`** — multi-turn tool calling with streaming chunk assembly
- **`audio-transcription-example`** — audio file transcription (sync and streaming)
- **`foundry-local-webserver`** — starts the local web service and makes requests via the OpenAI-compatible HTTP endpoint

Each sample has its own `Cargo.toml` and `README.md`, and is included in the workspace.

## CI (`.github/workflows`)

- **`build-rust-steps.yml`** — reusable workflow for Rust SDK build + test steps.
- **`foundry-local-sdk-build.yml`** — workflow that triggers on `sdk_v2/rust/**` and `samples/rust/**` changes.

## Testing

- Integration tests mirror the JavaScript SDK test suite structure: `manager_test`, `catalog_test`, `model_test`, `model_load_manager_test`, `chat_client_test`, `audio_client_test`.
- All tests are `#[ignore]` by default (they require the native library at runtime) and can be enabled in CI with `--include-ignored`.
- Shared test utilities (`tests/common/mod.rs`) provide config, model aliases, and helper functions.
- `cargo clippy --all-targets --all-features` passes with zero code warnings.

---------

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
1 parent a2f768b commit 442c3cd

53 files changed

Lines changed: 6166 additions & 166 deletions

.github/workflows/build-rust-steps.yml

Lines changed: 112 additions & 0 deletions
```yaml
name: Build Rust SDK

on:
  workflow_call:
    inputs:
      platform:
        required: false
        type: string
        default: 'ubuntu' # or 'windows' or 'macos'
      useWinML:
        required: false
        type: boolean
        default: false
      run-integration-tests:
        required: false
        type: boolean
        default: true

permissions:
  contents: read

jobs:
  build:
    runs-on: ${{ inputs.platform }}-latest

    defaults:
      run:
        working-directory: sdk_v2/rust

    env:
      CARGO_FEATURES: ${{ inputs.useWinML && '--features winml' || '' }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          clean: true

      - name: Install Rust toolchain
        uses: dtolnay/rust-toolchain@stable
        with:
          components: clippy, rustfmt

      - name: Cache cargo dependencies
        uses: Swatinem/rust-cache@v2
        with:
          workspaces: sdk_v2/rust -> target

      - name: Checkout test-data-shared from Azure DevOps
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/..
        run: |
          $pat = "${{ secrets.AZURE_DEVOPS_PAT }}"
          $encodedPat = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))

          # Configure git to use the PAT
          git config --global http.https://dev.azure.com.extraheader "AUTHORIZATION: Basic $encodedPat"

          # Clone with LFS to parent directory
          git lfs install
          git clone --depth 1 https://dev.azure.com/microsoft/windows.ai.toolkit/_git/test-data-shared test-data-shared

          Write-Host "Clone completed successfully to ${{ github.workspace }}/../test-data-shared"

      - name: Checkout specific commit in test-data-shared
        if: ${{ inputs.run-integration-tests }}
        shell: pwsh
        working-directory: ${{ github.workspace }}/../test-data-shared
        run: |
          Write-Host "Current directory: $(Get-Location)"
          git checkout 231f820fe285145b7ea4a449b112c1228ce66a41
          if ($LASTEXITCODE -ne 0) {
            Write-Error "Git checkout failed."
            exit 1
          }
          Write-Host "`nDirectory contents:"
          Get-ChildItem -Recurse -Depth 2 | ForEach-Object { Write-Host " $($_.FullName)" }

      - name: Check formatting
        run: cargo fmt --all -- --check

      # Run Clippy - Rust's official linter for catching common mistakes, enforcing idioms, and improving code quality
      - name: Run clippy
        run: cargo clippy --all-targets ${{ env.CARGO_FEATURES }} -- -D warnings

      - name: Build
        run: cargo build ${{ env.CARGO_FEATURES }}

      - name: Run unit tests
        run: cargo test --lib ${{ env.CARGO_FEATURES }}

      - name: Run integration tests
        if: ${{ inputs.run-integration-tests }}
        run: cargo test --tests ${{ env.CARGO_FEATURES }} -- --include-ignored --test-threads=1 --nocapture

      # --allow-dirty allows publishing with uncommitted changes, needed because the build process modifies generated files
      - name: Package crate
        run: cargo package ${{ env.CARGO_FEATURES }} --allow-dirty

      - name: Upload SDK artifact
        uses: actions/upload-artifact@v4
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}
          path: sdk_v2/rust/target/package/*.crate

      - name: Upload flcore logs
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: rust-sdk-${{ inputs.platform }}${{ inputs.useWinML == true && '-winml' || '' }}-logs
          path: sdk_v2/rust/logs/**
```

.github/workflows/foundry-local-sdk-build.yml

Lines changed: 20 additions & 1 deletion
```diff
@@ -29,6 +29,12 @@ jobs:
       version: '0.9.0.${{ github.run_number }}'
       platform: 'windows'
     secrets: inherit
+  build-rust-windows:
+    uses: ./.github/workflows/build-rust-steps.yml
+    with:
+      platform: 'windows'
+      run-integration-tests: true
+    secrets: inherit

   build-cs-windows-WinML:
     uses: ./.github/workflows/build-cs-steps.yml
@@ -44,7 +50,14 @@ jobs:
       platform: 'windows'
       useWinML: true
     secrets: inherit
-
+  build-rust-windows-WinML:
+    uses: ./.github/workflows/build-rust-steps.yml
+    with:
+      platform: 'windows'
+      useWinML: true
+      run-integration-tests: true
+    secrets: inherit
+
   build-cs-macos:
     uses: ./.github/workflows/build-cs-steps.yml
     with:
@@ -56,4 +69,10 @@ jobs:
     with:
       version: '0.9.0.${{ github.run_number }}'
       platform: 'macos'
+    secrets: inherit
+  build-rust-macos:
+    uses: ./.github/workflows/build-rust-steps.yml
+    with:
+      platform: 'macos'
+      run-integration-tests: true
     secrets: inherit
```

.github/workflows/rustfmt.yml

Lines changed: 0 additions & 29 deletions
This file was deleted.

samples/rust/Cargo.toml

Lines changed: 4 additions & 1 deletion
```diff
@@ -1,5 +1,8 @@
 [workspace]
 members = [
-    "hello-foundry-local"
+    "foundry-local-webserver",
+    "tool-calling-foundry-local",
+    "native-chat-completions",
+    "audio-transcription-example",
 ]
 resolver = "2"
```

samples/rust/README.md

Lines changed: 14 additions & 7 deletions
```diff
@@ -5,14 +5,21 @@ This directory contains samples demonstrating how to use the Foundry Local Rust
 ## Prerequisites

 - Rust 1.70.0 or later
-- Foundry Local installed and available on PATH

 ## Samples

-### [Hello Foundry Local](./hello-foundry-local)
+### [Foundry Local Web Server](./foundry-local-webserver)

-A simple example that demonstrates how to:
-- Start the Foundry Local service
-- Download and load a model
-- Send a prompt to the model using the OpenAI-compatible API
-- Display the response from the model
+Demonstrates how to start a local OpenAI-compatible web server using the SDK, then call it with a standard HTTP client.
+
+### [Native Chat Completions](./native-chat-completions)
+
+Shows both non-streaming and streaming chat completions using the SDK's native chat client.
+
+### [Tool Calling with Foundry Local](./tool-calling-foundry-local)
+
+Demonstrates tool calling with streaming responses, multi-turn conversation, and local tool execution.
+
+### [Audio Transcription](./audio-transcription-example)
+
+Demonstrates audio transcription (non-streaming and streaming) using the `whisper` model.
```
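The "streaming chunk assembly" that the tool-calling sample performs amounts to folding partial tool-call deltas into a complete call: the model streams the tool name once and the JSON arguments split across many fragments. A minimal self-contained sketch (the types here are illustrative, not the SDK's):

```rust
// Illustrative delta shape: streamed chat chunks carry the tool name once and
// the JSON arguments split across many fragments.
#[derive(Default)]
struct ToolCallBuilder {
    name: String,
    arguments: String,
}

impl ToolCallBuilder {
    // Fold one streamed delta into the accumulated call.
    fn apply(&mut self, name: Option<&str>, args_fragment: Option<&str>) {
        if let Some(n) = name {
            self.name.push_str(n);
        }
        if let Some(a) = args_fragment {
            self.arguments.push_str(a);
        }
    }
}

fn main() {
    let deltas = [
        (Some("get_weather"), None),
        (None, Some("{\"city\":")),
        (None, Some("\"Paris\"}")),
    ];

    let mut call = ToolCallBuilder::default();
    for (name, args) in deltas {
        call.apply(name, args);
    }

    // Once the stream ends, the assembled call can be dispatched locally.
    println!("{}({})", call.name, call.arguments);
}
```

Only after the final chunk arrives is `arguments` valid JSON, which is why the sample assembles the full call before executing the tool and feeding the result back into the conversation.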
samples/rust/audio-transcription-example/Cargo.toml

Lines changed: 10 additions & 0 deletions
```toml
[package]
name = "audio-transcription-example"
version = "0.1.0"
edition = "2021"
description = "Audio transcription example using the Foundry Local Rust SDK"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
tokio-stream = "0.1"
```
samples/rust/audio-transcription-example/README.md

Lines changed: 25 additions & 0 deletions
# Sample: Audio Transcription

This example demonstrates audio transcription (non-streaming and streaming) using the Foundry Local Rust SDK. It uses the `whisper` model to transcribe a WAV audio file.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application with a path to a WAV file:

```bash
cargo run -- path/to/audio.wav
```

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.
samples/rust/audio-transcription-example/src/main.rs

Lines changed: 70 additions & 0 deletions
```rust
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

use std::env;
use std::io::{self, Write};

use foundry_local_sdk::{FoundryLocalConfig, FoundryLocalManager};
use tokio_stream::StreamExt;

const ALIAS: &str = "whisper-tiny";

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("Audio Transcription Example");
    println!("===========================\n");

    // Accept an audio file path as a CLI argument.
    let audio_path = env::args().nth(1).unwrap_or_else(|| {
        eprintln!("Usage: cargo run -- <path-to-audio.wav>");
        std::process::exit(1);
    });

    // ── 1. Initialise the manager ────────────────────────────────────────
    let manager = FoundryLocalManager::create(FoundryLocalConfig::new("foundry_local_samples"))?;

    // ── 2. Pick the whisper model and ensure it is downloaded ────────────
    let model = manager.catalog().get_model(ALIAS).await?;
    println!("Model: {} (id: {})", model.alias(), model.id());

    if !model.is_cached().await? {
        println!("Downloading model...");
        model
            .download(Some(|progress: &str| {
                print!("\r {progress}%");
                io::stdout().flush().ok();
            }))
            .await?;
        println!();
    }

    println!("Loading model...");
    model.load().await?;
    println!("✓ Model loaded\n");

    // ── 3. Create an audio client ────────────────────────────────────────
    let audio_client = model.create_audio_client();

    // ── 4. Non-streaming transcription ───────────────────────────────────
    println!("--- Non-streaming transcription ---");
    let result = audio_client.transcribe(&audio_path).await?;
    println!("Transcription: {}", result.text);

    // ── 5. Streaming transcription ───────────────────────────────────────
    println!("--- Streaming transcription ---");
    print!("Transcription: ");
    let mut stream = audio_client.transcribe_streaming(&audio_path).await?;
    while let Some(chunk) = stream.next().await {
        let chunk = chunk?;
        print!("{}", chunk.text);
        io::stdout().flush().ok();
    }
    println!("\n");

    // ── 6. Unload the model ──────────────────────────────────────────────
    println!("Unloading model...");
    model.unload().await?;
    println!("Done.");

    Ok(())
}
```
samples/rust/foundry-local-webserver/Cargo.toml

Lines changed: 11 additions & 0 deletions
```toml
[package]
name = "foundry-local-webserver"
version = "0.1.0"
edition = "2021"
description = "Example of using the Foundry Local SDK with a local OpenAI-compatible web server"

[dependencies]
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
```
samples/rust/foundry-local-webserver/README.md

Lines changed: 25 additions & 0 deletions
# Sample: Foundry Local Web Server

This example demonstrates how to start a local OpenAI-compatible web server using the Foundry Local SDK, then call it with a standard HTTP client. This is useful when you want to use the OpenAI REST API directly or integrate with tools that expect an OpenAI-compatible endpoint.

The `foundry-local-sdk` dependency is referenced via a local path. No crates.io publish is required:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust" }
```

Run the application:

```bash
cargo run
```

## Using WinML (Windows only)

To use the WinML backend, enable the `winml` feature in `Cargo.toml`:

```toml
foundry-local-sdk = { path = "../../../sdk_v2/rust", features = ["winml"] }
```

No code changes are needed — same API, different backend.
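As a rough sketch of what the client side of this sample sends, the request body follows the standard OpenAI chat-completions shape. The sample itself uses `serde_json` + `reqwest`; this stdlib-only version just shows the JSON shape, and the model id below is a placeholder, not a value from this sample:

```rust
// Build a chat-completions request body by hand to illustrate the JSON shape
// an OpenAI-compatible endpoint expects.
fn chat_request_body(model: &str, prompt: &str) -> String {
    format!(
        "{{\"model\":\"{}\",\"messages\":[{{\"role\":\"user\",\"content\":\"{}\"}}]}}",
        model, prompt
    )
}

fn main() {
    let body = chat_request_body("my-local-model", "Say hello");
    // A real client would POST this to the local server's
    // /v1/chat/completions endpoint.
    println!("{body}");
}
```

Because the endpoint speaks the OpenAI wire format, any existing OpenAI HTTP client can be pointed at the local server with only a base-URL change.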
