
Commit f7cbf52

Merge pull request #30 from copilot-community-sdk/upstream-sync/2026-02-23-abdea8a64bcb0217
[upstream-sync] Port upstream documentation: Microsoft Foundry Local BYOK provider (upstream PR #461)
2 parents: e5a010c + 6d44ac3

2 files changed: 50 additions & 0 deletions

CHANGELOG.md (3 additions & 0 deletions)
```diff
@@ -3,6 +3,9 @@ All notable changes to this project will be documented in this file. This change
 
 ## [Unreleased]
 
+### Added (documentation)
+- Microsoft Foundry Local BYOK provider guide in `doc/auth/byok.md`: quick start example, installation instructions, and connection troubleshooting (upstream PR #461).
+
 ### Added (upstream PR #329 sync)
 - Windows console window hiding: CLI process is spawned with explicit PIPE redirects ensuring the JVM sets `CREATE_NO_WINDOW` on Windows — no console window appears in GUI applications. Equivalent to upstream `windowsHide: true` (upstream PR #329).
```

doc/auth/byok.md (47 additions & 0 deletions)
```diff
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 | Azure OpenAI / Azure AI Foundry | `:azure` | Azure-hosted models |
 | Anthropic | `:anthropic` | Claude models |
 | Ollama | `:openai` | Local models via OpenAI-compatible API |
+| Microsoft Foundry Local | `:openai` | Run AI models locally on your device via OpenAI-compatible API |
 | Other OpenAI-compatible | `:openai` | vLLM, LiteLLM, etc. |
 
 ## Quick Start: Azure AI Foundry
```
````diff
@@ -49,6 +50,37 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 (println (h/query "Hello!" :session session)))
 ```
 
+## Quick Start: Microsoft Foundry Local
+
+[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. No API key is needed.
+
+> **Note:** Foundry Local starts on a **dynamic port** — the port is not fixed. Use `foundry service status` to confirm the port the service is currently listening on, then use that port in your `:base-url`.
+
+```clojure
+;; No API key needed for local Foundry Local
+;; Replace <PORT> with the port from: foundry service status
+(copilot/with-client-session [session
+                              {:model "phi-4-mini"
+                               :provider {:provider-type :openai
+                                          :base-url "http://localhost:<PORT>/v1"}}]
+  (println (h/query "Hello!" :session session)))
+```
+
+To get started with Foundry Local:
+
+```bash
+# Windows: Install Foundry Local CLI (requires winget)
+winget install Microsoft.FoundryLocal
+
+# macOS / Linux: see https://foundrylocal.ai for installation instructions
+
+# Run a model (starts the local server automatically)
+foundry model run phi-4-mini
+
+# Check the port the service is running on
+foundry service status
+```
+
 ## Quick Start: Anthropic
 
 ```clojure
````
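The dynamic-port step in the added Quick Start can be scripted rather than edited by hand. A minimal sketch, not part of the committed docs: the port value `5273` is a placeholder (Foundry Local assigns its own port at startup), and the only assumption is that you substitute the number printed by `foundry service status`:

```shell
# Placeholder port: substitute the value printed by `foundry service status`.
PORT=5273
# Build the value to pass as :base-url in the :provider map.
BASE_URL="http://localhost:${PORT}/v1"
echo "$BASE_URL"
```

Because the endpoint is OpenAI-compatible, `curl -s "$BASE_URL/models"` is a reasonable smoke test once the service is running: it should return the list of locally loaded models.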
````diff
@@ -187,6 +219,21 @@ However, if your Azure AI Foundry deployment provides an OpenAI-compatible endpo
  :base-url "https://your-resource.openai.azure.com/openai/v1/"}}
 ```
 
+### Connection Refused (Foundry Local)
+
+Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:
+
+```bash
+# Check the service status and port
+foundry service status
+```
+
+Update your `:base-url` to match the port shown in the output. If the service is not running, start a model to launch it:
+
+```bash
+foundry model run phi-4-mini
+```
+
 ### Connection Refused (Ollama)
 
 Ensure Ollama is running and accessible:
````

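The two Foundry Local troubleshooting steps (confirm the port, then start a model if nothing is listening) can be folded into one quick check. A hedged sketch, not from the committed docs: `5273` is a stand-in port, and `GET /v1/models` is assumed only because the server speaks the OpenAI-compatible API:

```shell
# Stand-in port: replace with the value from `foundry service status`.
PORT=5273
URL="http://localhost:${PORT}/v1/models"

# -s silences progress output, -f fails on HTTP errors, and --max-time
# keeps the check fast when nothing is listening on the port.
if curl -sf --max-time 2 "$URL" > /dev/null 2>&1; then
  echo "endpoint answering at $URL"
else
  echo "no endpoint at $URL; check the port with: foundry service status"
fi
```

If the check fails even on the correct port, the service itself is probably not running; `foundry model run phi-4-mini` starts it along with the model.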