
Commit 5ef5d67

Document Microsoft Foundry Local setup and usage
Added section for Microsoft Foundry Local with installation and usage instructions.
1 parent f1d8cc1 commit 5ef5d67

1 file changed

Lines changed: 47 additions & 0 deletions

File tree

docs/auth/byok.md

@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
| Azure OpenAI / Azure AI Foundry | `"azure"` | Azure-hosted models |
| Anthropic | `"anthropic"` | Claude models |
| Ollama | `"openai"` | Local models via OpenAI-compatible API |
| Microsoft Foundry Local | `"openai"` | Run AI models locally on your device via OpenAI-compatible API |
| Other OpenAI-compatible | `"openai"` | vLLM, LiteLLM, etc. |

## Quick Start: Azure AI Foundry
@@ -250,6 +251,36 @@ provider: {
}
```

### Microsoft Foundry Local

[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. Install it via the Foundry Local CLI, then point the SDK at your local endpoint:

```typescript
provider: {
  type: "openai",
  baseUrl: "http://localhost:<PORT>/v1",
  // No apiKey needed; Foundry Local runs entirely on your device
}
```
> **Note:** Foundry Local starts on a **dynamic port** — the port is not fixed. Use `foundry service status` to confirm the port the service is currently listening on, then use that port in your `baseUrl`.

To get started with Foundry Local:

```bash
# Install Foundry Local CLI
winget install Microsoft.FoundryLocal

# List available models
foundry model list

# Run a model (starts the local server automatically)
foundry model run phi-4-mini

# Check the port the service is running on
foundry service status
```
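Once a model is running, you can sanity-check the OpenAI-compatible endpoint before wiring it into the SDK. A minimal sketch, assuming Node 18+ for the global `fetch`; the helper names are illustrative and port `5273` is only an example, so substitute whatever `foundry service status` reports:

```typescript
// Build the OpenAI-compatible base URL for a given Foundry Local port.
function baseUrlFor(port: number): string {
  return `http://localhost:${port}/v1`;
}

// List the model IDs the local service exposes via GET /v1/models.
async function listLocalModels(port: number): Promise<string[]> {
  const res = await fetch(`${baseUrlFor(port)}/models`);
  if (!res.ok) throw new Error(`Foundry Local returned HTTP ${res.status}`);
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}

// Example (assumes the service is listening on port 5273):
// listLocalModels(5273).then((models) => console.log(models));
```

Because `GET /v1/models` is part of the OpenAI-compatible surface, the same check works against Ollama or any other `"openai"`-type provider in the table above.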
### Anthropic

```typescript
@@ -305,6 +336,7 @@ Some Copilot features may behave differently with BYOK:
|----------|-------------|
| Azure AI Foundry | No Entra ID auth; must use API keys |
| Ollama | No API key; local only; model support varies |
| [Microsoft Foundry Local](https://foundrylocal.ai) | Local only; model availability depends on device hardware; no API key required |
| OpenAI | Subject to OpenAI rate limits and quotas |

## Troubleshooting
@@ -368,6 +400,21 @@ curl http://localhost:11434/v1/models
ollama serve
```

### Connection Refused (Foundry Local)

Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:

```bash
# Check the service status and port
foundry service status
```

Update your `baseUrl` to match the port shown in the output. If the service is not running, start a model to launch it:

```bash
foundry model run phi-4-mini
```
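If you script this check, you can pull the port out of the status output instead of reading it by eye. A minimal sketch, assuming Node 18+; the exact text `foundry service status` prints is not guaranteed, so the pattern below simply looks for the first `localhost:<port>` it finds:

```typescript
// Extract the first localhost port from `foundry service status` output.
// The output format is an assumption; adjust the pattern if your CLI
// version prints the endpoint differently.
function extractPort(statusOutput: string): number | null {
  const match = statusOutput.match(/localhost:(\d+)/);
  return match ? Number(match[1]) : null;
}

// Probe the extracted port to confirm the service is actually listening.
async function isFoundryLocalUp(port: number, timeoutMs = 2000): Promise<boolean> {
  try {
    const res = await fetch(`http://localhost:${port}/v1/models`, {
      signal: AbortSignal.timeout(timeoutMs),
    });
    return res.ok;
  } catch {
    return false; // connection refused, not running, or timed out
  }
}
```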
### Authentication Failed

1. Verify your API key is correct and not expired
