CHANGELOG.md (3 additions & 0 deletions)
@@ -3,6 +3,9 @@ All notable changes to this project will be documented in this file. This change
## [Unreleased]
### Added (documentation)
- Microsoft Foundry Local BYOK provider guide in `doc/auth/byok.md`: quick start example, installation instructions, and connection troubleshooting (upstream PR #461).
### Added (upstream PR #329 sync)
- Windows console window hiding: the CLI process is spawned with explicit PIPE redirects for stdin, stdout, and stderr, so the JVM creates the child with `CREATE_NO_WINDOW` on Windows and no console window appears in GUI applications. Equivalent to upstream `windowsHide: true` (upstream PR #329).
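The behavior described in that entry can be illustrated with a short sketch (hypothetical, not the SDK's actual implementation; `spawn-cli` and `cli-path` are illustrative names) of explicit PIPE redirects on `java.lang.ProcessBuilder`:

```clojure
(defn spawn-cli
  "Spawn an external CLI with all three standard streams piped.
   When no stream is inherited, the JVM starts the child with
   CREATE_NO_WINDOW on Windows, so GUI applications do not flash
   a console window (cf. Node's windowsHide: true)."
  [cli-path & args]
  (let [pb (java.lang.ProcessBuilder. (into [cli-path] args))]
    ;; Redirect/PIPE is the default, but setting it explicitly
    ;; documents the intent and guards against inherited streams.
    (.redirectInput pb java.lang.ProcessBuilder$Redirect/PIPE)
    (.redirectOutput pb java.lang.ProcessBuilder$Redirect/PIPE)
    (.redirectError pb java.lang.ProcessBuilder$Redirect/PIPE)
    (.start pb)))
```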
| Ollama | `:openai` | Local models via OpenAI-compatible API |
| Microsoft Foundry Local | `:openai` | Run AI models locally on your device via OpenAI-compatible API |
| Other OpenAI-compatible | `:openai` | vLLM, LiteLLM, etc. |
## Quick Start: Azure AI Foundry
@@ -49,6 +50,37 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
(println (h/query "Hello!" :session session)))
```
## Quick Start: Microsoft Foundry Local
[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. No API key is needed.
> **Note:** Foundry Local starts on a **dynamic port**; the port is not fixed between runs. Use `foundry service status` to confirm which port the service is currently listening on, then use that port in your `:base-url`.
```clojure
;; No API key needed for local Foundry Local
;; Replace <PORT> with the port from: foundry service status
(copilot/with-client-session [session
                              {:model "phi-4-mini"
                               :provider {:provider-type :openai
                                          :base-url "http://localhost:<PORT>/v1"}}]
  (println (h/query "Hello!" :session session)))
```
To get started with Foundry Local:
```bash
# Windows: Install Foundry Local CLI (requires winget)
winget install Microsoft.FoundryLocal

# macOS / Linux: see https://foundrylocal.ai for installation instructions

# Run a model (starts the local server automatically)
foundry model run phi-4-mini

# Check the port the service is running on
foundry service status
```
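Because the port changes between runs, it can also be resolved programmatically instead of hard-coded. A minimal sketch, assuming the `foundry service status` output contains a `http://localhost:<port>` URL (the exact output format may vary by CLI version; `parse-base-url` and `foundry-base-url` are illustrative helpers, not SDK functions):

```clojure
(require '[clojure.java.shell :as sh])

(defn parse-base-url
  "Extract the first http://localhost:<port> URL from status text
   and turn it into an OpenAI-compatible :base-url. Returns nil
   when no such URL is found."
  [status-text]
  (when-let [[_ port] (re-find #"http://localhost:(\d+)" status-text)]
    (str "http://localhost:" port "/v1")))

(defn foundry-base-url
  "Shell out to `foundry service status` and parse the port."
  []
  (-> (sh/sh "foundry" "service" "status") :out parse-base-url))
```

The resulting URL can then be passed as the `:base-url` in the provider map shown in the quick start above.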
## Quick Start: Anthropic
```clojure
@@ -187,6 +219,21 @@ However, if your Azure AI Foundry deployment provides an OpenAI-compatible endpo