Commit cdcd81f

Document isolated-client concurrency model

1 parent da3c9e1

File tree: 7 files changed, +102 −1 lines


docs/en/advanced.md

Lines changed: 29 additions & 0 deletions

@@ -57,6 +57,35 @@ auto resp = task.get();
 std::cout << resp.text() << '\n';
 ```

+## Concurrency Model
+
+Use instance isolation for concurrency:
+
+- `Client` is stateful and not thread-safe
+- `tinyhttps::HttpClient` is also not thread-safe
+- create one `Client` per task or per thread
+- do not share a single `Client` across concurrent callers
+
+This works well for calling multiple providers in parallel because each client owns its own provider, conversation, and transport state.
+
+```cpp
+auto futureA = std::async(std::launch::async, [&] {
+    auto client = Client(Config{
+        .apiKey = std::getenv("OPENAI_API_KEY"),
+        .model = "gpt-4o-mini",
+    });
+    return client.chat("summarize this");
+});
+
+auto futureB = std::async(std::launch::async, [&] {
+    auto client = Client(AnthropicConfig{
+        .apiKey = std::getenv("ANTHROPIC_API_KEY"),
+        .model = "claude-sonnet-4-20250514",
+    });
+    return client.chat("translate this");
+});
+```
+
 ## Tool Calling Loop

 The provider surfaces requested tools via `ChatResponse::tool_calls()`. You then append a tool result and continue the conversation.

docs/en/cpp-api.md

Lines changed: 6 additions & 0 deletions

@@ -160,6 +160,12 @@ const P& provider() const
 P& provider()
 ```

+### Thread-Safety
+
+- `Client<P>` is stateful and not thread-safe
+- use one client per task or thread
+- do not share one client across concurrent callers
+
 ## Provider Config Types

 ```cpp

docs/zh-hant/advanced.md

Lines changed: 11 additions & 0 deletions

@@ -22,6 +22,17 @@ auto task = client.chat_async("簡要解釋 coroutine。");
 auto resp = task.get();
 ```

+## Concurrency Model
+
+The recommended approach is "instance isolation, concurrency at the caller level":
+
+- `Client` is a stateful object and is not guaranteed to be thread-safe
+- `tinyhttps::HttpClient` is likewise not guaranteed to be thread-safe
+- create a separate `Client` for each task or thread
+- do not share a single `Client` across multiple concurrent callers
+
+This approach is a natural fit for concurrent multi-model / multi-provider calls, because each instance owns its own provider, conversation, and transport state.
+
 ## Tool Calling Flow

 ```cpp

docs/zh/advanced.md

Lines changed: 11 additions & 0 deletions

@@ -22,6 +22,17 @@ auto task = client.chat_async("简要解释 coroutine。");
 auto resp = task.get();
 ```

+## Concurrency Model
+
+The recommended approach is "instance isolation, concurrency at the caller level":
+
+- `Client` is a stateful object and is not guaranteed to be thread-safe
+- `tinyhttps::HttpClient` is likewise not guaranteed to be thread-safe
+- create a separate `Client` for each task or thread
+- do not share a single `Client` among multiple concurrent callers
+
+This approach is a natural fit for concurrent multi-model / multi-provider calls, because each instance owns its own provider, conversation, and transport state.
+
 ## Tool Calling Loop

 ```cpp

src/client.cppm

Lines changed: 2 additions & 0 deletions

@@ -17,6 +17,8 @@ private:
     ChatParams defaultParams_;

 public:
+    // Thread-safety: Client instances are intentionally stateful and not synchronized.
+    // Use one Client per task/thread and avoid sharing a Client across threads.
     explicit Client(P provider) : provider_(std::move(provider)) {}
     explicit Client(openai::Config config)
         requires std::same_as<P, openai::OpenAI>

src/tinyhttps/http.cppm

Lines changed: 2 additions & 0 deletions

@@ -219,6 +219,8 @@ static bool iequals(std::string_view a, std::string_view b) {

 export class HttpClient {
 public:
+    // Thread-safety: HttpClient owns a mutable connection pool and is not synchronized.
+    // Keep each instance isolated to a single caller/task unless you add external locking.
     explicit HttpClient(HttpClientConfig config = {})
         : config_(std::move(config)) {}

tests/llmapi/test_client.cpp

Lines changed: 41 additions & 1 deletion

@@ -7,9 +7,15 @@ import std;
 using namespace mcpplibs::llmapi;

 struct FullMockProvider {
+    std::string prefix { "reply to: " };
+    int delayMs { 0 };
+
     std::string_view name() const { return "full_mock"; }

     ChatResponse chat(const std::vector<Message>& msgs, const ChatParams&) {
+        if (delayMs > 0) {
+            std::this_thread::sleep_for(std::chrono::milliseconds(delayMs));
+        }
         std::string lastContent;
         if (!msgs.empty()) {
             auto& c = msgs.back().content;
@@ -18,7 +24,7 @@ struct FullMockProvider {
             }
         }
         return ChatResponse {
-            .content = { TextContent { "reply to: " + lastContent } },
+            .content = { TextContent { prefix + lastContent } },
             .stopReason = StopReason::EndOfTurn,
             .usage = { .inputTokens = 10, .outputTokens = 5, .totalTokens = 15 },
         };
@@ -96,6 +102,40 @@ int main() {
     assert(client2.conversation().size() == 2);
     std::filesystem::remove("/tmp/test_client_conv.json");

+    // Test 8: isolated clients can be used concurrently without sharing conversation state
+    auto futureA = std::async(std::launch::async, [] {
+        auto isolatedClient = Client(FullMockProvider{
+            .prefix = "openai-like: ",
+            .delayMs = 10,
+        });
+        isolatedClient.system("provider a");
+        auto resp = isolatedClient.chat("hello from a");
+        return std::pair{
+            resp.text(),
+            isolatedClient.conversation().size(),
+        };
+    });
+
+    auto futureB = std::async(std::launch::async, [] {
+        auto isolatedClient = Client(FullMockProvider{
+            .prefix = "anthropic-like: ",
+            .delayMs = 10,
+        });
+        isolatedClient.system("provider b");
+        auto resp = isolatedClient.chat("hello from b");
+        return std::pair{
+            resp.text(),
+            isolatedClient.conversation().size(),
+        };
+    });
+
+    auto [textA, sizeA] = futureA.get();
+    auto [textB, sizeB] = futureB.get();
+    assert(textA == "openai-like: hello from a");
+    assert(textB == "anthropic-like: hello from b");
+    assert(sizeA == 3);
+    assert(sizeB == 3);
+
     println("test_client: ALL PASSED");
     return 0;
 }
