-`llmapi` provides a typed `Client<Provider>` API for chat, streaming, embeddings, tool calls, and conversation persistence. The repository ships built-in providers for OpenAI and Anthropic, and the OpenAI provider can target compatible endpoints through a custom `baseUrl`.
+`llmapi` provides a typed `Client<Provider>` API for chat, streaming, embeddings, tool calls, and conversation persistence. The default config alias `Config` maps to OpenAI-style providers, so the common case does not need an explicit `openai::OpenAI` wrapper.
 
 ## Features
@@ -37,10 +37,10 @@ int main() {
     return 1;
   }
 
-  auto client = Client(openai::OpenAI({
+  auto client = Client(Config{
     .apiKey = apiKey,
     .model = "gpt-4o-mini",
-  }));
+  });
 
   client.system("You are a concise assistant.");
   auto resp = client.chat("Explain why C++23 modules are useful in two sentences.");
@@ -54,6 +54,7 @@ int main() {
 
 - `openai::OpenAI` for OpenAI chat, streaming, embeddings, and OpenAI-compatible endpoints
 - `anthropic::Anthropic` for Anthropic chat and streaming
+- `Config` as a convenient alias for `openai::Config`
 
 Compatible endpoints can reuse the OpenAI provider: