# llmapi

> Modern C++23 LLM client built with modules

[C++23](https://en.cppreference.com/w/cpp/23)
[Modules](https://en.cppreference.com/w/cpp/language/modules)
[License](LICENSE)
[OpenAI Compatible](https://platform.openai.com/docs/api-reference)

| English - [简体中文](README.zh.md) - [繁體中文](README.zh.hant.md) |
|:---:|
| [Documentation](docs/) - [C++ API](docs/cpp-api.md) - [Examples](docs/examples.md) |

`llmapi` provides a typed `Client<Provider>` API for chat, streaming, embeddings, tool calls, and conversation persistence. The default config alias `Config` maps to OpenAI-style providers, so the common case does not need an explicit `openai::OpenAI` wrapper.

## Features

- `import mcpplibs.llmapi` with C++23 modules
- Strongly typed messages, tools, and response structs
- Sync, async, and streaming chat APIs
- Embeddings via the OpenAI provider
- Conversation save/load helpers
- OpenAI-compatible endpoint support through `openai::Config::baseUrl`

## Quick Start

```cpp
import mcpplibs.llmapi;
import std;

int main() {
    using namespace mcpplibs::llmapi;

    auto apiKey = std::getenv("OPENAI_API_KEY");
    if (!apiKey) {
        std::cerr << "OPENAI_API_KEY not set\n";
        return 1;
    }

    auto client = Client(Config{
        .apiKey = apiKey,
        .model = "gpt-4o-mini",
    });

    client.system("You are a concise assistant.");
    auto resp = client.chat("Explain why C++23 modules are useful in two sentences.");

    std::cout << resp.text() << '\n';
    return 0;
}
```

## Providers

- `openai::OpenAI` for OpenAI chat, streaming, embeddings, and OpenAI-compatible endpoints
- `anthropic::Anthropic` for Anthropic chat and streaming
- `Config` as a convenient alias for `openai::Config`

Compatible endpoints can reuse the OpenAI provider:

```cpp
auto provider = openai::OpenAI({
    .apiKey = std::getenv("DEEPSEEK_API_KEY"),
    .baseUrl = std::string(URL::DeepSeek),
    .model = "deepseek-chat",
});
```

## Build And Run

```bash
xmake
xmake run hello_mcpp
xmake run basic
xmake run chat
```

## Package Usage

```lua
add_repositories("mcpplibs-index https://github.com/mcpplibs/mcpplibs-index.git")
add_requires("llmapi 0.0.2")

target("demo")
    set_kind("binary")
    set_languages("c++23")
    set_policy("build.c++.modules", true)
    add_files("src/*.cpp")
    add_packages("llmapi")
```

See [docs/getting-started.md](docs/getting-started.md) and [docs/providers.md](docs/providers.md) for more setup detail.

## License

Apache-2.0 - see [LICENSE](LICENSE)