# llmapi

> Modern C++ LLM API client with OpenAI-compatible support

[C++23](https://en.cppreference.com/w/cpp/23) · [C++ Modules](https://en.cppreference.com/w/cpp/language/modules) · [License](LICENSE) · [OpenAI API Reference](https://platform.openai.com/docs/api-reference)

| English - [简体中文](README.zh.md) - [繁體中文](README.zh.hant.md) |
|:---:|
| [Documentation](docs/) - [C++ API](docs/cpp-api.md) - [C API](docs/c-api.md) - [Examples](docs/examples.md) |

Clean, type-safe LLM API client built on C++23 modules. Fluent interface with zero-cost abstractions. Works with OpenAI, Poe, DeepSeek, and other compatible endpoints.

## ✨ Features

- **C++23 Modules** - `import mcpplibs.llmapi`
- **Auto-Save History** - Conversation history is managed automatically
- **Type-Safe Streaming** - Concept-constrained callbacks
- **Fluent Interface** - Chainable methods
- **C API** - Full C language support in an OOP style
- **Provider Agnostic** - OpenAI, Poe, and compatible endpoints

## 📦 Quick Start

### C++ API

```cpp
import std;
import mcpplibs.llmapi;

int main() {
    using namespace mcpplibs;

    llmapi::Client client(std::getenv("OPENAI_API_KEY"), llmapi::URL::Poe);

    client.model("gpt-5")
        .system("You are a helpful assistant.")
        .user("In one sentence, introduce modern C++. Also give a Chinese translation.")
        .request([](std::string_view chunk) {
            std::print("{}", chunk);
            std::cout.flush();
        });

    return 0;
}
```

### C API

```c
#include <stdio.h>
#include <stdlib.h> /* for getenv */

#include "llmapi.h"

void stream_print(const char* s, size_t len, void* data) {
    printf("%.*s", (int)len, s);
    fflush(stdout);
}

int main(void) {
    llmapi_client_t* c = llmapi_client_create(getenv("OPENAI_API_KEY"), LLMAPI_URL_POE);

    c->set_model(c, "gpt-5");
    c->add_system_message(c, "You are a helpful assistant.");
    c->add_user_message(c, "In one sentence, introduce modern C++. Also give a Chinese translation.");
    c->request_stream(c, stream_print, NULL);

    c->destroy(c);
    return 0;
}
```

### Models / Providers

```cpp
llmapi::Client client(apiKey, llmapi::URL::OpenAI);   // OpenAI
llmapi::Client client(apiKey, llmapi::URL::Poe);      // Poe
llmapi::Client client(apiKey, llmapi::URL::DeepSeek); // DeepSeek
llmapi::Client client(apiKey, "https://custom.com");  // Custom endpoint
```

## 🛠️ Building

```bash
xmake            # Build
xmake run basic  # Run example (after setting OPENAI_API_KEY)
```

## 📚 API Reference

**C++ Core Methods:**

- `model(name)` - Set the model
- `user(content)` / `system(content)` / `assistant(content)` - Add messages
- `request()` - Non-streaming request (returns the response JSON)
- `request(callback)` - Streaming request
- `getAnswer()` - Get the last assistant reply
- `getMessages()` - Get the conversation history
- `clear()` - Clear the history

**C API:** All methods are available via function pointers (`client->method(client, ...)`).

## 📄 License

Apache-2.0 - see [LICENSE](LICENSE)