1 parent 8070956 commit cc42a44
README.md
@@ -67,7 +67,7 @@ for token, tool, tool_bool in T.handle_streaming(stream) :
 - [ ] v0.6.x: Add llama.cpp as backend in addition to APIs
 - [ ] v0.7.x: Add reverse proxy + server to create a dedicated full relay/backend (like OpenRouter), framework usable as server and client
 - [ ] v0.8.x: Add PyTorch as backend with `transformers` to deploy a remote server
-- [ ] > v0.9.0: Total reduction of dependencies for built-in functions (unless counter-optimizations)
+- [ ] v0.9.x: Total reduction of dependencies for built-in functions (unless counter-optimizations)
 - [ ] v1.0.0: First complete version in Python without dependencies
 - [ ] v1.x.x: Reduce dependencies to Python for Rust backend
 - [ ] v2.0.0: Backend totally in Rust
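The hunk header above quotes a streaming loop from the README, `for token, tool, tool_bool in T.handle_streaming(stream):`. A minimal sketch of what such a handler might look like follows — `handle_streaming`, the `(token, tool, tool_bool)` tuple shape, and the `<tool>` marker are all assumptions inferred from that one line, not the project's documented API:

```python
# Hypothetical sketch: the function name, tuple shape, and "<tool>" marker
# are assumptions for illustration, not the library's actual API.
from typing import Iterator, Optional, Tuple


def handle_streaming(stream: Iterator[str]) -> Iterator[Tuple[str, Optional[str], bool]]:
    """Yield (token, tool, tool_bool), where tool_bool flags a tool-call token."""
    for chunk in stream:
        is_tool = chunk.startswith("<tool>")
        tool = chunk[len("<tool>"):] if is_tool else None
        yield chunk, tool, is_tool


# Usage mirroring the loop quoted in the hunk header.
for token, tool, tool_bool in handle_streaming(iter(["Hello", " world", "<tool>search"])):
    if tool_bool:
        print("tool call:", tool)
    else:
        print("token:", token)
```

Separating plain tokens from tool-call tokens in the generator keeps the consuming loop a simple three-way unpack, which is presumably why the README's example iterates over a tuple rather than raw chunks.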