@@ -33,26 +33,26 @@ for token, tool, tool_bool in T.handle_streaming(stream) :
 
 ## Documentation :
 
-```
+```bash
 .
 ├── __version__ = " 0.0.3_genesis"
 │
 ├── clients
-│   ├── veniceai(api_key: str) -> openai.OpenAI
-│   ├── deepseek(api_key: str) -> openai.OpenAI
-│   ├── openrouter(api_key: str) -> openai.OpenAI
+│   ├── veniceai(api_key:str) -> openai.OpenAI
+│   ├── deepseek(api_key:str) -> openai.OpenAI
+│   ├── openrouter(api_key:str) -> openai.OpenAI
 │   │
 │   ├── veniceai_request(client:openai.OpenAI, messages:list[dict], model:str, temperature:float, max_tokens:int, tools:list[dict], include_venice_system_prompt:bool=False, **kwargs) -> openai.Stream
 │   ├── generic_request(client:openai.OpenAI, messages:list[dict], model:str, temperature:float, max_tokens:int, tools:list[dict], **kwargs) -> openai.Stream
 │   └── openrouter_request(client:openai.OpenAI, messages:list[dict], model:str, temperature:float, max_tokens:int, tools:list[dict], **kwargs) -> openai.Stream
 │
-├── handle_streaming(stream: openai.Stream) -> generator(token: str or None, tool: list, tool_bool: bool)
-├── handle_tool_call(tool_call: dict) -> tuple[str, str, dict, str]
+├── handle_streaming(stream:openai.Stream) -> generator(token:str|None, tool:list[dict]|None, tool_bool:bool)
+├── handle_tool_call(tool_call:dict) -> tuple[str, str, dict, str]
 │
-├── create_assistant_response(content: str, tool_calls: list[dict]=None) -> dict
-├── create_function_response(id: str, result: str, name: str) -> dict
-├── create_system_prompt(content: str) -> dict[str, str]
-└── create_user_prompt(content: str) -> dict[str, str]
+├── create_assistant_response(content:str, tool_calls:list[dict]=None) -> dict[str, str]
+├── create_function_response(id:str, result:str, name:str) -> dict[str, str]
+├── create_system_prompt(content:str) -> dict[str, str]
+└── create_user_prompt(content:str) -> dict[str, str]
 ```
 
 ## Roadmap
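Per the tree above, `handle_streaming` is a generator yielding `(token, tool, tool_bool)` triples, which matches the `for token, tool, tool_bool in T.handle_streaming(stream):` loop in the hunk header. A minimal sketch of the consumption pattern, with a stub generator standing in for a real `openai.Stream` (the stub and its yielded payloads are illustrative assumptions, not part of the library):

```python
from typing import Iterator, Optional

def fake_handle_streaming() -> Iterator[tuple]:
    # Stub standing in for handle_streaming(stream); a real stream would
    # come from veniceai_request / generic_request as documented above.
    yield ("Hello", None, False)
    yield (" world", None, False)
    # When the model requests a tool, token is None and tool_bool is True.
    yield (None, [{"function": {"name": "get_time", "arguments": "{}"}}], True)

text: list[str] = []
tool_calls: Optional[list[dict]] = None
for token, tool, tool_bool in fake_handle_streaming():
    if token is not None:
        text.append(token)   # stream partial text as it arrives
    if tool_bool:
        tool_calls = tool    # capture the accumulated tool-call list

print("".join(text))  # prints "Hello world"
```

A captured tool call would then be unpacked with `handle_tool_call(...)` and answered via `create_function_response(...)` before continuing the conversation.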