## Description
<!-- Provide a concise and descriptive summary of the changes
implemented in this PR. -->
### Type of change
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing
functionality to not work as expected)
- [ ] Documentation update (improves or adds clarity to existing
documentation)
### Tested on
- [ ] iOS
- [ ] Android
### Testing instructions
<!-- Provide step-by-step instructions on how to test your changes.
Include setup details if necessary. -->
### Screenshots
<!-- Add screenshots here, if applicable -->
### Related issues
<!-- Link related issues here using #issue-number -->
### Checklist
- [ ] I have performed a self-review of my code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have updated the documentation accordingly
- [ ] My changes generate no new warnings
### Additional notes
<!-- Include any additional information, assumptions, or context that
reviewers might need to understand this PR. -->
| Method | Type | Description |
| ------ | ---- | ----------- |
|`generate`|`(messages: Message[], tools?: LLMTool[]) => Promise<string>`| Runs the model to complete the chat passed in the `messages` argument. It doesn't manage conversation context. |
|`forward`|`(input: string) => Promise<string>`| Runs model inference on a raw input string. You need to provide the entire conversation and prompt (in the correct format and with special tokens!) in the input string. It doesn't manage conversation context. It is intended for users who need access to the model itself without any wrapper. If you want a simple chat with the model, consider using `sendMessage`. |
|`configure`|`({chatConfig?: Partial<ChatConfig>, toolsConfig?: ToolsConfig}) => void`| Configures chat and tool calling. See more details in [configuring the model](#configuring-the-model). |
|`sendMessage`|`(message: string) => Promise<Message[]>`| Adds a user message to the conversation. After the model responds, it calls `messageHistoryCallback()` with both the user message and the model response, and also returns them. |
|`deleteMessage`|`(index: number) => void`| Deletes all messages starting from the message at position `index`. After deletion, it calls `messageHistoryCallback()` with the new history. |
|`delete`|`() => void`| Deletes the model from memory. Note that you cannot delete the model while it's generating; you need to interrupt it first and make sure generation has stopped. |
|`interrupt`|`() => void`| Interrupts model generation. The model may emit one more token after the interrupt. |