
Commit 0f71e38

fix: Fix markdown tables (#413)
## Description

### Type of change

- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Documentation update (improves or adds clarity to existing documentation)

### Tested on

- [ ] iOS
- [ ] Android

### Testing instructions

### Screenshots

### Related issues

### Checklist

- [ ] I have performed a self-review of my code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have updated the documentation accordingly
- [ ] My changes generate no new warnings

### Additional notes
1 parent 6b2d458 commit 0f71e38

2 files changed: 22 additions & 22 deletions

File tree:

- docs/docs/typescript-api/LLMModule.md (11 additions & 11 deletions)
@@ -38,17 +38,17 @@ LLMModule.delete();

 ### Methods

-| Method | Type | Description |
-| --- | --- | --- | --- |
-| `load` | `({ modelSource: ResourceSource, tokenizerSource: ResourceSource, tokenizerConfigSource: ResourceSource, onDownloadProgressCallback?: (downloadProgress: number) => void, tokenCallback?: (token: string) => void, responseCallback?: (response: string) => void, messageHistoryCallback?: (messageHistory: Message[]) => void}) => Promise<void>` | Loads the model. Checkout the [loading the model](#loading-the-model) section for details. |
-| `setTokenCallback` | `{tokenCallback: (token: string | null) => void}) => void` | Sets new token callback. |
-| `generate` | `(messages: Message[], tools?: LLMTool[]) => Promise<string>` | Runs model to complete chat passed in `messages` argument. It doesn't manage conversation context. |
-| `forward` | `(input: string) => Promise<string>` | Runs model inference with raw input string. You need to provide entire conversation and prompt (in correct format and with special tokens!) in input string to this method. It doesn't manage conversation context. It is intended for users that need access to the model itself without any wrapper. If you want a simple chat with model the consider using`sendMessage` |
-| `configure` | `({chatConfig?: Partial<ChatConfig>, toolsConfig?: ToolsConfig}) => void` | Configures chat and tool calling. See more details in [configuring the model](#configuring-the-model). |
-| `sendMessage` | `(message: string) => Promise<Message[]>` | Method to add user message to conversation. After model responds it will call `messageHistoryCallback()`containing both user message and model response. It also returns them. |
-| `deleteMessage` | `(index: number) => void` | Deletes all messages starting with message on `index` position. After deletion it will call `messageHistoryCallback()` containing new history. It also returns it. |
-| `delete` | `() => void` | Method to delete the model from memory. Note you cannot delete model while it's generating. You need to interrupt it first and make sure model stopped generation. |
-| `interrupt` | `() => void` | Interrupts model generation. It may return one more token after interrupt. |
+| Method | Type | Description |
+| --- | --- | --- |
+| `load` | `({ modelSource: ResourceSource, tokenizerSource: ResourceSource, tokenizerConfigSource: ResourceSource, onDownloadProgressCallback?: (downloadProgress: number) => void, tokenCallback?: (token: string) => void, responseCallback?: (response: string) => void, messageHistoryCallback?: (messageHistory: Message[]) => void}) => Promise<void>` | Loads the model. Checkout the [loading the model](#loading-the-model) section for details. |
+| `setTokenCallback` | `{tokenCallback: (token: string) => void}) => void` | Sets new token callback. |
+| `generate` | `(messages: Message[], tools?: LLMTool[]) => Promise<string>` | Runs model to complete chat passed in `messages` argument. It doesn't manage conversation context. |
+| `forward` | `(input: string) => Promise<string>` | Runs model inference with raw input string. You need to provide entire conversation and prompt (in correct format and with special tokens!) in input string to this method. It doesn't manage conversation context. It is intended for users that need access to the model itself without any wrapper. If you want a simple chat with model the consider using`sendMessage` |
+| `configure` | `({chatConfig?: Partial<ChatConfig>, toolsConfig?: ToolsConfig}) => void` | Configures chat and tool calling. See more details in [configuring the model](#configuring-the-model). |
+| `sendMessage` | `(message: string) => Promise<Message[]>` | Method to add user message to conversation. After model responds it will call `messageHistoryCallback()`containing both user message and model response. It also returns them. |
+| `deleteMessage` | `(index: number) => void` | Deletes all messages starting with message on `index` position. After deletion it will call `messageHistoryCallback()` containing new history. It also returns it. |
+| `delete` | `() => void` | Method to delete the model from memory. Note you cannot delete model while it's generating. You need to interrupt it first and make sure model stopped generation. |
+| `interrupt` | `() => void` | Interrupts model generation. It may return one more token after interrupt. |

 <details>
 <summary>Type definitions</summary>
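The underlying markdown bug is easy to state: in a GitHub Flavored Markdown table, the delimiter row must contain exactly as many cells as the header row, and the old delimiter row carried an extra fourth cell, which breaks rendering. A small TypeScript sketch (a hypothetical checker, not part of this repo) that flags such rows:

```typescript
// Count the cells in a GFM table row such as "| a | b | c |".
function cellCount(row: string): number {
  // Strip the leading and trailing pipes, then split on pipes
  // (escaped pipes are ignored in this sketch).
  return row.trim().replace(/^\|/, "").replace(/\|$/, "").split("|").length;
}

// A delimiter row only renders as a table if its cell count matches the header's.
function delimiterMatchesHeader(header: string, delimiter: string): boolean {
  return cellCount(header) === cellCount(delimiter);
}

const header = "| Method | Type | Description |";
const broken = "| --- | --- | --- | --- |"; // extra fourth cell, as in the old docs
const fixed = "| --- | --- | --- |";

console.log(delimiterMatchesHeader(header, broken)); // false
console.log(delimiterMatchesHeader(header, fixed));  // true
```

Running such a check in docs CI would catch this class of regression before it reaches the published site.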

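For context, the table being fixed documents react-native-executorch's `LLMModule`. The sketch below is a stand-in modeled only on that table — the class, its echo reply, and the callback wiring are hypothetical, not the library's real implementation — to illustrate the documented contract of `sendMessage` and `deleteMessage`:

```typescript
type Message = { role: "user" | "assistant"; content: string };

// Hypothetical stand-in for the documented surface; the real LLMModule
// lives in react-native-executorch and drives a native runtime.
class LLMModuleStub {
  private history: Message[] = [];

  constructor(private messageHistoryCallback: (history: Message[]) => void) {}

  // Per the table: adds the user message, then the model response, fires
  // messageHistoryCallback with both, and returns the full history.
  async sendMessage(message: string): Promise<Message[]> {
    this.history.push({ role: "user", content: message });
    this.history.push({ role: "assistant", content: `echo: ${message}` }); // fake reply
    this.messageHistoryCallback(this.history);
    return this.history;
  }

  // Per the table: deletes all messages starting at position `index`,
  // then fires messageHistoryCallback with the new history.
  deleteMessage(index: number): void {
    this.history = this.history.slice(0, index);
    this.messageHistoryCallback(this.history);
  }
}

async function demo(): Promise<void> {
  const mod = new LLMModuleStub((h) => console.log(`history length: ${h.length}`));
  await mod.sendMessage("hi"); // prints "history length: 2"
  mod.deleteMessage(0);        // prints "history length: 0"
}

demo();
```

The stub only mirrors the shapes in the table above; consult the library's own docs for the real `load`/`configure` flow and native constraints.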