docs: update API reference with getPromptTokenCount and getTotalTokenCount (#789)
## Description
<!-- Provide a concise and descriptive summary of the changes
implemented in this PR. -->
### Introduces a breaking change?
- [ ] Yes
- [ ] No
### Type of change
- [ ] Bug fix (change which fixes an issue)
- [ ] New feature (change which adds functionality)
- [ ] Documentation update (improves or adds clarity to existing
documentation)
- [ ] Other (chores, tests, code style improvements etc.)
### Tested on
- [ ] iOS
- [ ] Android
### Testing instructions
<!-- Provide step-by-step instructions on how to test your changes.
Include setup details if necessary. -->
### Screenshots
<!-- Add screenshots here, if applicable -->
### Related issues
<!-- Link related issues here using #issue-number -->
### Checklist
- [ ] I have performed a self-review of my code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have updated the documentation accordingly
- [ ] My changes generate no new warnings
### Additional notes
<!-- Include any additional information, assumptions, or context that
reviewers might need to understand this PR. -->
Changed file: docs/docs/06-api-reference/classes/LLMModule.md (44 additions, 12 deletions)
@@ -1,6 +1,6 @@
 # Class: LLMModule

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:10](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L10)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:10](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L10)

 Module for managing a Large Language Model (LLM) instance.
@@ -10,7 +10,7 @@ Module for managing a Large Language Model (LLM) instance.
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:19](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L19)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:19](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L19)

 Creates a new instance of `LLMModule` with optional callbacks.
@@ -45,7 +45,7 @@ A new LLMModule instance.
 > **configure**(`configuration`): `void`

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:81](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L81)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:81](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L81)

 Configures chat, tool calling, and generation settings.
 See [Configuring the model](https://docs.swmansion.com/react-native-executorch/docs/hooks/natural-language-processing/useLLM#configuring-the-model) for details.
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:156](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L156)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:174](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L174)

 Method to delete the model from memory.
 Note that you cannot delete the model while it's generating.
@@ -84,7 +84,7 @@ You need to interrupt it first and make sure model stopped generation.
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:130](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L130)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:130](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L130)

 Deletes all messages starting with the message at position `index`.
 After deletion, it will call `messageHistoryCallback()` with the new history.
@@ -110,7 +110,7 @@ The index of the message to delete from history.
 > **forward**(`input`): `Promise`\<`string`\>

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:94](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L94)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:94](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L94)

 Runs model inference on a raw input string.
 You need to provide the entire conversation and prompt (in the correct format, with special tokens!) in the input string to this method.
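The `forward()` description above says the caller must render the whole conversation, special tokens included, into a single string. As a rough illustration only, here is a minimal sketch of such rendering using a hypothetical Llama-3-style chat template; the actual special tokens depend entirely on the model you load, not on this library:

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical chat template; real special tokens are model-specific.
function renderPrompt(messages: Message[]): string {
  const parts = messages.map(
    (m) => `<|start_header_id|>${m.role}<|end_header_id|>\n${m.content}<|eot_id|>`
  );
  // A trailing assistant header asks the model to continue as the assistant.
  return `<|begin_of_text|>${parts.join("")}<|start_header_id|>assistant<|end_header_id|>\n`;
}

const raw = renderPrompt([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hi!" },
]);
// `raw` is the kind of string you would pass to llm.forward(raw).
```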
@@ -137,7 +137,7 @@ The generated response as a string.
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:105](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L105)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:105](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L105)

 Runs the model to complete the chat passed in the `messages` argument. It doesn't manage conversation context.
@@ -167,7 +167,7 @@ The generated response as a string.
 > **getGeneratedTokenCount**(): `number`

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:147](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L147)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:147](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L147)

 Returns the number of tokens generated in the last response.
@@ -179,11 +179,43 @@ The count of generated tokens.
 ---

+### getPromptTokensCount()
+
+> **getPromptTokensCount**(): `number`
+
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:156](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L156)
+
+Returns the number of prompt tokens in the last message.
+
+#### Returns
+
+`number`
+
+The count of prompt tokens.
+
+---
+
+### getTotalTokensCount()
+
+> **getTotalTokensCount**(): `number`
+
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:165](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L165)
+
+Returns the total number of tokens from the previous generation. This is the sum of prompt tokens and generated tokens.
+
+#### Returns
+
+`number`
+
+The count of prompt and generated tokens.
+
+---

 ### interrupt()

 > **interrupt**(): `void`

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:138](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L138)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:138](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L138)

 Interrupts model generation. It may return one more token after the interrupt.
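The token counters documented in this diff are related: per the `getTotalTokensCount()` description, the total is the sum of prompt tokens and generated tokens. A self-contained sketch of that bookkeeping (a stand-in class for illustration, not the library's implementation) makes the invariant concrete:

```typescript
// Stand-in for the counters LLMModule exposes; illustrates the documented
// invariant: total tokens = prompt tokens + generated tokens.
class TokenStats {
  private promptTokens = 0;
  private generatedTokens = 0;

  // Record the counts from one generation pass.
  recordGeneration(promptTokens: number, generatedTokens: number): void {
    this.promptTokens = promptTokens;
    this.generatedTokens = generatedTokens;
  }

  getPromptTokensCount(): number { return this.promptTokens; }
  getGeneratedTokenCount(): number { return this.generatedTokens; }
  getTotalTokensCount(): number { return this.promptTokens + this.generatedTokens; }
}

const stats = new TokenStats();
stats.recordGeneration(42, 17);
console.log(stats.getTotalTokensCount()); // 59
```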
@@ -197,7 +229,7 @@ Interrupts model generation. It may return one more token after interrupt.
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:48](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L48)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:48](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L48)

 Loads the LLM model and tokenizer.
@@ -241,7 +273,7 @@ Optional callback to track download progress (value between 0 and 1).
-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:117](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L117)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:117](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L117)

 Method to add a user message to the conversation.
 After the model responds, it will call `messageHistoryCallback()` with both the user message and the model response.
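The `sendMessage` description above outlines a flow where the user turn is appended and, once the model responds, a history callback fires with the updated conversation. A rough in-memory stand-in (the callback shape here is an assumption, not copied from the library) sketches that flow:

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// In-memory stand-in for the documented sendMessage flow: append the user
// turn, produce a reply, then invoke the history callback with both.
class FakeChat {
  private history: ChatMessage[] = [];
  constructor(private onHistory: (h: ChatMessage[]) => void) {}

  sendMessage(text: string): void {
    this.history.push({ role: "user", content: text });
    // A real LLMModule would generate this reply; we fake a deterministic one.
    this.history.push({ role: "assistant", content: `echo: ${text}` });
    this.onHistory([...this.history]);
  }
}

let latest: ChatMessage[] = [];
const chat = new FakeChat((h) => { latest = h; });
chat.sendMessage("hello");
console.log(latest.length); // 2
```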
@@ -267,7 +299,7 @@ The message string to send.
 > **setTokenCallback**(`tokenCallback`): `void`

-Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:67](https://github.com/software-mansion/react-native-executorch/blob/4ee3121e1a18c982703726f1f72421920ed523a4/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L67)
+Defined in: [packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts:67](https://github.com/software-mansion/react-native-executorch/blob/345f048951557e9c3ca349383a9fe6e94974f3df/packages/react-native-executorch/src/modules/natural_language_processing/LLMModule.ts#L67)

 Sets a new token callback, invoked on every token batch.
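A token callback that fires per batch, as `setTokenCallback` is described above, is typically used to accumulate streamed output into the visible response. A minimal self-contained sketch (the string-batch callback shape is an assumption):

```typescript
// Accumulates streamed token batches into one string, the way a UI might
// consume a callback registered via setTokenCallback.
function makeAccumulator() {
  let text = "";
  return {
    onToken(tokenBatch: string): void { text += tokenBatch; },
    current(): string { return text; },
  };
}

const acc = makeAccumulator();
for (const batch of ["Hel", "lo ", "world"]) acc.onToken(batch);
console.log(acc.current()); // "Hello world"
```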