feat: added downloads tracking separate from huggingface #990
Changes from all commits: df8b3c7, e0211d9, 9342b56, 4932515, 01f841b, 8249cd1, d4cdf85, 5028267
@@ -0,0 +1,37 @@
---
title: Model Registry
---

The [Model Registry](/react-native-executorch/docs/next/api-reference/variables/MODEL_REGISTRY) is a collection of all pre-configured model definitions shipped with React Native ExecuTorch. Each entry contains the model's name and all source URLs needed to download and run it, so you don't have to manage URLs manually.

## Usage

```typescript
import { MODEL_REGISTRY, LLAMA3_2_1B } from 'react-native-executorch';
```

### Accessing a model directly

Every model config is exported as a standalone constant:

```typescript
import { LLAMA3_2_1B } from 'react-native-executorch';

const llm = useLLM({ model: LLAMA3_2_1B });
```

### Listing all models

Use `MODEL_REGISTRY` to discover and enumerate all available models:

```typescript
import { MODEL_REGISTRY } from 'react-native-executorch';

// Get all model names
const names = Object.values(MODEL_REGISTRY.ALL_MODELS).map((m) => m.modelName);

// Find models by name
const whisperModels = Object.values(MODEL_REGISTRY.ALL_MODELS).filter((m) =>
  m.modelName.includes('whisper')
);
```
@@ -976,3 +976,129 @@

````typescript
export const FSMN_VAD = {
  modelName: 'fsmn-vad',
  modelSource: FSMN_VAD_MODEL,
} as const;

/**
 * Registry of all available model configurations.
 *
 * Use this to discover and enumerate all models shipped with the library.
 * @example
 * ```ts
 * import { MODEL_REGISTRY } from 'react-native-executorch';
 *
 * // List all model names
 * const names = Object.values(MODEL_REGISTRY.ALL_MODELS).map(m => m.modelName);
 *
 * // Find models by name substring
 * const whisperModels = Object.values(MODEL_REGISTRY.ALL_MODELS)
 *   .filter(m => m.modelName.includes('whisper'));
 * ```
 * @category Utils
 */
export const MODEL_REGISTRY = {
````
> **Collaborator:** Do you think an additional backend layer here would be useful? For now, if a model is exported to CoreML, we silently make the user download it. And the user may wish to run the XNNPack model on both platforms for whatever reason.
>
> **Collaborator:** I'm fine with this structure to keep it simple for now, but I wonder if it's something we want to add in future releases.
>
> **Member:** I think we should rather have models named
>
> **Collaborator (Author):** Just like you said @chmjkb, this is simple for now. I can even remove the type classification (the main reason for this basic registry is that I need it right now to send the proper model name to the database), and that might actually be the correct way to go, as it would allow us to extend it later without any breaking changes or backward-compatibility worries.
>
> **Member:** Agreed.
```typescript
  ALL_MODELS: {
    LLAMA3_2_3B,
    LLAMA3_2_3B_QLORA,
    LLAMA3_2_3B_SPINQUANT,
    LLAMA3_2_1B,
    LLAMA3_2_1B_QLORA,
    LLAMA3_2_1B_SPINQUANT,
    QWEN3_0_6B,
    QWEN3_0_6B_QUANTIZED,
    QWEN3_1_7B,
    QWEN3_1_7B_QUANTIZED,
    QWEN3_4B,
    QWEN3_4B_QUANTIZED,
    HAMMER2_1_0_5B,
    HAMMER2_1_0_5B_QUANTIZED,
    HAMMER2_1_1_5B,
    HAMMER2_1_1_5B_QUANTIZED,
    HAMMER2_1_3B,
    HAMMER2_1_3B_QUANTIZED,
    SMOLLM2_1_135M,
    SMOLLM2_1_135M_QUANTIZED,
    SMOLLM2_1_360M,
    SMOLLM2_1_360M_QUANTIZED,
    SMOLLM2_1_1_7B,
    SMOLLM2_1_1_7B_QUANTIZED,
    QWEN2_5_0_5B,
    QWEN2_5_0_5B_QUANTIZED,
    QWEN2_5_1_5B,
    QWEN2_5_1_5B_QUANTIZED,
    QWEN2_5_3B,
    QWEN2_5_3B_QUANTIZED,
    PHI_4_MINI_4B,
    PHI_4_MINI_4B_QUANTIZED,
    LFM2_5_1_2B_INSTRUCT,
    LFM2_5_1_2B_INSTRUCT_QUANTIZED,
    LFM2_VL_1_6B_QUANTIZED,
    EFFICIENTNET_V2_S,
    EFFICIENTNET_V2_S_QUANTIZED,
    SSDLITE_320_MOBILENET_V3_LARGE,
    RF_DETR_NANO,
    STYLE_TRANSFER_CANDY,
    STYLE_TRANSFER_CANDY_QUANTIZED,
    STYLE_TRANSFER_MOSAIC,
    STYLE_TRANSFER_MOSAIC_QUANTIZED,
    STYLE_TRANSFER_RAIN_PRINCESS,
    STYLE_TRANSFER_RAIN_PRINCESS_QUANTIZED,
    STYLE_TRANSFER_UDNIE,
    STYLE_TRANSFER_UDNIE_QUANTIZED,
    WHISPER_TINY_EN,
    WHISPER_TINY_EN_QUANTIZED,
    WHISPER_BASE_EN,
    WHISPER_BASE_EN_QUANTIZED,
    WHISPER_SMALL_EN,
    WHISPER_SMALL_EN_QUANTIZED,
    WHISPER_TINY,
    WHISPER_BASE,
    WHISPER_SMALL,
    DEEPLAB_V3_RESNET50,
    DEEPLAB_V3_RESNET101,
    DEEPLAB_V3_MOBILENET_V3_LARGE,
    LRASPP_MOBILENET_V3_LARGE,
    FCN_RESNET50,
    FCN_RESNET101,
    DEEPLAB_V3_RESNET50_QUANTIZED,
    DEEPLAB_V3_RESNET101_QUANTIZED,
    DEEPLAB_V3_MOBILENET_V3_LARGE_QUANTIZED,
    LRASPP_MOBILENET_V3_LARGE_QUANTIZED,
    FCN_RESNET50_QUANTIZED,
    FCN_RESNET101_QUANTIZED,
    SELFIE_SEGMENTATION,
    YOLO26N_SEG,
    YOLO26S_SEG,
    YOLO26M_SEG,
    YOLO26L_SEG,
    YOLO26X_SEG,
    RF_DETR_NANO_SEG,
    CLIP_VIT_BASE_PATCH32_IMAGE,
    CLIP_VIT_BASE_PATCH32_IMAGE_QUANTIZED,
    ALL_MINILM_L6_V2,
    ALL_MPNET_BASE_V2,
    MULTI_QA_MINILM_L6_COS_V1,
    MULTI_QA_MPNET_BASE_DOT_V1,
    CLIP_VIT_BASE_PATCH32_TEXT,
    BK_SDM_TINY_VPRED_512,
    BK_SDM_TINY_VPRED_256,
    FSMN_VAD,
  },
} as const;
```
```typescript
const urlToModelName = new Map<string, string>();
for (const config of Object.values(MODEL_REGISTRY.ALL_MODELS)) {
  const modelName = config.modelName;
  for (const [key, value] of Object.entries(config)) {
    if (key !== 'modelName' && typeof value === 'string') {
      urlToModelName.set(value, modelName);
    }
  }
}

/**
 * Looks up the model name for a given source URL.
 * @param url - The source URL to look up.
 * @returns The model name if found, otherwise undefined.
 */
export function getModelNameForUrl(url: string): string | undefined {
  return urlToModelName.get(url);
}
```
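The URL-to-name reverse index above can be sketched in isolation. The two sample configs below are made-up placeholders, not real registry entries; only the map-building and lookup logic mirrors the diff.

```typescript
// Sketch of the URL -> model-name reverse index from this diff.
// EXAMPLE_LLM / EXAMPLE_VAD and their URLs are hypothetical stand-ins.
const SAMPLE_MODELS = {
  EXAMPLE_LLM: {
    modelName: 'example-llm',
    modelSource: 'https://example.com/models/example-llm.pte',
    tokenizerSource: 'https://example.com/models/example-llm-tokenizer.json',
  },
  EXAMPLE_VAD: {
    modelName: 'example-vad',
    modelSource: 'https://example.com/models/example-vad.pte',
  },
} as const;

const urlToName = new Map<string, string>();
for (const config of Object.values(SAMPLE_MODELS)) {
  // Every string field other than modelName is treated as a source URL.
  for (const [key, value] of Object.entries(config)) {
    if (key !== 'modelName' && typeof value === 'string') {
      urlToName.set(value, config.modelName);
    }
  }
}

function getModelNameForUrl(url: string): string | undefined {
  return urlToName.get(url);
}
```

Because the map is keyed on every source URL (model binary, tokenizer, and so on), any URL a download manager sees can be traced back to a single model name without extra bookkeeping.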
@@ -0,0 +1,2 @@

```typescript
export const DOWNLOAD_EVENT_ENDPOINT =
  'https://ai.swmansion.com/telemetry/downloads/api/downloads';
```
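A reporting call against this endpoint might look like the sketch below. The endpoint URL comes from the diff, but the payload shape (a JSON body with a `modelName` field) and the `reportDownload` helper are assumptions for illustration, not the library's actual event format.

```typescript
// Endpoint added in this PR; the event payload below is hypothetical.
const DOWNLOAD_EVENT_ENDPOINT =
  'https://ai.swmansion.com/telemetry/downloads/api/downloads';

// Pure helper so the request can be inspected without touching the network.
function buildDownloadEvent(modelName: string): { url: string; body: string } {
  return {
    url: DOWNLOAD_EVENT_ENDPOINT,
    body: JSON.stringify({ modelName }),
  };
}

async function reportDownload(modelName: string): Promise<void> {
  const event = buildDownloadEvent(modelName);
  try {
    await fetch(event.url, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: event.body,
    });
  } catch {
    // Tracking is best-effort; a failed report must never break the download.
  }
}
```

Keeping payload construction separate from the network call makes the tracking logic testable, and swallowing fetch errors matches the PR's intent that telemetry stays invisible to the user.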