Commit f8eb537

Fix smollm2 wiring: point at SmolLM2 v2 repo and config
Agent-Logs-Url: https://github.com/pytorch/executorch/sessions/bf2a52e4-8d29-4371-8a0e-b4c5cfe98be0
Co-authored-by: kirklandsign <107070759+kirklandsign@users.noreply.github.com>
1 parent 1d3dfdc commit f8eb537

2 files changed: 2 additions & 2 deletions

examples/models/llama/export_llama_lib.py

Lines changed: 1 addition & 1 deletion
@@ -123,7 +123,7 @@
     "qwen2_5_1_5b": "Qwen/Qwen2.5-1.5B",
     "qwen2_5_coder_32b": "Qwen/Qwen2.5-Coder-32B-Instruct",
     "phi_4_mini": "microsoft/Phi-4-mini-instruct",
-    "smollm2": "HuggingFaceTB/SmolLM-135M",
+    "smollm2": "HuggingFaceTB/SmolLM2-135M",
     "qwen3_0_6b": "Qwen/Qwen3-0.6B",
     "qwen3_1_7b": "Qwen/Qwen3-1.7B",
     "qwen3_4b": "Qwen/Qwen3-4B",

examples/models/smollm2/135M_config.json

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
   "n_kv_heads": 3,
   "n_layers": 30,
   "norm_eps": 1e-05,
-  "rope_theta": 10000.0,
+  "rope_theta": 100000.0,
   "use_scaled_rope": false,
   "vocab_size": 49152,
   "use_hf_rope": false,
