
Fix smollm2 alias to point at SmolLM2-135M (v2) instead of SmolLM-135M (v1)#18859

Merged
kirklandsign merged 2 commits into main from copilot/fix-smollm2-alias-pointing on Apr 24, 2026

Conversation

Contributor

Copilot AI commented Apr 13, 2026

Summary

The original SmolLM2 PR (#9354) started out as SmolLM v1 support and was renamed to smollm2 during review, but the repo ID and rope_theta were never updated to the v2 values. The two checkpoints are genuinely different models (0/272 tensors match).

  • HUGGING_FACE_REPO_IDS["smollm2"]: HuggingFaceTB/SmolLM-135M → HuggingFaceTB/SmolLM2-135M
  • examples/models/smollm2/135M_config.json: rope_theta 10000.0 → 100000.0 (matches the SmolLM2-135M HF config)

Test plan

Data-only change (one string, one number). Verified values match the upstream HuggingFace SmolLM2-135M config.
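
A minimal sketch of the two data-only changes, for readers skimming the diff. The config file on disk is JSON; it is shown here as a Python dict for brevity, and the surrounding code in export_llama_lib.py is assumed rather than copied:

```python
# examples/models/llama/export_llama_lib.py (sketch; surrounding code assumed)
HUGGING_FACE_REPO_IDS = {
    # ...
    # was "HuggingFaceTB/SmolLM-135M" (SmolLM v1)
    "smollm2": "HuggingFaceTB/SmolLM2-135M",  # SmolLM2 v2 checkpoint
}

# examples/models/smollm2/135M_config.json, rendered as a Python dict for brevity
SMOLLM2_135M_CONFIG = {
    "dim": 576,
    "n_heads": 9,
    "n_kv_heads": 3,
    "n_layers": 30,
    "vocab_size": 49152,
    "rope_theta": 100000.0,  # was 10000.0 (SmolLM v1 value)
}
```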


pytorch-bot Bot commented Apr 13, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18859

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

✅ You can merge normally! (2 Unrelated Failures)

As of commit f8eb537 with merge base fe71bd4:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

meta-cla Bot added the CLA Signed label on Apr 13, 2026
@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

Agent-Logs-Url: https://github.com/pytorch/executorch/sessions/bf2a52e4-8d29-4371-8a0e-b4c5cfe98be0

Co-authored-by: kirklandsign <107070759+kirklandsign@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix smollm2 alias pointing to correct model" to "Fix smollm2 alias to point at SmolLM2-135M (v2) instead of SmolLM-135M (v1)" on Apr 13, 2026
Copilot AI requested a review from kirklandsign on April 13, 2026 23:52
kirklandsign marked this pull request as ready for review on April 24, 2026 00:13
kirklandsign requested a review from lucylq as a code owner on April 24, 2026 00:13
Copilot AI review requested due to automatic review settings on April 24, 2026 00:13
Contributor

Copilot AI left a comment


Pull request overview

This PR corrects the smollm2 model aliasing in the LLaMA export tooling so it points to the SmolLM2 v2 checkpoint and uses the matching RoPE theta value, aligning ExecuTorch’s built-in smollm2 configuration with the upstream HuggingFace SmolLM2-135M settings.

Changes:

  • Update HUGGING_FACE_REPO_IDS["smollm2"] to HuggingFaceTB/SmolLM2-135M.
  • Update SmolLM2 135M example config rope_theta to 100000.0.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| examples/models/smollm2/135M_config.json | Updates RoPE theta to match the SmolLM2-135M upstream config. |
| examples/models/llama/export_llama_lib.py | Fixes the smollm2 HF repo ID used for auto-download/convert when no checkpoint is provided. |
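
For context, a minimal sketch of how an alias-to-repo-ID lookup like this is typically consumed when no checkpoint is provided. The actual download/convert logic in export_llama_lib.py is not reproduced here; the helper name is hypothetical and the huggingface_hub usage is an assumption:

```python
from huggingface_hub import snapshot_download

# Alias table as fixed by this PR (only the relevant entry shown).
HUGGING_FACE_REPO_IDS = {
    "smollm2": "HuggingFaceTB/SmolLM2-135M",
}

def resolve_checkpoint(model_alias: str, checkpoint_path: str | None = None) -> str:
    """Hypothetical helper: fall back to the HF snapshot when no checkpoint is given."""
    if checkpoint_path is not None:
        return checkpoint_path
    repo_id = HUGGING_FACE_REPO_IDS[model_alias]
    # Downloads the upstream repo (or reuses the local cache) and returns
    # the directory containing its weights and config.
    return snapshot_download(repo_id=repo_id)
```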


@kirklandsign
Contributor

@claude review


claude Bot commented Apr 24, 2026

Claude finished @kirklandsign's task in 49s.


Review of PR #18859

  • Read changed files and diff
  • Verify changes against upstream HuggingFace config
  • Post review feedback

LGTM. This is a straightforward data-only fix and the changes are correct.

I verified both values against the upstream HuggingFace configs:

| Field | Before (SmolLM v1) | After (SmolLM2 v2) | Upstream SmolLM2-135M |
| --- | --- | --- | --- |
| Repo ID | HuggingFaceTB/SmolLM-135M | HuggingFaceTB/SmolLM2-135M | ✅ matches |
| rope_theta | 10000.0 | 100000.0 | ✅ matches |

All other config values in 135M_config.json (dim: 576, n_heads: 9, n_kv_heads: 3, n_layers: 30, vocab_size: 49152, etc.) already match the upstream SmolLM2-135M config, so no further changes are needed.
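
A minimal sketch of how that comparison could be reproduced locally (assumes huggingface_hub is installed and that both repos follow the Llama config schema, which exposes rope_theta; this snippet is not part of the PR):

```python
import json

from huggingface_hub import hf_hub_download

def load_hf_config(repo_id: str) -> dict:
    """Download and parse config.json from a Hugging Face model repo."""
    path = hf_hub_download(repo_id=repo_id, filename="config.json")
    with open(path) as f:
        return json.load(f)

v1 = load_hf_config("HuggingFaceTB/SmolLM-135M")
v2 = load_hf_config("HuggingFaceTB/SmolLM2-135M")

# Expected per this PR: 10000.0 for SmolLM v1 vs 100000.0 for SmolLM2.
print("v1 rope_theta:", v1.get("rope_theta"))
print("v2 rope_theta:", v2.get("rope_theta"))
```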

No issues found.


kirklandsign merged commit 75b31bb into main on Apr 24, 2026
170 of 172 checks passed
kirklandsign deleted the copilot/fix-smollm2-alias-pointing branch on April 24, 2026 00:20

Labels

CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Should "smollm2" alias point at HuggingFaceTB/SmolLM-135M (v1) or SmolLM2-135M?

4 participants