Commit 634f858

kajalj22 and claude committed
Add onnxscript to mcore extra dependencies
The removed Megatron-LM-workspace/setup.py directly listed onnxscript as a dependency. With that workspace gone, onnxscript is only available under the megatron-core[dev] and megatron-core[lts] extras, but the mcore extra installs plain megatron-core. transformer_engine imports onnxscript at module load time, causing --mcore-only tests to fail with ModuleNotFoundError.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Signed-off-by: Kajal Jain <kajalj@nvidia.com>
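The failure mode described above can be sketched as follows. This is an illustrative snippet, not part of the commit: `module_available` is a hypothetical helper, and the point is only that a load-time import of a missing package raises ModuleNotFoundError before any test body runs.

```python
import importlib.util


def module_available(name: str) -> bool:
    """Return True if `name` can be resolved on the import path.

    Hypothetical helper for illustration; not part of the commit.
    """
    return importlib.util.find_spec(name) is not None


# transformer_engine imports onnxscript at module load time, so in an
# environment where onnxscript is absent, `import transformer_engine`
# itself raises ModuleNotFoundError -- the test never gets to run.
if not module_available("onnxscript"):
    print("importing transformer_engine here would raise ModuleNotFoundError")
```

Adding onnxscript to the mcore extra makes it resolvable in exactly the environments that import transformer_engine, which is why the fix lands in pyproject.toml rather than in the tests.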
1 parent: 8e7eca4

2 files changed: 10 additions & 7 deletions


pyproject.toml

Lines changed: 1 addition & 0 deletions
@@ -123,6 +123,7 @@ mcore = [
     "transformer-engine[pytorch,core_cu13] @ git+https://github.com/NVIDIA/TransformerEngine.git@v2.14.1",
     "megatron-core",
     "megatron-bridge",
+    "onnxscript",
     # Flash-attn version should be selected to satisfy both TE + vLLM requirements (xformers in particular)
     # https://github.com/NVIDIA/TransformerEngine/blob/v2.3/transformer_engine/pytorch/attention/dot_product_attention/utils.py#L108
     # https://github.com/facebookresearch/xformers/blob/8354497deb2c04c67fbb2e2ad911e86530da0e90/xformers/ops/fmha/flash.py#L76

uv.lock

Lines changed: 9 additions & 7 deletions
(Lockfile regenerated to include onnxscript; generated file, diff not shown.)

0 commit comments
