
Commit c35b81f

fix: incorrect check on fast moe activation

Signed-off-by: Mehant Kammakomati <mehant.kammakomati2@ibm.com>

1 parent: 3c91290

1 file changed: 1 addition, 1 deletion

tuning/sft_trainer.py
@@ -168,7 +168,7 @@ def train(
                 "Trainer should not perform packing when using `--padding_free`"
             )
 
-    if fast_moe_config is not None:
+    if fast_moe_config is not None and fast_moe_config.fast_moe is not None:
         # Checking for unsupported modules with Scatter MoE for LoRA
         # Only raise an error for `all-linear`
         restricted_modules = ["all-linear"]
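The fix tightens the activation guard: a non-`None` config object is not enough, because its `fast_moe` field may itself be unset. A minimal sketch of the pattern is below; `FastMoe`, `FastMoeConfig`, and `should_activate_fast_moe` are simplified stand-ins invented here for illustration, not the actual fms-hf-tuning dataclasses.

```python
# Hypothetical, simplified stand-ins for the real config dataclasses.
from dataclasses import dataclass
from typing import Optional


@dataclass
class FastMoe:
    ep_degree: int = 1


@dataclass
class FastMoeConfig:
    fast_moe: Optional[FastMoe] = None


def should_activate_fast_moe(cfg: Optional[FastMoeConfig]) -> bool:
    # Before the fix, `cfg is not None` alone would treat a config whose
    # `fast_moe` field is unset as a request to enable fast MoE.
    return cfg is not None and cfg.fast_moe is not None


print(should_activate_fast_moe(None))                       # False
print(should_activate_fast_moe(FastMoeConfig()))            # False: field unset
print(should_activate_fast_moe(FastMoeConfig(FastMoe())))   # True
```

Checking both levels means the fast-MoE code path (and its module restrictions, such as rejecting `all-linear` with LoRA) only runs when the feature is actually configured.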
