Commit 47a036a

fix: incorrect check on fast moe activation

Signed-off-by: Mehant Kammakomati <mehant.kammakomati2@ibm.com>

1 parent 3c91290 commit 47a036a

1 file changed: tuning/sft_trainer.py (2 additions, 1 deletion)
@@ -167,7 +167,8 @@ def train(
             "`--padding_free` argument was called with `packing=True`, "
             "Trainer should not perform packing when using `--padding_free`"
         )
-
+    if fast_moe_config is not None and fast_moe_config.fast_moe is None:
+        fast_moe_config = None
     if fast_moe_config is not None:
         # Checking for unsupported modules with Scatter MoE for LoRA
         # Only raise an error for `all-linear`
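The fix normalizes the config before the activation check: a `fast_moe_config` object can exist even when its inner `fast_moe` field was never set, so testing only `fast_moe_config is not None` would wrongly activate the fast-MoE path. A minimal sketch of that normalization, using a hypothetical `FastMoeConfig` dataclass as a stand-in for the real config type (only the `fast_moe` attribute name comes from the diff):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class FastMoeConfig:
    # Hypothetical stand-in: None means fast MoE was never requested.
    fast_moe: Optional[int] = None


def normalize(fast_moe_config: Optional[FastMoeConfig]) -> Optional[FastMoeConfig]:
    # A config object may be constructed by the argument parser even when
    # the user never enabled fast MoE, so check the inner field, not just
    # the presence of the object (this mirrors the two added lines above).
    if fast_moe_config is not None and fast_moe_config.fast_moe is None:
        fast_moe_config = None
    return fast_moe_config


# Object exists but the flag is unset -> treated as disabled.
assert normalize(FastMoeConfig()) is None
# No config at all -> stays disabled.
assert normalize(None) is None
# Flag actually set -> config is kept and the fast-MoE path activates.
assert normalize(FastMoeConfig(fast_moe=2)) is not None
```

With this normalization in place, the existing `if fast_moe_config is not None:` guard that follows it only fires when fast MoE was genuinely requested.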
