Commit 8c9c512 (parent: 3c91290)

fix: incorrect check on fast moe activation

Signed-off-by: Mehant Kammakomati <mehant.kammakomati2@ibm.com>

1 file changed: tuning/sft_trainer.py (3 additions, 2 deletions)
@@ -167,8 +167,9 @@ def train(
             "`--padding_free` argument was called with `packing=True`, "
             "Trainer should not perform packing when using `--padding_free`"
         )
-
-    if fast_moe_config is not None:
+    if fast_moe_config is not None and fast_moe_config.fast_moe is None:
+        fast_moe_config = None
+    if fast_moe_config is not None and fast_moe_config.fast_moe is not None:
         # Checking for unsupported modules with Scatter MoE for LoRA
         # Only raise an error for `all-linear`
         restricted_modules = ["all-linear"]
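The bug the diff fixes is that a config object can exist while its `fast_moe` field is unset, so checking only `fast_moe_config is not None` wrongly takes the fast-MoE path. Below is a minimal, self-contained sketch of that normalize-then-check pattern; `FastMoeConfig` and `should_activate_fast_moe` are hypothetical stand-ins, not the repository's actual classes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FastMoeConfig:
    # Hypothetical stand-in: the real dataclass lives in the tuning repo.
    # `fast_moe` being None means the feature was never requested.
    fast_moe: Optional[int] = None

def should_activate_fast_moe(cfg: Optional[FastMoeConfig]) -> bool:
    # Old check was just `cfg is not None`, which returns True even for
    # an "empty" config whose fast_moe field is None.
    # Fixed logic mirrors the diff: normalize an empty config to None,
    # then require both the config and its fast_moe field to be set.
    if cfg is not None and cfg.fast_moe is None:
        cfg = None
    return cfg is not None and cfg.fast_moe is not None

print(should_activate_fast_moe(None))                     # False
print(should_activate_fast_moe(FastMoeConfig()))          # False (old check: True)
print(should_activate_fast_moe(FastMoeConfig(fast_moe=2)))  # True
```

The two-step shape (nulling out the empty config before the real check) keeps any later `fast_moe_config is not None` guards in the function consistent with a single source of truth.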
