Commit 1759a2f: add comment

Signed-off-by: Will Johnson <mwjohnson728@gmail.com>
Parent: 4e8d774

1 file changed: tuning/sft_trainer.py (3 additions & 1 deletion)
@@ -387,7 +387,9 @@ def train(
     model, (peft_config,) = framework.augmentation(
         model, train_args, modifiable_args=(peft_config,)
     )
-    # For LoRa ScatterMoE, disable grad for ScatterMoE
+    # HACK - For LoRa ScatterMoE, disable grad for ScatterMoE.
+    # In the future, requires_grad should be enabled for LoRA tuning
+    # with ScatterMoE and this code should be removed.
     if peft_config is not None:
         for module in model.modules():
             # Use string comparison to check if ScatterMoE module
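The change above sits in a code path that freezes ScatterMoE parameters during LoRA tuning by walking the model's modules and matching on the class name (avoiding a hard import of the ScatterMoE class). A minimal sketch of that pattern follows; the toy `ScatterMoE` stand-in and the two-layer model are assumptions for illustration, not the actual code in tuning/sft_trainer.py:

```python
import torch.nn as nn

class ScatterMoE(nn.Module):
    """Stand-in for the real ScatterMoE block (assumption for this sketch)."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(4, 4)

model = nn.Sequential(nn.Linear(4, 4), ScatterMoE())

peft_config = object()  # stand-in: a PEFT/LoRA config is present
if peft_config is not None:
    for module in model.modules():
        # Use string comparison to check if this is a ScatterMoE module,
        # so we don't need to import the ScatterMoE class directly.
        if module.__class__.__name__ == "ScatterMoE":
            for param in module.parameters():
                param.requires_grad = False
```

After this loop, only the ScatterMoE block's parameters are frozen; the rest of the model still receives gradients, which is what the HACK comment says should eventually change once LoRA tuning supports ScatterMoE.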
