
fix optimizer number of steps #3428

Closed
entrpn wants to merge 9 commits into main from optimizer_ga_fix

Conversation

entrpn (Collaborator) commented Mar 16, 2026

Description

Fixes the number of optimizer steps that are set when using gradient accumulation with optax.MultiSteps.

By using student_config.steps, we ensure that the MaxText optimizer, the LR scheduler, and optax.MultiSteps stay in sync; the only difference is how many backward passes are performed, based on gradient_accumulation. For example (see the sketch after this list):

  • ga=1, 5k steps: 5k backward passes.
  • ga=2, 5k steps: 5k/2 backward passes.

The number of samples seen by the model stays exactly the same.
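
Independent of this PR's exact step accounting, here is a minimal standalone sketch of the optax.MultiSteps mechanism the fix relies on (not the PR's actual code; `STEPS` and `GA` are hypothetical stand-ins for student_config.steps and gradient_accumulation). The point it illustrates is that the inner optimizer and its LR schedule advance only once per `GA` accumulated gradients, so sizing the schedule by the optimizer-step budget keeps all three counters consistent:

```python
import jax.numpy as jnp
import optax

STEPS = 10  # hypothetical stand-in for student_config.steps
GA = 2      # hypothetical stand-in for gradient_accumulation

# Size the LR schedule by the optimizer-step budget, not by backward passes.
schedule = optax.linear_schedule(init_value=1e-3, end_value=0.0, transition_steps=STEPS)
optimizer = optax.MultiSteps(optax.sgd(learning_rate=schedule), every_k_schedule=GA)

params = jnp.ones((3,))
opt_state = optimizer.init(params)

# Each iteration is one backward pass (a MultiSteps "mini-step"); the inner
# optimizer and its schedule advance only once every GA mini-steps.
for _ in range(STEPS * GA):
    grads = jnp.ones_like(params)  # placeholder gradient
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)

assert opt_state.gradient_step == STEPS  # optimizer steps match the budget
```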

Tests

Ran (example invocation below):

  • tests.unit.distillation_checkpointing_test
  • tests.unit.train_distill_test
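
One way to run these from the repo root, assuming unittest-style tests and installed test dependencies (the PR does not specify the exact runner):

```shell
python3 -m unittest tests.unit.distillation_checkpointing_test tests.unit.train_distill_test
```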

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

codecov Bot commented Mar 16, 2026

entrpn force-pushed the optimizer_ga_fix branch from fa1d7e7 to 735732b on March 17, 2026 at 21:15
