Implementation for soft offline distillation using saved top-k teacher logits #3382

Merged

copybara-service[bot] merged 9 commits into main from ajkv/offline-distillation-soft on Mar 20, 2026
Conversation

@ajkv-google (Collaborator) commented Mar 11, 2026

Description

This PR introduces an end-to-end offline distillation training pipeline. Previously, the distillation loop executed in an "online" mode that required both the frozen Teacher model and the learning Student model to be loaded and run simultaneously during training. With this change, the trainer can instead load pre-computed top-K Teacher logits from .array_record files, bypassing the Teacher forward pass during the training loop.
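
To make the mechanism concrete, below is a minimal sketch of a soft distillation loss computed from saved top-k teacher logits. The shapes and field names are illustrative assumptions, not this PR's actual record schema: it assumes each record stores the teacher's top-k logit values along with their vocabulary indices, and it renormalizes over the retained top-k slice since the full teacher distribution is not saved.

```python
# Sketch: soft distillation loss over saved top-k teacher logits.
# Field names and shapes are illustrative; the real schema lives in this PR.
import jax
import jax.numpy as jnp


def topk_distillation_loss(student_logits, teacher_topk_logits, teacher_topk_ids, temperature=1.0):
  """KL(teacher || student), restricted to the teacher's saved top-k slice.

  student_logits:      [batch, seq, vocab] full student logits
  teacher_topk_logits: [batch, seq, k]     saved top-k teacher logit values
  teacher_topk_ids:    [batch, seq, k]     vocabulary ids of those values
  """
  # Pick out the student logits at the teacher's top-k vocabulary positions.
  student_topk = jnp.take_along_axis(student_logits, teacher_topk_ids, axis=-1)
  # Renormalize over the k retained entries; this approximates the full
  # softmax because probability mass outside the top-k was not saved.
  teacher_probs = jax.nn.softmax(teacher_topk_logits / temperature, axis=-1)
  teacher_logp = jax.nn.log_softmax(teacher_topk_logits / temperature, axis=-1)
  student_logp = jax.nn.log_softmax(student_topk / temperature, axis=-1)
  # Per-token KL divergence, averaged over batch and sequence.
  return jnp.mean(jnp.sum(teacher_probs * (teacher_logp - student_logp), axis=-1))
```

Because the teacher term is read from disk rather than computed, the only model executed per step is the student, which is the cost saving this PR targets.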

Tests

Tested this change by running the following command (using the YAML config for offline distillation):

python3 src/maxtext/trainers/post_train/distillation/train_distill.py src/maxtext/configs/post_train/distillation.yml steps=100 tokenizer_path="/mnt/ajkv/disks/codebase/maxtext/src/maxtext/assets/tokenizers/tokenizer_llama3.tiktoken"

Truncated output showing the successful run: https://paste.googleplex.com/4879271282737152.

Verified that training ran successfully and the distillation run completed.

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have added necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov bot commented Mar 11, 2026

Codecov Report

❌ Patch coverage is 45.76271% with 32 lines in your changes missing coverage. Please review.

Files with missing lines                               | Patch %  | Lines
...ners/post_train/distillation/distillation_utils.py | 28.12%   | 23 missing ⚠️
.../trainers/post_train/distillation/train_distill.py | 66.66%   | 8 missing, 1 partial ⚠️


Resolved comment threads (outdated):
  • src/maxtext/trainers/post_train/distillation/distillation_utils.py
  • src/maxtext/trainers/post_train/distillation/train_distill.py
@vlad-karp (Collaborator) left a comment

LGTM overall. Needs a new unit test for this specific path.

Comment threads:
  • src/maxtext/configs/types.py (outdated)
  • src/maxtext/trainers/post_train/distillation/train_distill.py
@ajkv-google (Collaborator, Author) commented Mar 13, 2026

Added unit tests to verify that in offline mode only the student model is loaded, while in online mode both the student and teacher models are loaded. Below are the commands I used to run each unit test (a sketch of the offline-mode test follows the list):

Test offline distillation:

  • pytest tests/post_training/unit/train_distill_test.py -k "test_main_offline_mode_skips_teacher_loading"

Test online distillation:

  • pytest tests/post_training/unit/train_distill_test.py -k "test_main_online_mode_loads_teacher"
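
For illustration, here is a minimal sketch of what the offline-mode test could look like, using unittest.mock to assert the teacher loader is never invoked. The import path, config field, loader name, and main() signature are assumptions for this sketch; the real test lives in tests/post_training/unit/train_distill_test.py.

```python
# Illustrative sketch; names below are assumptions, not the actual test code.
from unittest import mock

from maxtext.trainers.post_train.distillation import train_distill  # assumed import path


def test_main_offline_mode_skips_teacher_loading():
  config = mock.Mock()
  config.distillation_mode = "offline"  # assumed config field name
  # Patch the (assumed) teacher loader so we can assert it is never called.
  with mock.patch.object(train_distill, "load_teacher_model") as load_teacher:
    train_distill.main(config)
    # In offline mode the teacher forward pass is replaced by saved top-k
    # logits, so the teacher checkpoint should never be loaded.
    load_teacher.assert_not_called()
```

The online-mode counterpart would patch the same loader and assert it was called exactly once.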

@ajkv-google force-pushed the ajkv/offline-distillation-soft branch from 44c0cf6 to beb19b9 on March 19, 2026 at 22:13
@copybara-service bot merged commit b723c4e into main on Mar 20, 2026
31 of 32 checks passed
@copybara-service bot deleted the ajkv/offline-distillation-soft branch on March 20, 2026 at 17:44