Commit 01b767f
Feat: Context Parallel for Eagle3 Training (#745)
**Type of change:** New Feature
**Overview:**
- Added Context Parallel (CP) support by patching torch ring attention (a minimal sketch of the mechanism follows the usage example below).
- Requires the following library versions for stable CP:
  - torch 2.8.0
  - transformers 5.0.0
  - accelerate 1.12.0
- Moved to FSDP2 (see the sharding sketch after the results tables).
- Removed unused arguments from the training script (`--multi_gpu`, `fsdp_wrap_layer`).
- Bumped the CI container to `nvcr.io/nvidia/pytorch:25.08-py3`.
```bash
./launch_train.sh --model $MODEL \
--output_dir $OUTPUT_DIR \
--data $DATA \
--num_epochs 0.1 \
--train_bs 1 \
--eagle_config eagle_config.json \
--training_seq_len 1024 \
    --cp_size 2  # newly added
```
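Under the hood, CP splits the sequence dimension across ranks and runs ring attention over the shards. Below is a minimal sketch of that mechanism using torch's experimental `context_parallel` API; the mesh size, tensor shapes, and launch setup are illustrative assumptions, not the PR's actual patch (which lives in the modelopt tree):

```python
# Illustrative sketch only: torch's experimental context-parallel API, which
# shards sequence-dim buffers across a DeviceMesh and swaps SDPA for ring
# attention. Shapes and mesh layout here are assumptions for demonstration.
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.tensor.experimental import context_parallel

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank())
cp_size = dist.get_world_size()  # e.g. 2, matching --cp_size above
mesh = init_device_mesh("cuda", (cp_size,), mesh_dim_names=("cp",))

# Toy (B, H, S, D) attention inputs; shapes are illustrative only.
q = torch.randn(1, 8, 1024, 64, device="cuda", requires_grad=True)
k, v = torch.randn_like(q), torch.randn_like(q)

# Inside the context each rank keeps a 1/cp_size shard of the sequence dim,
# and scaled_dot_product_attention runs ring attention across the mesh.
with context_parallel(mesh, buffers=[q, k, v], buffer_seq_dims=[2, 2, 2]):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```

Launch with e.g. `torchrun --nproc_per_node=2` so that each rank holds one CP shard.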
- SDPA-level correctness: tested TTT attention with and without CP; relative diff < 1%
```
=== Compare context-parallel (CP) outputs and grads with non-CP ===
Forward output comparison (CP vs Non-CP):
Absolute diff (adiff) cp_out vs out: 0.001953125
Relative diff (rdiff) cp_out vs out: 0.00182342529296875
WQ (query proj) grad comparison (CP vs Non-CP):
Absolute diff (adiff) cp_wq_grad vs wq_grad: 0.0078125
Relative diff (rdiff) cp_wq_grad vs wq_grad: 0.00347900390625
WK (key proj) grad comparison (CP vs Non-CP):
Absolute diff (adiff) cp_wk_grad vs wk_grad: 0.0078125
Relative diff (rdiff) cp_wk_grad vs wk_grad: 0.002471923828125
WV (value proj) grad comparison (CP vs Non-CP):
Absolute diff (adiff) cp_wv_grad vs wv_grad: 0.25
Relative diff (rdiff) cp_wv_grad vs wv_grad: 0.0069580078125
==============================================================
```
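For reference, one plausible way to compute the adiff/rdiff figures above (a hypothetical helper using worst-case norms; not necessarily how the PR's test under `tests/examples/speculative_decoding` defines them):

```python
import torch

def report_diff(name: str, cp_t: torch.Tensor, ref: torch.Tensor) -> None:
    """Print worst-case absolute and relative diffs between CP and non-CP tensors."""
    adiff = (cp_t - ref).abs().max().item()
    # Clamp the denominator to avoid division by near-zero reference entries.
    rdiff = ((cp_t - ref).abs() / ref.abs().clamp_min(1e-6)).max().item()
    print(f"Absolute diff (adiff) {name}: {adiff}")
    print(f"Relative diff (rdiff) {name}: {rdiff}")

# e.g. report_diff("cp_out vs out", cp_out, out)
```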
- E2E training accuracy (Llama3.1-8B, unsynthesized Magpie)

<img width="911" height="630" alt="E2E training accuracy comparison" src="https://github.com/user-attachments/assets/1ecacc7f-c720-494c-9c1b-b60e7ced7baa" />
- Peak memory usage (Llama3.1-8B, 8xH100, train_length=4k)

| cp_size | max_memory_allocated (MB) | max_memory_reserved (MB) |
|---------|---------------------------|--------------------------|
| 1       | 65040.20                  | 79018.00                 |
| 2       | 50409.17                  | 73098.00                 |
| 4       | 45120.92                  | 72052.00                 |
| 8       | 38882.12                  | 66484.00                 |
- Max training length test (Llama3.1-8B, H100)

| cp_size | 6k | 12k | 24k | 48k |
|---------|----|-----|-----|-----|
| 1       | ✅ | OOM | OOM | OOM |
| 2       | ✅ | ✅  | OOM | OOM |
| 4       | ✅ | ✅  | ✅  | OOM |
| 8       | ✅ | ✅  | ✅  | ✅  |
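As context for the FSDP2 move above: FSDP2 replaces the FSDP1 `FullyShardedDataParallel` wrapper (and the removed `fsdp_wrap_layer` knob) with composable per-module `fully_shard` calls. A minimal sketch, with the model, layer count, and mesh layout assumed for illustration:

```python
# Illustrative FSDP2 sketch; the stand-in model and mesh layout are assumptions.
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.device_mesh import init_device_mesh
from torch.distributed.fsdp import fully_shard  # FSDP2 entry point

dist.init_process_group("nccl")
torch.cuda.set_device(dist.get_rank())
mesh = init_device_mesh("cuda", (dist.get_world_size(),), mesh_dim_names=("dp",))

# Stand-in for the target LM; width and depth are illustrative only.
blocks = nn.ModuleList(
    [nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True) for _ in range(4)]
)
model = nn.Sequential(*blocks).cuda()

for block in blocks:              # shard each transformer block individually
    fully_shard(block, mesh=mesh)
fully_shard(model, mesh=mesh)     # root call groups the remaining parameters
```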
- **Make sure you read and follow [Contributor guidelines](https://github.com/NVIDIA/Model-Optimizer/blob/main/CONTRIBUTING.md)** and your commits are signed.
- **Is this change backward compatible?**: Yes/No
- **Did you write any new necessary tests?**: Yes/No
- **Did you add or update any necessary documentation?**: Yes/No
- **Did you update [Changelog](https://github.com/NVIDIA/Model-Optimizer/blob/main/CHANGELOG.rst)?**: Yes/No
* **New Features**
  * Added context parallelism (CP) and data parallelism shard-size configuration parameters to the training arguments.
* **Enhancements**
  * Improved TTT attention masking support for speculative decoding workflows.
  * Enhanced the training launch script with improved parallelism configuration handling.
* **Chores**
  * Updated core dependencies: torch, transformers, accelerate, and wandb.
  * Added an FSDP configuration file for distributed training setup.
---------
Signed-off-by: h-guo18 <67671475+h-guo18@users.noreply.github.com>

1 parent 0e706a8 · commit 01b767f
12 files changed: 466 additions & 77 deletions

File tree:
- .github/workflows
- examples/speculative_decoding
- modelopt/torch/speculative
  - plugins
- tests/examples/speculative_decoding