Commit 2d868d3
Performant layerwise calibration for large models (#1251)
## Summary
Adds **performant layerwise calibration** for quantizing large models
(e.g. DeepSeek-R1 671B) that don't fit entirely on GPU. ([Example
commands](#example-commands))
1. **Performant calibration for large models** — Each decoder layer is
moved from CPU/disk to GPU (accelerate) or unsharded (FSDP2) **only
once** and kept on GPU for the entire calibration step. Previously,
every calibration batch triggered a weight transfer for every layer,
i.e. O(num_batches) weight movements per layer; now it is O(1) per
layer (see the sketch after this list). Since only one layer's weights
occupy the GPU at a time, you can also **increase the batch size** —
e.g. DeepSeek-R1 on a single node (8×80GB) with `batch_size=16` and
`gpu_max_mem_percentage=0.5`.
2. **Checkpoint save/resume** — Saves progress after each layer, so jobs
that exceed cluster time limits (e.g. 4-hour Slurm windows for 100+
layer MoE models) can resume from the last completed layer.
3. **Rename** `sequential_calibrate` → `layerwise_calibrate` for
clarity.
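
To make the O(num_batches) → O(1) contrast concrete, here is a minimal
sketch of the new per-layer loop. The names (`calibrate_layerwise`,
`layers`, `captured_inputs`) are assumptions for illustration, as is the
premise that each layer's calibration inputs were captured during the
previous step; this is not the ModelOpt API.

```python
import torch

def calibrate_layerwise(layers, captured_inputs, device="cuda"):
    """Sketch of O(1) weight movement per layer (hypothetical API).

    Old behavior: every calibration batch ran a full forward pass, and the
    offloading framework moved every layer's weights for every batch.
    New behavior: each layer is materialized on GPU once, sees all batches,
    then is offloaded once.
    """
    for idx, layer in enumerate(layers):
        layer.to(device)  # one upload per layer for the whole step
        with torch.no_grad():
            for batch in captured_inputs[idx]:  # inputs captured last step
                layer(batch.to(device))  # quantizer observers accumulate stats
        layer.to("cpu")  # one offload per layer
```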
### Design details
The existing layerwise state machine (skip/run/capture) already
processes one layer at a time, but skip-mode layers still keep their
parameters in the ModuleList, so frameworks transfer all weights on
every forward pass. This PR adds:
- **`_SkipLayer`**: replaces fully calibrated layers with a
parameter-free dummy in the ModuleList, so framework hooks have nothing
to transfer (see the sketch after this list)
- **`persistent_materialization`**: keeps the active layer on GPU for
the entire calibration step, avoiding repeated offload/reload cycles
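
A minimal sketch of the `_SkipLayer` idea, using assumed names
(`_SkipLayerSketch`, `swap_in_skip_layer`); the real implementation
lives in this PR's quantization plugins and may differ:

```python
import torch.nn as nn

class _SkipLayerSketch(nn.Module):
    """Hypothetical stand-in for a fully calibrated layer.

    Holding no parameters or buffers, it gives accelerate/FSDP2 hooks
    nothing to transfer when the ModuleList is traversed.
    """

    def forward(self, *args, **kwargs):
        # In this sketch the forward pass is assumed to start from inputs
        # captured at the active layer, so a skipped layer never executes.
        raise RuntimeError("_SkipLayerSketch should never be called")

def swap_in_skip_layer(layer_list: nn.ModuleList, idx: int) -> nn.Module:
    """Replace a calibrated layer with the dummy; return the real layer
    so its quantized weights can be kept off-GPU until final restore."""
    real_layer = layer_list[idx]
    layer_list[idx] = _SkipLayerSketch()
    return real_layer
```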
Checkpoint save is per-layer; restore is bulk — quantizer state and
weights for layers 0..K-1 are restored once at the end of calibration,
keeping the hot path fast.
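
A hedged sketch of that per-layer save / bulk restore split. The file
layout (`layer_{idx}.pt`), the `completed_through` key, and both
function names are assumptions for illustration:

```python
import torch
import torch.nn as nn

def save_layer_checkpoint(ckpt_dir: str, idx: int, layer: nn.Module) -> None:
    # Hot path: after a layer finishes calibration, write only that layer's
    # quantizer state and weights plus a progress marker.
    torch.save(
        {"completed_through": idx, "state_dict": layer.state_dict()},
        f"{ckpt_dir}/layer_{idx}.pt",
    )

def bulk_restore(ckpt_dir: str, layers: nn.ModuleList, resume_from: int) -> None:
    # Off the hot path: restore all previously completed layers (0..K-1)
    # in one pass at the end of calibration.
    for idx in range(resume_from):
        ckpt = torch.load(f"{ckpt_dir}/layer_{idx}.pt", map_location="cpu")
        layers[idx].load_state_dict(ckpt["state_dict"])
```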
### Example commands
**Qwen3-8B** (NVFP4+GPTQ, single GPU):
```bash
python hf_ptq.py \
--pyt_ckpt_path Qwen/Qwen3-8B \
--recipe nvfp4_gptq_sequential.yaml \
--calib_size 64 \
--batch_size 16 \
--dataset cnn_dailymail \
--export_path outputs/qwen3_8b_nvfp4_gptq_seq \
--gpu_max_mem_percentage 0.5 \
--use_seq_device_map \
--vllm_fakequant_export
```
**DeepSeek-R1** (NVFP4 experts-only + FP8 KV, 8×80GB):
```bash
python hf_ptq.py \
--model unsloth/DeepSeek-R1-0528-BF16 \
--recipe ../../modelopt_recipes/general/ptq/nvfp4_experts_only-fp8_kv.yaml \
--dataset cnn_dailymail \
--batch_size 16 \
--calib_size 64 \
--calib_seq 512 \
--gpu_max_mem_percentage 0.5 \
--use_seq_device_map \
--trust_remote_code \
--export_path output/DeepSeek-R1-BF16-nvfp4-experts-only-fp8-kv \
--vllm_fakequant_export
```
### Example: NVFP4+GPTQ layerwise calibration on Qwen3-8B (36 layers,
single GPU — 20 GB peak)
**Initial run** (killed after layer 11):
```
Layerwise calibration: Found 36 transformer layers
Calibrating layer 1/36 | capture: [1]
Computing Hessians for 7 linear layers...
GPTQ time: 51.39s
Calibrating layer 2/36 | run: [1] | capture: [2]
Checkpoint: saved layer 0
GPTQ time: 50.06s
Calibrating layer 3/36 | skip: 1 | run: [2] | capture: [3]
Checkpoint: saved layer 1
...
Calibrating layer 12/36 | skip: 10 | run: [11] | capture: [12]
Checkpoint: saved layer 10
<killed>
```
**Resumed run** (picks up from layer 11, finishes all 36):
```
Layerwise calibration: Found 36 transformer layers
Checkpoint: resuming layerwise calibration from layer 11/36
Calibrating layer 12 (resumed)
GPTQ time: 51.45s
Calibrating layer 13/36 | skip: 11 | run: [12] | capture: [13]
Checkpoint: saved layer 11
...
Calibrating layer 36/36 | skip: 34 | run: [35] | capture: [36]
Checkpoint: saved layer 34
GPTQ time: 50.33s
Checkpoint: saved layer 35 (final)
Checkpoint: restored 11 previously calibrated layers
Layerwise calibration completed
Quantized model exported to: outputs/qwen3_8b_nvfp4_gptq_seq
GPU 0: Peak memory usage = 20.42 GB
```
## TODO
- [ ] Update CHANGELOG
## Test plan
- `tests/unit/torch/quantization/test_layerwise_calibrate.py` — unit
tests for skip/swap/restore
- `tests/unit/torch/quantization/test_sequential_checkpoint.py` —
checkpoint save/resume correctness
- `tests/gpu/torch/quantization/plugins/test_accelerate_gpu.py` —
CPU-offloaded layerwise + GPTQ + checkpoint resume
- `tests/gpu/torch/quantization/test_fsdp2.py` — FSDP2 layerwise
calibration
### Verified
- [x] Qwen3-8B: layerwise calibration + checkpoint save/restore +
fakequantized checkpoint export + vLLM serve
- [x] DeepSeek-R1: checkpoint resume tested
- [x] DeepSeek-R1: fakequantized checkpoint export verified
---------
Signed-off-by: realAsma <akuriparambi@nvidia.com>
File tree (29 files changed: +2467, −582)
- examples/llm_ptq
- modelopt_recipes/general/ptq
- modelopt/torch
  - export/plugins
  - quantization
    - plugins
    - utils
  - utils
- tests
  - gpu/torch
    - export
    - quantization
      - plugins
  - unit/torch/quantization
    - plugins