Commit 51768b2
★★★ research: R41 Qwen3-Next architecture — correct hypothesis from refs + web
User called out that I'd been relying on empirical ablation without
understanding the model. Switched to research mode.
Key findings (from refs/llama.cpp/src/models/qwen35moe.cpp + qwen3next.cpp
+ vLLM blog + Qwen papers + Gated Attention NeurIPS 2025):
1. Qwen3-Next = 75% DeltaNet + 25% attention (3:1 interleave), not 1:1. For a
40-layer 35B that leaves only 10 attention layers.
2. Gated Attention (NeurIPS 2025, Qwen team paper) is THE long-context
stabilizer — head-wise sigmoid(gate) on the SDPA output REPLACES attention
sinks. If our engine doesn't apply this gate correctly on those 10
attention layers, long-gen drift is architecturally predicted.
The single Q projection outputs 2× dim (Q + gate); post-attn, the output
is multiplied by sigmoid(gate). See qwen35moe.cpp:156, 186-189 and the
first sketch after this list.
3. Instruct vs Thinking are DIFFERENT checkpoints requiring DIFFERENT
chat templates. Instruct must NOT be primed with <think>. Ours always
primes an empty <think>\\n\\n</think>\\n\\n — potentially OOD if our
35B is the Instruct variant (see the template sketch after this list).
4. Gated DeltaNet's known failure modes (ICLR 2025 paper): α saturation
at ~1.0 and the compression bottleneck of its fixed-size state (a toy
recurrence after this list illustrates the concern).
5. DRY sampler (oobabooga PR #5677) is the community-standard mitigation
for hybrid-model loop collapse — we don't have it (simplified sketch
below).
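
Sketch for finding 2 / H1: a minimal PyTorch version of the output-gate path,
for comparison against our self_attn_forward. All names and shapes here
(q_proj/k_proj/v_proj/o_proj, no GQA, no q/k-norm, no RoPE) are illustrative
assumptions, not the qwen35moe.cpp reference; only the gate placement matters.

```python
import torch
import torch.nn.functional as F

def gated_attn_forward(x, q_proj, k_proj, v_proj, o_proj, num_heads, head_dim):
    # x: (B, T, hidden). RoPE, q/k-norm and GQA omitted to isolate the gate.
    B, T, _ = x.shape
    # The single Q projection is 2x the usual width: queries + per-head gate.
    q_and_gate = q_proj(x).view(B, T, num_heads, 2 * head_dim)
    q, gate = q_and_gate.split(head_dim, dim=-1)            # each (B, T, H, D)
    k = k_proj(x).view(B, T, num_heads, head_dim)
    v = v_proj(x).view(B, T, num_heads, head_dim)

    attn = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2),
        is_causal=True)                                      # (B, H, T, D)

    # The step under suspicion (H1): sigmoid gate applied per head to the
    # SDPA output before the output projection, in place of attention sinks.
    gated = attn.transpose(1, 2) * torch.sigmoid(gate)       # (B, T, H, D)
    return o_proj(gated.reshape(B, T, num_heads * head_dim))
```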
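
Sketch for finding 3 / H2: the literal strings below are illustrative
stand-ins, not the checkpoints' actual Jinja templates (those live in each
model's tokenizer config); the only point is the presence or absence of the
empty <think> priming.

```python
def render_prompt(user_msg: str, prime_empty_think: bool) -> str:
    prompt = (
        "<|im_start|>user\n" + user_msg + "<|im_end|>\n"
        "<|im_start|>assistant\n"
    )
    if prime_empty_think:
        # What our engine does unconditionally today: pre-fill an empty
        # reasoning block. Plausible for a Thinking checkpoint, but likely
        # out-of-distribution if the 35B weights are the Instruct variant.
        prompt += "<think>\n\n</think>\n\n"
    return prompt

print(render_prompt("hi", prime_empty_think=True))   # what we send now
print(render_prompt("hi", prime_empty_think=False))  # what Instruct expects (H2)
```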
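
Sketch for finding 4 / H3: a toy, single-head reading of the gated delta rule
to make the α-saturation worry concrete. The parameterization (scalar α/β,
unit-norm keys, no chunking or normalization) is an assumption for
illustration, not our kernel or the paper's kernel; only the recurrence shape
is the point.

```python
import torch

def gated_delta_step(S, k, v, alpha, beta):
    # S: (d_v, d_k) fixed-size state, k: (d_k,) unit-norm key, v: (d_v,) value.
    # S <- alpha * S (I - beta k k^T) + beta v k^T
    # alpha is the forget gate; beta is the delta-rule write strength.
    erased = S - beta * torch.outer(S @ k, k)   # erase old association along k
    return alpha * erased + beta * torch.outer(v, k)

d_k = d_v = 8
S = torch.zeros(d_v, d_k)
for _ in range(1000):
    k = torch.nn.functional.normalize(torch.randn(d_k), dim=0)
    v = torch.randn(d_v)
    # alpha pinned near 1.0 (the saturation regime): nothing ever decays, so
    # every past write keeps competing inside the same fixed-size matrix,
    # which is the compression bottleneck finding 4 refers to.
    S = gated_delta_step(S, k, v, alpha=0.999, beta=0.9)
print(S.norm())
```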
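
Sketch for finding 5 / R44: a simplified version of the DRY idea to scope the
port. It ignores the PR's sequence breakers and optimized suffix matching,
uses a quadratic brute-force scan, and the default values are illustrative,
not necessarily the PR's defaults.

```python
def dry_penalties(context, vocab_size, multiplier=0.8, base=1.75,
                  allowed_length=2):
    """Penalty per candidate next token for extending a verbatim repeat.

    context: list[int] of token ids generated so far.
    Returns non-negative penalties to subtract from the logits before softmax.
    """
    penalties = [0.0] * vocab_size
    n = len(context)
    for i in range(1, n):
        # Length of the match between the text ending at i-1 and the text
        # ending at n-1. If it is long enough, emitting context[i] next would
        # extend a repeat of that sequence, so penalize it exponentially.
        m = 0
        while m < i and context[i - 1 - m] == context[n - 1 - m]:
            m += 1
        if m >= allowed_length:
            tok = context[i]
            pen = multiplier * base ** (m - allowed_length)
            penalties[tok] = max(penalties[tok], pen)
    return penalties
```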
New hypotheses, ranked:
H1: attn_output_gate missing/buggy in our self_attn_forward
H2: chat template mismatch for Instruct variant
H3: DeltaNet α saturation (attacked in R26-R29, never verified beyond 200 tok)
Plan: audit attn_output_gate line-by-line vs qwen35moe.cpp:129-189 (R42),
fix the confirmed bug (R43), port the DRY sampler as a safety net (R44).
Methodology lesson: reference > introspection. Research is cheaper than
15+ empirical rounds that leave us uncertain.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>