Commit c0bc069
fix: lower PyTorch minimum to 2.8.0 for vLLM compatibility (#53)
vLLM 0.11.0 pins torch==2.8.0. The GPU E2E validation (openadapt-evals
PR #87) confirmed the full ML stack works with PyTorch 2.8.0+cu128.
The previous >=2.9.1 constraint prevented installing openadapt-ml
alongside vLLM in the same environment.
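The one-line change relaxes the dependency floor. A minimal sketch of what the constraint change likely looks like, assuming it lives in the `[project]` dependencies of a `pyproject.toml` (the exact file and surrounding entries are not shown in this commit view):

```toml
[project]
name = "openadapt-ml"
dependencies = [
    # Before: "torch>=2.9.1" blocked co-installation with vLLM,
    # because vLLM 0.11.0 pins torch==2.8.0 exactly.
    # After: lower the floor so pip can resolve both packages.
    "torch>=2.8.0",
]
```

With the relaxed floor, `pip install openadapt-ml vllm` can resolve to `torch==2.8.0`, the only version vLLM 0.11.0 accepts.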
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
1 file changed, 1 insertion(+), 1 deletion(-)