Commit c0bc069

Authored by abrichr and claude
fix: lower PyTorch minimum to 2.8.0 for vLLM compatibility (#53)
vLLM 0.11.0 pins torch==2.8.0. The GPU E2E validation (openadapt-evals PR #87) confirmed that the full ML stack works with PyTorch 2.8.0+cu128. The previous >=2.9.1 constraint prevented installing openadapt-ml alongside vLLM in the same environment.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Parent: 69412fe

File tree

1 file changed (+1, -1)


pyproject.toml (1 addition, 1 deletion)

@@ -35,7 +35,7 @@ dependencies = [
     "pydantic-settings>=2.0.0",
     "pytest>=9.0.2",
     "pyyaml>=6.0.3",
-    "torch>=2.9.1",
+    "torch>=2.8.0",
     "torchvision>=0.24.1",
     "transformers>=4.57.3",
 ]
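The conflict the commit fixes can be illustrated with a minimal version check. This is only a sketch: real pip resolution follows PEP 440 (normally via the third-party `packaging` library), and `meets_minimum` is a hypothetical helper, not part of openadapt-ml.

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Return True if a dotted version string satisfies a >= constraint.

    Simplified numeric comparison; real resolvers handle pre-releases,
    local version segments, etc. per PEP 440.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

# vLLM 0.11.0 pins torch==2.8.0, so torch 2.8.0 is what gets installed:
print(meets_minimum("2.8.0", "2.9.1"))  # False -> violates the old torch>=2.9.1 constraint
print(meets_minimum("2.8.0", "2.8.0"))  # True  -> satisfies the new torch>=2.8.0 constraint
```

Lowering the floor to 2.8.0 therefore lets pip find a single torch version that satisfies both openadapt-ml and vLLM in one environment.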
