1 parent b5c66f4 commit 8a4e2d7
README.md
@@ -6,7 +6,7 @@ Pre-built wheels for llama-cpp-python across platforms and CUDA versions.
| File | OS | Python | CUDA | Driver | GPU Support | Size |
|------|-----|--------|------|--------|-------------|------|
-| [llama_cpp_python-0.3.16+cuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl](../../releases) | Windows 10/11 | 3.13 | 13.0 | 580+ | RTX 30 series (Ampere, sm_86) | 61.4 MB |
+| [llama_cpp_python-0.3.16+cuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl](https://github.com/DougRahden/llama-cpp-python-wheels/releases/download/v0.3.16-cuda13.0-py313/llama_cpp_python-0.3.16%2Bcuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl) | Windows 10/11 | 3.13 | 13.0 | 580+ | RTX 30 series (Ampere, sm_86) | 61.4 MB |
## Installation
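The installation step itself is truncated here, but given the release URL in the table above, installing the wheel would look something like the following. This is a sketch, not taken verbatim from this README: it assumes Python 3.13 on 64-bit Windows, an RTX 30-series GPU, and an NVIDIA driver at version 580 or newer, per the table's compatibility columns.

```shell
# Install the pre-built CUDA 13.0 wheel directly from the GitHub release
# (URL copied from the table above; %2B is the URL-encoded "+" in the version tag).
pip install "https://github.com/DougRahden/llama-cpp-python-wheels/releases/download/v0.3.16-cuda13.0-py313/llama_cpp_python-0.3.16%2Bcuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl"
```

Installing from a URL avoids compiling llama-cpp-python from source, which otherwise requires a local CUDA toolkit and C++ build tools.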