# llama-cpp-python-wheels

Pre-built wheels for llama-cpp-python across platforms and CUDA versions.

## Available Wheels

| File | OS | Python | CUDA | Driver | GPU Support | Size |
|------|----|--------|------|--------|-------------|------|
| [llama_cpp_python-0.3.16+cuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl](../../releases) | Windows 10/11 | 3.13 | 13.0 | 580+ | RTX 30 series (Ampere, sm_86) | 61.4 MB |

## Installation

Download the wheel from [Releases](../../releases) and install it:

```bash
pip install llama_cpp_python-0.3.16+cuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl
```
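Since the wheel is installed by filename, it can help to check that its tags match your interpreter before running pip. A minimal sketch (the filename is the one from the table above; wheel filenames follow the standard `{name}-{version}-{python tag}-{abi tag}-{platform tag}.whl` layout):

```python
import sys

# The wheel filename from the table above, split into its standard tags:
# {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
wheel = "llama_cpp_python-0.3.16+cuda13.0.sm86.ampere-cp313-cp313-win_amd64.whl"
name, version, py_tag, abi_tag, plat_tag = wheel[: -len(".whl")].split("-")

# Compare the wheel's Python tag against the running interpreter.
this_py = f"cp{sys.version_info.major}{sys.version_info.minor}"
if py_tag != this_py:
    print(f"wheel targets {py_tag}, but this interpreter is {this_py}")
else:
    print(f"{name} {version} matches this interpreter ({this_py})")
```

If the tags do not match (for example `cp313` vs. a Python 3.12 interpreter), pip will refuse to install the wheel.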

## Verification

```python
from llama_cpp import Llama

print("llama-cpp-python with CUDA support installed successfully")
```
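The import check above confirms the package loads, but not that the CUDA backend is actually used. To verify GPU offload, you can load a model with all layers offloaded and watch the load-time log; this is a sketch assuming a hypothetical local GGUF file:

```python
from llama_cpp import Llama

# Hypothetical path to a local GGUF model; substitute your own.
# n_gpu_layers=-1 offloads all layers to the GPU, and verbose=True prints
# backend/device information at load time, which should mention CUDA.
llm = Llama(model_path="./models/model.gguf", n_gpu_layers=-1, verbose=True)
```

If the CPU-only build was installed by mistake, the verbose output will show no CUDA device and inference will run noticeably slower.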

## Build Notes

Built with:

- Visual Studio 2022 Build Tools
- CUDA Toolkit 13.0
- CMAKE_CUDA_ARCHITECTURES=86
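The settings above correspond roughly to the following build commands. This is a sketch for a VS 2022 x64 Native Tools prompt with CUDA Toolkit 13.0 on PATH; `CMAKE_ARGS` is the environment variable llama-cpp-python's build reads, and `-DGGML_CUDA=on` enables the CUDA backend:

```shell
REM Sketch: build the wheel from source rather than installing a binary.
REM Run from a VS 2022 x64 Native Tools prompt with CUDA 13.0 installed.
set CMAKE_ARGS=-DGGML_CUDA=on -DCMAKE_CUDA_ARCHITECTURES=86
pip wheel llama-cpp-python==0.3.16 --no-binary llama-cpp-python -w dist
```

Restricting `CMAKE_CUDA_ARCHITECTURES` to `86` keeps the wheel small but limits it to Ampere (sm_86) GPUs such as the RTX 30 series, matching the table above.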

## License

MIT

Wheels are built from [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) (MIT License).