---
title: "pip install fails during template build"
sidebarTitle: pip install fails during template build
---

When building a template that installs large Python packages like PyTorch with CUDA dependencies, `pip install` can fail with one of two errors:

- **`MemoryError`** — pip tries to serialize large wheel files into memory for caching, exceeding available RAM.
- **`OSError: [Errno 28] No space left on device`** — downloaded wheels fill up the `/tmp` directory.

**Example errors**

```txt
File "pip/_vendor/cachecontrol/serialize.py", line 70, in dumps
    return b",".join([b"cc=4", msgpack.dumps(data, use_bin_type=True)])
MemoryError
```

```txt
ERROR: Could not install packages due to an OSError: [Errno 28] No space left on device
```
| 22 | + |
## Cause

The build environment mounts `/tmp` as a [tmpfs](https://www.kernel.org/doc/html/latest/filesystems/tmpfs.html) — a RAM-backed filesystem capped at ~3.9 GB. pip downloads all wheels to `/tmp/pip-*` before installing them. PyTorch with CUDA dependencies totals ~4.1 GB of downloads, which exceeds this limit.
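
You can check this arithmetic yourself by comparing an expected download volume against the free space on `/tmp`. A minimal sketch (`fits_in_tmp` is a hypothetical helper, not part of any SDK):

```python
import shutil

def fits_in_tmp(download_bytes: int, path: str = "/tmp") -> bool:
    """Return True if the expected pip download volume fits in the
    filesystem backing `path` (a ~3.9 GB RAM-backed tmpfs in the
    build environment)."""
    return download_bytes <= shutil.disk_usage(path).free

# PyTorch with CUDA dependencies downloads roughly 4.1 GB of wheels,
# which exceeds the ~3.9 GB tmpfs cap.
cuda_download = int(4.1 * 1024**3)
print(fits_in_tmp(cuda_download))
```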

## Solution 1: Redirect pip's temp directory to disk (recommended)

Set the `TMPDIR` environment variable to a disk-backed path so pip downloads don't go through the RAM-backed `/tmp`. Combined with `--no-cache-dir`, this avoids both the disk-space and memory issues.

<CodeGroup>
```typescript JavaScript & TypeScript
const template = Template()
  .runCmd(
    'TMPDIR=/var/tmp pip install --no-cache-dir torch sentence-transformers',
    { user: 'root' }
  )
```
```python Python
template = (
    Template()
    .run_cmd(
        "TMPDIR=/var/tmp pip install --no-cache-dir torch sentence-transformers",
        user="root",
    )
)
```
</CodeGroup>

## Solution 2: Install CPU-only PyTorch

E2B sandboxes don't have GPUs, so there's no reason to download CUDA dependencies. Installing the CPU-only variant of PyTorch reduces the download from ~4.1 GB to ~189 MB, avoiding the `/tmp` size limit entirely.

<CodeGroup>
```typescript JavaScript & TypeScript
const template = Template()
  .runCmd(
    'pip install --no-cache-dir torch --index-url https://download.pytorch.org/whl/cpu',
    { user: 'root' }
  )
  .runCmd(
    'echo "torch" > /tmp/constraints.txt && pip install --no-cache-dir -c /tmp/constraints.txt sentence-transformers',
    { user: 'root' }
  )
```
```python Python
template = (
    Template()
    .run_cmd(
        "pip install --no-cache-dir torch --index-url https://download.pytorch.org/whl/cpu",
        user="root",
    )
    .run_cmd(
        'echo "torch" > /tmp/constraints.txt && pip install --no-cache-dir -c /tmp/constraints.txt sentence-transformers',
        user="root",
    )
)
```
</CodeGroup>

The constraints file in the second step prevents pip from replacing the CPU-only torch with the CUDA version when installing packages that depend on PyTorch.
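
A stricter variant of that constraints file pins torch to the exact version already installed, so the resolver cannot pick a different build at all. A sketch under those assumptions (`write_pinned_constraints` is a hypothetical helper; the demo pins `pip` itself as a stand-in for `torch`):

```python
from importlib import metadata
from pathlib import Path

def write_pinned_constraints(packages: list[str], path: str) -> list[str]:
    """Write `name==<installed version>` lines for each package, so a
    later `pip install -c <path> ...` cannot swap in another build."""
    lines = [f"{name}=={metadata.version(name)}" for name in packages]
    Path(path).write_text("\n".join(lines) + "\n")
    return lines

# Demo with `pip` as a stand-in, since torch may not be installed locally.
print(write_pinned_constraints(["pip"], "/tmp/constraints.txt"))
```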