
Commit daaedaa

Authored by ericspod, pre-commit-ci[bot], and coderabbitai[bot]
Docker Slim Image (#8640)
### Description

This is an attempt to create a slim Docker image which is smaller than the current one, to avoid running out of space during testing. Various fixes have been included to account for test failures within the image. These all appear to be real issues that need to be addressed (e.g. ONNX export) or fixes that should be integrated either way.

~~This excludes PyTorch 2.9 from the requirements for now to avoid legacy issues with ONNX, Torchscript, and other things. MONAI needs to be updated for PyTorch 2.9 support, specifically dropping the use of Torchscript in places as it's becoming obsolete in place of `torch.export`.~~

Some tests fail without enough shared memory. To test with GPUs 0 and 1, the command I'm using is:
`docker run -ti --rm --gpus '"device=0,1"' --shm-size=10gb -v $(pwd)/tests:/opt/monai/tests monai_slim /bin/bash`

### Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [x] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

---------

Signed-off-by: Eric Kerfoot <eric.kerfoot@kcl.ac.uk>
Signed-off-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
Signed-off-by: Eric Kerfoot <eric.kerfoot@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
1 parent 366df44 commit daaedaa

File tree: 12 files changed (+147 / -40 lines)


.dockerignore

Lines changed: 5 additions & 1 deletion
```diff
@@ -3,11 +3,15 @@
 __pycache__/
 docs/

+.vscode
+.git
+.mypy_cache
+.ruff_cache
+.pytype
 .coverage
 .coverage.*
 .coverage/
 coverage.xml
 .readthedocs.yml
-*.toml

 !README.md
```

CONTRIBUTING.md

Lines changed: 2 additions & 1 deletion
```diff
@@ -380,7 +380,8 @@ All code review comments should be specific, constructive, and actionable.

 ### Release a new version

-The `dev` branch's `HEAD` always corresponds to MONAI docker image's latest tag: `projectmonai/monai:latest`.
+The `dev` branch's `HEAD` always corresponds to MONAI Docker image's latest tag: `projectmonai/monai:latest`. (No
+release is currently done for the slim MONAI image, this is built locally by users.)
 The `main` branch's `HEAD` always corresponds to the latest MONAI milestone release.

 When major features are ready for a milestone, to prepare for a new release:
```

Dockerfile.slim

Lines changed: 93 additions & 0 deletions
```dockerfile
# Copyright (c) MONAI Consortium
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# This is a slimmed down version of the MONAI Docker image using a smaller base image and multi-stage building. Not all
# NVIDIA tools will be present but all libraries and compiled code are included. This image isn't provided through
# Dockerhub so users must build locally: `docker build -t monai_slim -f Dockerfile.slim .`
# Containers may require more shared memory, eg.: `docker run -ti --rm --gpus all --shm-size=10gb monai_slim /bin/bash`

ARG IMAGE=debian:12-slim

FROM ${IMAGE} AS build

ARG TORCH_CUDA_ARCH_LIST="7.5 8.0 8.6 8.9 9.0+PTX"

ENV DEBIAN_FRONTEND=noninteractive
ENV APT_INSTALL="apt install -y --no-install-recommends"

RUN apt update && apt upgrade -y && \
    ${APT_INSTALL} ca-certificates python3-pip python-is-python3 git wget libopenslide0 unzip python3-dev && \
    wget https://developer.download.nvidia.com/compute/cuda/repos/debian12/x86_64/cuda-keyring_1.1-1_all.deb && \
    dpkg -i cuda-keyring_1.1-1_all.deb && \
    apt update && \
    ${APT_INSTALL} cuda-toolkit-12 && \
    rm -rf /usr/lib/python*/EXTERNALLY-MANAGED /var/lib/apt/lists/* && \
    python -m pip install --upgrade --no-cache-dir --no-build-isolation pip

# TODO: remark for issue [revise the dockerfile](https://github.com/zarr-developers/numcodecs/issues/431)
RUN if [[ $(uname -m) =~ "aarch64" ]]; then \
        CFLAGS="-O3" DISABLE_NUMCODECS_SSE2=true DISABLE_NUMCODECS_AVX2=true python -m pip install numcodecs; \
    fi

# NGC Client
WORKDIR /opt/tools
ARG NGC_CLI_URI="https://ngc.nvidia.com/downloads/ngccli_linux.zip"
RUN wget -q ${NGC_CLI_URI} && unzip ngccli_linux.zip && chmod u+x ngc-cli/ngc && \
    find ngc-cli/ -type f -exec md5sum {} + | LC_ALL=C sort | md5sum -c ngc-cli.md5 && \
    rm -rf ngccli_linux.zip ngc-cli.md5

WORKDIR /opt/monai

# copy relevant parts of repo
COPY requirements.txt requirements-min.txt requirements-dev.txt versioneer.py setup.py setup.cfg pyproject.toml ./
COPY LICENSE CHANGELOG.md CODE_OF_CONDUCT.md CONTRIBUTING.md README.md MANIFEST.in runtests.sh ./
COPY tests ./tests
COPY monai ./monai

# install full deps
RUN python -m pip install --no-cache-dir --no-build-isolation -r requirements-dev.txt

# compile ext
RUN CUDA_HOME=/usr/local/cuda FORCE_CUDA=1 USE_COMPILED=1 BUILD_MONAI=1 python setup.py develop

# recreate the image without the installed CUDA packages then copy the installed MONAI and Python directories
FROM ${IMAGE} AS build2

ENV DEBIAN_FRONTEND=noninteractive
ENV APT_INSTALL="apt install -y --no-install-recommends"

RUN apt update && apt upgrade -y && \
    ${APT_INSTALL} ca-certificates python3-pip python-is-python3 git libopenslide0 && \
    apt clean && \
    rm -rf /usr/lib/python*/EXTERNALLY-MANAGED /var/lib/apt/lists/* && \
    python -m pip install --upgrade --no-cache-dir --no-build-isolation pip

COPY --from=build /opt/monai /opt/monai
COPY --from=build /opt/tools /opt/tools
ARG PYTHON_VERSION=3.11
COPY --from=build /usr/local/lib/python${PYTHON_VERSION}/dist-packages /usr/local/lib/python${PYTHON_VERSION}/dist-packages
COPY --from=build /usr/local/bin /usr/local/bin

RUN rm -rf /opt/monai/build /opt/monai/monai.egg-info && \
    find /opt /usr/local/lib -type d -name __pycache__ -exec rm -rf {} +

# flatten all layers down to one
FROM ${IMAGE}
LABEL maintainer="monai.contact@gmail.com"

COPY --from=build2 / /

WORKDIR /opt/monai

ENV PATH=${PATH}:/opt/tools:/opt/tools/ngc-cli
ENV POLYGRAPHY_AUTOINSTALL_DEPS=1
ENV CUDA_HOME=/usr/local/cuda
ENV BUILD_MONAI=1
```
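The NGC client step verifies the unpacked files against the shipped `ngc-cli.md5` manifest by hashing every file, sorting the entries deterministically, and comparing. A minimal Python sketch of that hash-and-compare pattern (the file names and contents below are hypothetical, not the real NGC CLI layout):

```python
import hashlib

# Hypothetical stand-in for the unpacked ngc-cli/ directory contents.
files = {"ngc-cli/ngc": b"\x7fELF...", "ngc-cli/VERSION": b"3.64.2\n"}

def manifest_lines(file_map: dict) -> list:
    """Hash each file and sort the entries, mirroring `md5sum ... | LC_ALL=C sort`."""
    entries = [f"{hashlib.md5(data).hexdigest()}  {path}" for path, data in file_map.items()]
    return sorted(entries)

expected = manifest_lines(files)    # what a shipped manifest would record
recomputed = manifest_lines(files)  # recomputed at verification time
print(recomputed == expected)       # True when no file was corrupted in transit
```

Any tampered or truncated file changes its digest, so the recomputed manifest no longer matches and the build fails early.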

README.md

Lines changed: 8 additions & 0 deletions
```diff
@@ -61,6 +61,14 @@ Examples and notebook tutorials are located at [Project-MONAI/tutorials](https:/

 Technical documentation is available at [docs.monai.io](https://docs.monai.io).

+## Docker
+
+The MONAI Docker image is available from [Dockerhub](https://hub.docker.com/r/projectmonai/monai),
+tagged as `latest` for the latest state of `dev` or with a release version. A slimmed down image can also be built
+locally using `Dockerfile.slim`, see that file for instructions.
+
+To get started with the latest MONAI, use `docker run -ti --rm --gpus all projectmonai/monai:latest /bin/bash`.
+
 ## Citation

 If you have used MONAI in your research, please cite us! The citation can be exported from: <https://arxiv.org/abs/2211.02701>.
```

monai/apps/vista3d/inferer.py

Lines changed: 2 additions & 2 deletions
```diff
@@ -86,13 +86,13 @@ def point_based_window_inferer(
         for j in range(len(ly_)):
             for k in range(len(lz_)):
                 lx, rx, ly, ry, lz, rz = (lx_[i], rx_[i], ly_[j], ry_[j], lz_[k], rz_[k])
-                unravel_slice = [
+                unravel_slice = (
                     slice(None),
                     slice(None),
                     slice(int(lx), int(rx)),
                     slice(int(ly), int(ry)),
                     slice(int(lz), int(rz)),
-                ]
+                )
                 batch_image = image[unravel_slice]
                 output = predictor(
                     batch_image,
```
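The `[...]` to `(...)` change matters because NumPy and recent PyTorch treat a *tuple* of `slice` objects as basic slicing, while a *list* of slices was deprecated advanced indexing and is rejected by newer releases. A small NumPy illustration of the tuple form, with made-up window bounds:

```python
import numpy as np

# A toy 5D volume in the same [B, C, H, W, D] layout the inferer uses.
image = np.arange(1 * 1 * 8 * 8 * 8).reshape(1, 1, 8, 8, 8)

# A tuple of slices performs plain basic slicing on the volume.
unravel_slice = (slice(None), slice(None), slice(2, 6), slice(2, 6), slice(2, 6))
patch = image[unravel_slice]
print(patch.shape)  # (1, 1, 4, 4, 4)
```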

monai/networks/nets/vista3d.py

Lines changed: 4 additions & 8 deletions
```diff
@@ -244,14 +244,10 @@ def connected_components_combine(
         _logits = logits[mapping_index]
         inside = []
         for i in range(_logits.shape[0]):
-            inside.append(
-                np.any(
-                    [
-                        _logits[i, 0, p[0], p[1], p[2]].item() > 0
-                        for p in point_coords[i].cpu().numpy().round().astype(int)
-                    ]
-                )
-            )
+            p_coord = point_coords[i].cpu().numpy().round().astype(int)
+            inside_p = [_logits[i, 0, p[0], p[1], p[2]].item() > 0 for p in p_coord]
+            inside.append(int(np.any(inside_p)))  # convert to int to avoid typing problems with Numpy
+
         inside_tensor = torch.tensor(inside).to(logits.device)
         nan_mask = torch.isnan(_logits)
         # _logits are converted to binary [B1, 1, H, W, D]
```
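`np.any` returns a `numpy.bool_` rather than a Python `bool`, which is why the rewritten loop casts to `int` before the result is fed to `torch.tensor`. A NumPy-only sketch of the containment check, using hypothetical logit values at two sets of clicked points:

```python
import numpy as np

# Hypothetical per-point logit values for two sets of clicked points.
logits_at_points = [[-1.0, 2.0], [-3.0, -4.0]]

inside = []
for row in logits_at_points:
    any_positive = np.any([v > 0 for v in row])  # numpy.bool_, not a Python bool
    inside.append(int(any_positive))             # cast sidesteps typing issues downstream

print(inside)  # [1, 0]: only the first point set has a positive logit
```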

monai/networks/utils.py

Lines changed: 4 additions & 4 deletions
```diff
@@ -713,7 +713,7 @@ def convert_to_onnx(
     torch_versioned_kwargs = {}
     if use_trace:
         # let torch.onnx.export to trace the model.
-        mode_to_export = model
+        model_to_export = model
         torch_versioned_kwargs = kwargs
         if "dynamo" in kwargs and kwargs["dynamo"] and verify:
             torch_versioned_kwargs["verify"] = verify
@@ -726,9 +726,9 @@ def convert_to_onnx(
         # pass the raw nn.Module directly—the exporter handles it via torch.export.
         _pt_major_minor = tuple(int(x) for x in torch.__version__.split("+")[0].split(".")[:2])
         if _pt_major_minor >= (2, 9):
-            mode_to_export = model
+            model_to_export = model
         else:
-            mode_to_export = torch.jit.script(model, **kwargs)
+            model_to_export = torch.jit.script(model, **kwargs)

     if torch.is_tensor(inputs) or isinstance(inputs, dict):
         onnx_inputs = (inputs,)
@@ -741,7 +741,7 @@ def convert_to_onnx(
     else:
         f = filename
     torch.onnx.export(
-        mode_to_export,
+        model_to_export,
         onnx_inputs,
         f=f,
         input_names=input_names,
```
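Besides the `mode_to_export` typo fix, the interesting part of this hunk is the version gate: it strips any local build suffix (e.g. `+cu121`) from `torch.__version__` before comparing major/minor numbers. The parsing can be reproduced standalone (the helper name is mine, not from the diff):

```python
def pt_major_minor(version: str) -> tuple:
    # Same parsing as the diff: drop the "+cu..." suffix, keep major.minor as ints.
    return tuple(int(x) for x in version.split("+")[0].split(".")[:2])

print(pt_major_minor("2.9.0+cu121"))  # (2, 9) -> export the raw nn.Module
print(pt_major_minor("2.1.2"))        # (2, 1) -> fall back to torch.jit.script
```

Comparing integer tuples rather than version strings keeps `"2.10"` ordering correctly after `"2.9"`, which a plain string comparison would get wrong.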

requirements-dev.txt

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 # Full requirements for developments
 -r requirements-min.txt
-pytorch-ignite==0.4.11
+pytorch-ignite
 gdown>=4.7.3
 scipy>=1.12.0; python_version >= '3.9'
 itk>=5.2
```

tests/bundle/test_bundle_download.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -15,7 +15,7 @@
 import os
 import tempfile
 import unittest
-from unittest.case import skipUnless
+from unittest.case import skipIf, skipUnless
 from unittest.mock import patch

 import numpy as np
@@ -219,6 +219,7 @@ def test_monaihosting_url_download_bundle(self, bundle_files, bundle_name, url):

     @parameterized.expand([TEST_CASE_5])
     @skip_if_quick
+    @skipIf(os.getenv("NGC_API_KEY", None) is None, "NGC API key required for this test")
     def test_ngc_private_source_download_bundle(self, bundle_files, bundle_name, _url):
         with skip_if_downloading_fails():
             # download a single file from url, also use `args_file`
```
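The added decorator turns the private-bundle test into an explicit skip, rather than a failure, when no NGC credentials are present. A self-contained sketch of the same gating pattern (the test class and method below are invented for illustration):

```python
import io
import os
import unittest
from unittest.case import skipIf

class PrivateDownloadTests(unittest.TestCase):
    # Skipped with a readable reason whenever the credential is absent.
    @skipIf(os.getenv("NGC_API_KEY", None) is None, "NGC API key required for this test")
    def test_private_source(self):
        # a real test would hit the authenticated NGC endpoint here
        self.assertTrue(True)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(PrivateDownloadTests)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
print(result.wasSuccessful())  # True either way: the test passes or is skipped
```

Skips show up in the test report with their reason string, so a missing key is visible without polluting the failure count.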

tests/data/meta_tensor/test_meta_tensor.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -245,7 +245,7 @@ def test_pickling(self):
         with tempfile.TemporaryDirectory() as tmp_dir:
             fname = os.path.join(tmp_dir, "im.pt")
             torch.save(m, fname)
-            m2 = torch.load(fname, weights_only=True)
+            m2 = torch.load(fname, weights_only=False)
             self.check(m2, m, ids=False)

     @skip_if_no_cuda
```
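`weights_only=True` restricts `torch.load` to an allowlist of safe tensor-like types, so round-tripping a custom subclass such as `MetaTensor` needs `weights_only=False`. The underlying mechanism can be sketched with a plain `pickle.Unpickler` allowlist (the allowlist contents here are illustrative, not PyTorch's actual list):

```python
import io
import pickle

payload = pickle.dumps(complex(1, 2))  # complex stands in for a non-allowlisted class

class AllowlistUnpickler(pickle.Unpickler):
    ALLOWED = {("builtins", "list"), ("builtins", "dict")}

    def find_class(self, module, name):
        # Mirrors the weights_only=True idea: reject anything off the allowlist.
        if (module, name) not in self.ALLOWED:
            raise pickle.UnpicklingError(f"{module}.{name} is not allowlisted")
        return super().find_class(module, name)

try:
    AllowlistUnpickler(io.BytesIO(payload)).load()
    blocked = False
except pickle.UnpicklingError:
    blocked = True

print(blocked)                # True: the restricted load rejects the class
print(pickle.loads(payload))  # the unrestricted load succeeds
```

The trade-off in the test above is the usual one: `weights_only=False` restores arbitrary classes but should only be used on files from a trusted source.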
