Commit 1a246d5

Fix CI failures and address CodeRabbit review comments

CI fixes:
- Apply black --skip-magic-trailing-comma formatting
- Fix isort ordering in monai/data/__init__.py and test files
- Rename D/S to dynamic/static (ruff N806)
- Skip torch.export tests on PyTorch < 2.6 (test_export_save early return)
- Skip TestConvertToExport class on PyTorch < 2.6
- Skip test_export_with_dynamic_shapes on PyTorch < 2.9
- Revert incompatible networks to test_script_save (regunet, basic_unet, fpn, globalnet, voxelmorph, basic_unetplusplus, wasserstein_dice, masked_loss)
- Switch test_varnet to test_export_save (CoilSensitivityModel changes broke TorchScript compat)
- Add test_bundle_export_checkpoint to min_tests skip list
- Merge dev into torchscript_deprecation

CodeRabbit review fixes:
- Fix grammar in mb_specification.rst
- Document map_location ignored for .pt2 in mmars.py
- Make load_ts_module and load_exported_module mutually exclusive
- Handle meta_file as Sequence in export_checkpoint via ensure_tuple
- Add output length check before zip in convert_to_export verify
- Add output length check before zip in convert_to_trt verify
- Guard METADATA_FILENAME from being overwritten in save_exported_program
- Guard empty weight tensors in DiceLoss and FocalLoss
- Suppress deprecation warning in convert_to_trt fallback path
- Use save_exported_program for dynamo objects in _save_trt_model
- Sort __all__ in export_utils.py (RUF022)
- Add dict support to _recursive_to helper
- Fix stale comment in test_retinanet ONNX test
- Rename unused expected_shape param in test_varnet

Signed-off-by: Soumya Snigdha Kundu <soumya_snigdha.kundu@kcl.ac.uk>
1 parent 2e8a7ab commit 1a246d5

43 files changed: +387 additions, -142 deletions

.dockerignore

Lines changed: 5 additions & 1 deletion
@@ -3,11 +3,15 @@
 __pycache__/
 docs/
 
+.vscode
+.git
+.mypy_cache
+.ruff_cache
+.pytype
 .coverage
 .coverage.*
 .coverage/
 coverage.xml
 .readthedocs.yml
-*.toml
 
 !README.md

CONTRIBUTING.md

Lines changed: 2 additions & 1 deletion
@@ -380,7 +380,8 @@ All code review comments should be specific, constructive, and actionable.
 
 ### Release a new version
 
-The `dev` branch's `HEAD` always corresponds to MONAI docker image's latest tag: `projectmonai/monai:latest`.
+The `dev` branch's `HEAD` always corresponds to MONAI Docker image's latest tag: `projectmonai/monai:latest`. (No
+release is currently done for the slim MONAI image, this is built locally by users.)
 The `main` branch's `HEAD` always corresponds to the latest MONAI milestone release.
 
 When major features are ready for a milestone, to prepare for a new release:

Dockerfile.slim

Lines changed: 93 additions & 0 deletions
@@ -0,0 +1,93 @@
+# Copyright (c) MONAI Consortium
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#     http://www.apache.org/licenses/LICENSE-2.0
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# This is a slimmed down version of the MONAI Docker image using a smaller base image and multi-stage building. Not all
+# NVIDIA tools will be present but all libraries and compiled code are included. This image isn't provided through
+# Dockerhub so users must build locally: `docker build -t monai_slim -f Dockerfile.slim .`
+# Containers may require more shared memory, eg.: `docker run -ti --rm --gpus all --shm-size=10gb monai_slim /bin/bash`
+
+ARG IMAGE=debian:12-slim
+
+FROM ${IMAGE} AS build
+
+ARG TORCH_CUDA_ARCH_LIST="7.5 8.0 8.6 8.9 9.0+PTX"
+
+ENV DEBIAN_FRONTEND=noninteractive
+ENV APT_INSTALL="apt install -y --no-install-recommends"
+
+RUN apt update && apt upgrade -y && \
+    ${APT_INSTALL} ca-certificates python3-pip python-is-python3 git wget libopenslide0 unzip python3-dev && \
+    wget https://developer.download.nvidia.com/compute/cuda/repos/debian12/x86_64/cuda-keyring_1.1-1_all.deb && \
+    dpkg -i cuda-keyring_1.1-1_all.deb && \
+    apt update && \
+    ${APT_INSTALL} cuda-toolkit-12 && \
+    rm -rf /usr/lib/python*/EXTERNALLY-MANAGED /var/lib/apt/lists/* && \
+    python -m pip install --upgrade --no-cache-dir --no-build-isolation pip
+
+# TODO: remark for issue [revise the dockerfile](https://github.com/zarr-developers/numcodecs/issues/431)
+RUN if [[ $(uname -m) =~ "aarch64" ]]; then \
+        CFLAGS="-O3" DISABLE_NUMCODECS_SSE2=true DISABLE_NUMCODECS_AVX2=true python -m pip install numcodecs; \
+    fi
+
+# NGC Client
+WORKDIR /opt/tools
+ARG NGC_CLI_URI="https://ngc.nvidia.com/downloads/ngccli_linux.zip"
+RUN wget -q ${NGC_CLI_URI} && unzip ngccli_linux.zip && chmod u+x ngc-cli/ngc && \
+    find ngc-cli/ -type f -exec md5sum {} + | LC_ALL=C sort | md5sum -c ngc-cli.md5 && \
+    rm -rf ngccli_linux.zip ngc-cli.md5
+
+WORKDIR /opt/monai
+
+# copy relevant parts of repo
+COPY requirements.txt requirements-min.txt requirements-dev.txt versioneer.py setup.py setup.cfg pyproject.toml ./
+COPY LICENSE CHANGELOG.md CODE_OF_CONDUCT.md CONTRIBUTING.md README.md MANIFEST.in runtests.sh ./
+COPY tests ./tests
+COPY monai ./monai
+
+# install full deps
+RUN python -m pip install --no-cache-dir --no-build-isolation -r requirements-dev.txt
+
+# compile ext
+RUN CUDA_HOME=/usr/local/cuda FORCE_CUDA=1 USE_COMPILED=1 BUILD_MONAI=1 python setup.py develop
+
+# recreate the image without the installed CUDA packages then copy the installed MONAI and Python directories
+FROM ${IMAGE} AS build2
+
+ENV DEBIAN_FRONTEND=noninteractive
+ENV APT_INSTALL="apt install -y --no-install-recommends"
+
+RUN apt update && apt upgrade -y && \
+    ${APT_INSTALL} ca-certificates python3-pip python-is-python3 git libopenslide0 && \
+    apt clean && \
+    rm -rf /usr/lib/python*/EXTERNALLY-MANAGED /var/lib/apt/lists/* && \
+    python -m pip install --upgrade --no-cache-dir --no-build-isolation pip
+
+COPY --from=build /opt/monai /opt/monai
+COPY --from=build /opt/tools /opt/tools
+ARG PYTHON_VERSION=3.11
+COPY --from=build /usr/local/lib/python${PYTHON_VERSION}/dist-packages /usr/local/lib/python${PYTHON_VERSION}/dist-packages
+COPY --from=build /usr/local/bin /usr/local/bin
+
+RUN rm -rf /opt/monai/build /opt/monai/monai.egg-info && \
+    find /opt /usr/local/lib -type d -name __pycache__ -exec rm -rf {} +
+
+# flatten all layers down to one
+FROM ${IMAGE}
+LABEL maintainer="monai.contact@gmail.com"
+
+COPY --from=build2 / /
+
+WORKDIR /opt/monai
+
+ENV PATH=${PATH}:/opt/tools:/opt/tools/ngc-cli
+ENV POLYGRAPHY_AUTOINSTALL_DEPS=1
+ENV CUDA_HOME=/usr/local/cuda
+ENV BUILD_MONAI=1
README.md

Lines changed: 8 additions & 0 deletions
@@ -61,6 +61,14 @@ Examples and notebook tutorials are located at [Project-MONAI/tutorials](https://github.com/Project-MONAI/tutorials).
 
 Technical documentation is available at [docs.monai.io](https://docs.monai.io).
 
+## Docker
+
+The MONAI Docker image is available from [Dockerhub](https://hub.docker.com/r/projectmonai/monai),
+tagged as `latest` for the latest state of `dev` or with a release version. A slimmed down image can also be built
+locally using `Dockerfile.slim`, see that file for instructions.
+
+To get started with the latest MONAI, use `docker run -ti --rm --gpus all projectmonai/monai:latest /bin/bash`.
+
 ## Citation
 
 If you have used MONAI in your research, please cite us! The citation can be exported from: <https://arxiv.org/abs/2211.02701>.

docs/source/mb_specification.rst

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ MONAI Bundle Specification
 Overview
 ========
 
-This is the specification for the MONAI Bundle (MB) format of portable described deep learning models. The objective of a MB is to define a packaged network or model which includes the critical information necessary to allow users and programs to understand how the model is used and for what purpose. A bundle includes the stored weights of a single network as a pickled state dictionary plus optionally an exported program (``.pt2``, via ``torch.export``) and/or an ONNX object. Additional JSON files are included to store metadata about the model, information for constructing training, inference, and post-processing transform sequences, plain-text description, legal information, and other data the model creator wishes to include.
+This is the specification for the MONAI Bundle (MB) format of portable deep learning models. The objective of a MB is to define a packaged network or model which includes the critical information necessary to allow users and programs to understand how the model is used and for what purpose. A bundle includes the stored weights of a single network as a pickled state dictionary plus optionally an exported program (``.pt2``, via ``torch.export``) and/or an ONNX object. Additional JSON files are included to store metadata about the model, information for constructing training, inference, and post-processing transform sequences, plain-text description, legal information, and other data the model creator wishes to include.
 
 This specification defines the directory structure a bundle must have and the necessary files it must contain. Additional files may be included and the directory packaged into a zip file or included as extra files directly in the exported archive.

monai/apps/mmars/mmars.py

Lines changed: 1 addition & 0 deletions
@@ -206,6 +206,7 @@ def load_from_mmar(
         progress: whether to display a progress bar when downloading the content.
         version: version number of the MMAR. Set it to `-1` to use `item[Keys.VERSION]`.
         map_location: pytorch API parameter for ``torch.load`` or ``torch.jit.load`` (legacy ``.ts`` files).
+            Ignored when loading ``.pt2`` (ExportedProgram) files.
         pretrained: whether to load the pretrained weights after initializing a network module.
         weights_only: whether to load only the weights instead of initializing the network module and assign weights.
         model_key: a key to search in the model file or config file for the model dictionary.
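
The docstring note above reflects an extension-based dispatch between loaders. A hypothetical sketch of that selection logic (the helper name and string return values are illustrative only, not MONAI API):

```python
import os


def pick_loader(model_file: str) -> str:
    # Hypothetical helper mirroring the dispatch the docstring describes:
    # map_location is only meaningful on the torch.load / torch.jit.load paths.
    ext = os.path.splitext(model_file)[1]
    if ext == ".pt2":
        return "torch.export.load"  # map_location is ignored on this path
    if ext == ".ts":
        return "torch.jit.load"  # legacy TorchScript; map_location forwarded
    return "torch.load"  # plain state dict; map_location forwarded
```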

monai/apps/vista3d/inferer.py

Lines changed: 2 additions & 2 deletions
@@ -86,13 +86,13 @@ def point_based_window_inferer(
         for j in range(len(ly_)):
             for k in range(len(lz_)):
                 lx, rx, ly, ry, lz, rz = (lx_[i], rx_[i], ly_[j], ry_[j], lz_[k], rz_[k])
-                unravel_slice = [
+                unravel_slice = (
                     slice(None),
                     slice(None),
                     slice(int(lx), int(rx)),
                     slice(int(ly), int(ry)),
                     slice(int(lz), int(rz)),
-                ]
+                )
                 batch_image = image[unravel_slice]
                 output = predictor(
                     batch_image,
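
The list-to-tuple change above is meaningful: a tuple of slice objects is standard basic indexing, whereas a non-tuple sequence of slices is an ambiguous form that NumPy deprecated and later removed, and that tracing-based exporters tend to handle poorly. A small NumPy illustration of the tuple form (NumPy used here only as a stand-in for the tensor indexing semantics):

```python
import numpy as np

x = np.arange(24).reshape(2, 3, 4)

# A tuple of slice objects performs ordinary multidimensional (basic) indexing.
sl = (slice(None), slice(0, 2), slice(1, 3))
print(x[sl].shape)  # (2, 2, 2)
```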

monai/bundle/scripts.py

Lines changed: 15 additions & 12 deletions
@@ -719,6 +719,9 @@ def load(
     net_override = {} if net_override is None else net_override
     copy_model_args = {} if copy_model_args is None else copy_model_args
 
+    if load_ts_module and load_exported_module:
+        raise ValueError("load_ts_module and load_exported_module are mutually exclusive.")
+
     if load_ts_module:
         warnings.warn(
             "load_ts_module is deprecated since v1.5 and will be removed in v1.7. "
@@ -788,16 +791,12 @@ def load(
             )
         else:
             warnings.warn(
-                f"Cannot find the config file: {bundle_config_file}, return state dict instead.",
-                stacklevel=2,
+                f"Cannot find the config file: {bundle_config_file}, return state dict instead.", stacklevel=2
             )
             return model_dict
         if _workflow is not None:
             if not hasattr(_workflow, "network_def"):
-                warnings.warn(
-                    "No available network definition in the bundle, return state dict instead.",
-                    stacklevel=2,
-                )
+                warnings.warn("No available network definition in the bundle, return state dict instead.", stacklevel=2)
                 return model_dict
             else:
                 model = _workflow.network_def
@@ -1698,8 +1697,9 @@ def export_checkpoint(
     parser = ConfigParser()
     parser.read_config(f=config_file_)
     meta_file_ = os.path.join(bundle_root, "configs", "metadata.json") if meta_file_ is None else meta_file_
-    if os.path.exists(meta_file_):
-        parser.read_meta(f=meta_file_)
+    for mf in ensure_tuple(meta_file_):
+        if os.path.exists(mf):
+            parser.read_meta(f=mf)
 
     for k, v in _args.items():
         parser[k] = v
@@ -1906,10 +1906,13 @@ def trt_export(
     converter_kwargs_.update(trt_api_parameters)
 
     def _save_trt_model(trt_obj, filepath, **kwargs):
-        """Save TRT model without triggering deprecation warnings from internal calls."""
-        with warnings.catch_warnings():
-            warnings.filterwarnings("ignore", category=FutureWarning, message=".*save_net_with_metadata.*")
-            save_net_with_metadata(trt_obj, filepath, include_config_vals=False, append_timestamp=False, **kwargs)
+        """Save TRT model, using the appropriate format for dynamo vs JIT objects."""
+        if isinstance(trt_obj, torch.export.ExportedProgram):
+            save_exported_program(trt_obj, filepath, include_config_vals=False, append_timestamp=False, **kwargs)
+        else:
+            with warnings.catch_warnings():
+                warnings.filterwarnings("ignore", category=FutureWarning, message=".*save_net_with_metadata.*")
+                save_net_with_metadata(trt_obj, filepath, include_config_vals=False, append_timestamp=False, **kwargs)
 
     _export(
         convert_to_trt,
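
The `export_checkpoint` hunk relies on `ensure_tuple` so that `meta_file` may be either a single path or a sequence of paths. A minimal sketch of the behavior being relied on (MONAI's real `monai.utils.ensure_tuple` handles more input types than this):

```python
def ensure_tuple(vals):
    # Minimal sketch: wrap a lone value in a tuple; pass sequences through as a tuple.
    if isinstance(vals, (list, tuple)):
        return tuple(vals)
    return (vals,)


print(ensure_tuple("configs/metadata.json"))  # ('configs/metadata.json',)
print(ensure_tuple(["a.json", "b.json"]))  # ('a.json', 'b.json')
```

Normalizing up front lets the subsequent `for mf in ensure_tuple(meta_file_):` loop treat both call styles identically.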

monai/data/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -47,6 +47,7 @@
     load_decathlon_datalist,
     load_decathlon_properties,
 )
+from .export_utils import load_exported_program, save_exported_program
 from .folder_layout import FolderLayout, FolderLayoutBase
 from .grid_dataset import GridPatchDataset, PatchDataset, PatchIter, PatchIterd
 from .image_dataset import ImageDataset
@@ -77,7 +78,6 @@
 from .test_time_augmentation import TestTimeAugmentation
 from .thread_buffer import ThreadBuffer, ThreadDataLoader
 from .torchscript_utils import load_net_with_metadata, save_net_with_metadata
-from .export_utils import load_exported_program, save_exported_program
 from .utils import (
     affine_to_spacing,
     compute_importance_map,

monai/data/export_utils.py

Lines changed: 4 additions & 3 deletions
@@ -24,7 +24,7 @@
 from monai.data.torchscript_utils import METADATA_FILENAME
 from monai.utils import ExportMetadataKeys
 
-__all__ = ["save_exported_program", "load_exported_program"]
+__all__ = ["load_exported_program", "save_exported_program"]
 
 
 def save_exported_program(
@@ -76,6 +76,8 @@ def save_exported_program(
     extra_files: dict[str, Any] = {METADATA_FILENAME: json_data}
 
     if more_extra_files is not None:
+        if METADATA_FILENAME in more_extra_files:
+            raise ValueError(f"'{METADATA_FILENAME}' is reserved and cannot be used in more_extra_files.")
         extra_files.update(more_extra_files)
 
     # torch.export.save requires str values; decode bytes from legacy callers (e.g. _export helper)
@@ -96,8 +98,7 @@ def save_exported_program(
 
 
 def load_exported_program(
-    filename_prefix_or_stream: str | os.PathLike | IO[bytes],
-    more_extra_files: Sequence[str] = (),
+    filename_prefix_or_stream: str | os.PathLike | IO[bytes], more_extra_files: Sequence[str] = ()
 ) -> tuple[torch.export.ExportedProgram, dict, dict]:
     """
     Load an ``ExportedProgram`` from a ``.pt2`` file and extract stored JSON metadata.
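
The reservation guard added in `save_exported_program` can be exercised in isolation. This standalone sketch reproduces just the merge logic from the hunk above; the placeholder constant value is assumed, since the real one lives in `monai.data.torchscript_utils`:

```python
METADATA_FILENAME = "meta.json"  # placeholder value; the actual constant comes from monai.data.torchscript_utils


def merge_extra_files(json_data, more_extra_files=None):
    # Mirrors the guard above: caller-supplied extra files must not
    # silently clobber the reserved metadata entry.
    extra_files = {METADATA_FILENAME: json_data}
    if more_extra_files is not None:
        if METADATA_FILENAME in more_extra_files:
            raise ValueError(f"'{METADATA_FILENAME}' is reserved and cannot be used in more_extra_files.")
        extra_files.update(more_extra_files)
    return extra_files
```

Raising instead of overwriting means a caller who accidentally reuses the reserved name gets an immediate, explicit error rather than a corrupted archive.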
