
# Commit a748a59

Merge branch 'dev' into 8328-nnunet-bundle-integration

2 parents c90e960 + d388d1c

15 files changed: 544 additions & 123 deletions

## .github/workflows/weekly-preview.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -66,7 +66,7 @@ jobs:
 export YEAR_WEEK=$(date +'%y%U')
 echo "Year week for tag is ${YEAR_WEEK}"
 if ! [[ $YEAR_WEEK =~ ^[0-9]{4}$ ]] ; then echo "Wrong 'year week' format. Should be 4 digits."; exit 1 ; fi
-git tag "1.5.dev${YEAR_WEEK}"
+git tag "1.6.dev${YEAR_WEEK}"
 git log -1
 git tag --list
 python setup.py sdist bdist_wheel
```
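The tag-format guard in the workflow above can be mirrored in Python for local testing; a minimal sketch (the `weekly_tag` helper is hypothetical, not part of the repository):

```python
import re
from datetime import date

def weekly_tag(version_prefix: str, d: date) -> str:
    # Mirror of the workflow: two-digit year + two-digit week-of-year (%U).
    year_week = d.strftime("%y%U")
    # The workflow rejects anything that is not exactly four digits.
    if not re.fullmatch(r"[0-9]{4}", year_week):
        raise ValueError("Wrong 'year week' format. Should be 4 digits.")
    return f"{version_prefix}.dev{year_week}"

print(weekly_tag("1.6", date(2025, 6, 13)))  # a tag of the form 1.6.devYYWW
```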

## CHANGELOG.md

Lines changed: 99 additions & 1 deletion

```diff
@@ -5,6 +5,103 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ## [Unreleased]
 
+## [1.5.0] - 2025-06-13
+
+## What's Changed
+### Added
+* Add platform-specific constraints to setup.cfg (#8260)
+* Add PythonicWorkflow (#8151)
+* Add SM architecture version check (#8199)
+* Add MedNext implementation (#8004)
+* Added a top button to CONSTRIBUTING.md (#8163)
+* Adding CODEOWNERS (#8457)
+* Restormer Implementation (#8312)
+* Add rectified flow noise scheduler for accelerated diffusion model (#8374)
+* Add prediction type for rflow scheduler (#8386)
+* Add Average Precision to metrics (#8089)
+* Implementation of a Masked Autoencoder for representation learning (#8152)
+* Implement TorchIO transforms wrapper analogous to TorchVision transfo… (#7579)
+* 8328 nnunet bundle integration (#8329)
+* Adding Support Policy + Doc Updates (#8458)
+* Classifier free guidance (#8460)
+
+### Fixed
+* Fix Ruff Numpy2 deprecation rules (#8179)
+* Fix `torch.load()` frequently warning in PersistentDataset and GDSDataset (#8177)
+* Fix the logging of a nested dictionary metric in MLflow (#8169)
+* Fix ImageFilter to allow Gaussian filter without filter_size (#8189)
+* Fix fold_constants, test_handler switched to onnx (#8211)
+* Fix TypeError in meshgrid (#8252)
+* Fix PatchMerging duplicate merging (#8285)
+* Fix test load image issue (#8297)
+* Fix bundle download error from ngc source (#8307)
+* Fix deprecated usage in zarr (#8313, #8477)
+* Fix DataFrame subsets indexing in CSVDataset() (#8351)
+* Fix `packaging` imports in version comparison logic (#8347)
+* Fix CommonKeys docstring (#8342)
+* Fix: correctly apply fftshift to real-valued data inputs (#8407)
+* Fix OptionalImportError: required package `openslide` is not installed (#8419)
+* Fix cosine noise scheduler (#8427)
+* Fix AutoencoderKL docstrings. (#8445)
+* Inverse Threading Fix (#8418)
+* Fix normalize intensity (#8286)
+* Fix path at test onnx trt export (#8361)
+* Fix broken urls (#8481, #8483)
+
+### Changed
+* [DOC] Update README.md (#8157)
+* Streamlined Rearrange in SpatialAttentionBlock (#8130)
+* Optimize VISTA3D (#8123)
+* Skip torch trt convert test with torch newer than or equal to 2.5.0 (#8165)
+* Enable redirection of all loggers by configuring a FileHandler within the bundle (#8142)
+* Apply pyupgrade fixes for Python 3.9+ syntax (#8150)
+* Update base image to 2410 (#8164)
+* TRT support for MAISI (#8153)
+* 8134 Add unit test for responsive inference (#8146)
+* SwinUNETR refactor to accept additional parameters (#8212)
+* Allow an arbitrary mask to be used in the self attention (#8235)
+* Bump codecov/codecov-action from 4 to 5 (#8245)
+* Docs: update brats classes description (#8246)
+* Change default value of `patch_norm` to False in `SwinUNETR` (#8249)
+* Modify Dice, Jaccard and Tversky losses (#8138)
+* Modify Workflow to Allow IterableDataset Inputs (#8263)
+* Enhance download_and_extract (#8216)
+* Relax gpu load check (#8282, #8275)
+* Using LocalStore in Zarr v3 (#8299)
+* Enable gpu load nifti (#8188)
+* update pydicom reader to enable gpu load (#8283)
+* Zarr compression tests only with versions before 3.0 (#8319)
+* Changing utils.py to test_utils.py (#8335)
+* Refactor testd (#8231)
+* Recursive Item Mapping for Nested Lists in Compose (#8187)
+* Bump min torch to 1.13.1 to mitigate CVE-2022-45907 unsafe usage of eval (#8296)
+* Inferer modification - save_intermediates clashes with latent shape adjustment in latent diffusion inferers (#8343)
+* Solves path problem in test_bundle_trt_export.py (#8357)
+* Modify ControlNet inferer so that it takes in context when the diffus… (#8360)
+* Update monaihosting download method (#8364)
+* Bump torch minimum to mitigate CVE-2024-31580 & CVE-2024-31583 and enable numpy 2 compatibility (#8368)
+* Auto3DSeg algo_template hash update (#8378)
+* Enable Pytorch 2.6 (#8309)
+* Auto3DSeg algo_template hash update (#8393, #8397)
+* Update Dice Metric Docs (#8388)
+* Auto3DSeg algo_template hash update (#8406)
+* Update bundle download API (#8403)
+* Add Skip test in TestTranschex (#8416)
+* Update get latest bundle version function (#8420)
+* Temporarily Restrict setuptools Version to 79.0.1 (#8441)
+* Update default overlap value in occlusion_sensitivity to 0.6 (#8446)
+* Enable code coverage comments on PRs in codecov configuration (#8402)
+* Migrate to modern Python Logger API (#8449)
+
+### Deprecated
+### Removed
+* Remove deprecated functionality for v1.5 (#8430)
+* Remove deprecated `return_state_dict` in bundle `load` (#8454)
+* Remove deprecated `net_name` in test file (#8461)
+* Remove unused test cases in bundle load (#8463)
+* selfattention block: Remove the fc linear layer if it is not used (#8325)
+* Removed outdated `torch` version checks from transform functions (#8359)
+
 ## [1.4.0] - 2024-10-17
 ## What's Changed
 ### Added
@@ -1132,7 +1229,8 @@ the postprocessing steps should be used before calling the metrics methods
 
 [highlights]: https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md
 
-[Unreleased]: https://github.com/Project-MONAI/MONAI/compare/1.4.0...HEAD
+[Unreleased]: https://github.com/Project-MONAI/MONAI/compare/1.5.0...HEAD
+[1.5.0]: https://github.com/Project-MONAI/MONAI/compare/1.4.0...1.5.0
 [1.4.0]: https://github.com/Project-MONAI/MONAI/compare/1.3.2...1.4.0
 [1.3.2]: https://github.com/Project-MONAI/MONAI/compare/1.3.1...1.3.2
 [1.3.1]: https://github.com/Project-MONAI/MONAI/compare/1.3.0...1.3.1
```

## CITATION.cff

Lines changed: 2 additions & 2 deletions

```diff
@@ -6,8 +6,8 @@ title: "MONAI: Medical Open Network for AI"
 abstract: "AI Toolkit for Healthcare Imaging"
 authors:
 - name: "MONAI Consortium"
-date-released: 2024-10-17
-version: "1.4.0"
+date-released: 2025-06-13
+version: "1.5.0"
 identifiers:
 - description: "This DOI represents all versions of MONAI, and will always resolve to the latest one."
 type: doi
```

## README.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -54,7 +54,7 @@ Please refer to [the installation guide](https://docs.monai.io/en/latest/install
 
 ## Getting Started
 
-[MedNIST demo](https://colab.research.google.com/drive/1wy8XUSnNWlhDNazFdvGBHLfdkGvOHBKe) and [MONAI for PyTorch Users](https://colab.research.google.com/drive/1boqy7ENpKrqaJoxFlbHIBnIODAs1Ih1T) are available on Colab.
+[MedNIST demo](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/main/2d_classification/mednist_tutorial.ipynb) and [MONAI for PyTorch Users](https://colab.research.google.com/github/Project-MONAI/tutorials/blob/main/modules/developer_guide.ipynb) are available on Colab.
 
 Examples and notebook tutorials are located at [Project-MONAI/tutorials](https://github.com/Project-MONAI/tutorials).
 
@@ -75,7 +75,7 @@ For guidance on making a contribution to MONAI, see the [contributing guidelines
 
 ## Community
 
-Join the conversation on Twitter/X [@ProjectMONAI](https://twitter.com/ProjectMONAI) or join our [Slack channel](https://forms.gle/QTxJq3hFictp31UM9).
+Join the conversation on Twitter/X [@ProjectMONAI](https://twitter.com/ProjectMONAI), [LinkedIn](https://www.linkedin.com/company/projectmonai), or join our [Slack channel](https://forms.gle/QTxJq3hFictp31UM9).
 
 Ask and answer questions over on [MONAI's GitHub Discussions tab](https://github.com/Project-MONAI/MONAI/discussions).
```

## docs/images/maisi_infer.png

128 KB (binary image file)

## docs/requirements.txt

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
--f https://download.pytorch.org/whl/cpu/torch-2.3.0%2Bcpu-cp39-cp39-linux_x86_64.whl
+-f https://download.pytorch.org/whl/cpu/torch-2.4.1%2Bcpu-cp39-cp39-linux_x86_64.whl
 torch>=2.4.1, <2.7.0
 pytorch-ignite==0.4.11
 numpy>=1.20
```
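The new pin `torch>=2.4.1, <2.7.0` can be checked with plain tuple comparison; a minimal sketch (the `in_pinned_range` helper is hypothetical and ignores pre-release and local version segments):

```python
def in_pinned_range(version: str, lo: str = "2.4.1", hi: str = "2.7.0") -> bool:
    # Compare dotted versions numerically, e.g. (2, 6, 0) < (2, 7, 0).
    def as_tuple(v: str) -> tuple:
        return tuple(int(part) for part in v.split("."))
    return as_tuple(lo) <= as_tuple(version) < as_tuple(hi)

print(in_pinned_range("2.6.0"))  # → True  (inside the pin)
print(in_pinned_range("2.7.0"))  # → False (upper bound is exclusive)
```

For real requirement strings, `packaging.specifiers.SpecifierSet` handles the full PEP 440 grammar; the sketch above only covers simple dotted versions.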

## docs/source/modules.md

Lines changed: 4 additions & 5 deletions

````diff
@@ -123,8 +123,7 @@ MONAI implements reference networks with the aim of both flexibility and code re
 
 Network layers and blocks are in general implemented to be compatible with spatial 1D, 2D and 3D inputs.
 Users can easily integrate the layers, blocks and networks as part of their customised pipelines.
-Various utilities are provided to leverage the existing model weights, e.g., finetuning [from MMAR](https://github.com/Project-MONAI/tutorials/blob/main/modules/transfer_mmar.ipynb)
-or [from a bundle in MONAI model-zoo](https://github.com/Project-MONAI/tutorials/tree/main/model_zoo).
+Various utilities are provided to leverage the existing model weights, e.g. [from a bundle in MONAI model-zoo](https://github.com/Project-MONAI/tutorials/tree/main/model_zoo).
 
 ### C++/CUDA optimized modules
 
@@ -231,8 +230,8 @@ A typical process of `decollate batch` is illustrated as follows (with a `batch_
 
 Except for the pytorch-ignite based `monai.engines`, most of the MONAI modules could be used independently or combined
 with other software packages. For example, MONAI can be easily integrated into popular frameworks such as
-PyTorch-Lightning and Catalyst. [[Lightning segmentation](https://github.com/Project-MONAI/tutorials/blob/main/3d_segmentation/spleen_segmentation_3d_lightning.ipynb),
-[Catalyst segmentation](https://github.com/Project-MONAI/tutorials/blob/main/3d_segmentation/unet_segmentation_3d_catalyst.ipynb)]
+[PyTorch-Lightning](https://github.com/Project-MONAI/tutorials/blob/main/3d_segmentation/spleen_segmentation_3d_lightning.ipynb)
+and [MLflow](https://github.com/Project-MONAI/tutorials/blob/main/experiment_management/spleen_segmentation_mlflow.ipynb).
 
 ## Bundle
 
@@ -264,7 +263,7 @@ A typical bundle example can include:
 ┗━ *license.txt
 ```
 Details about the bundle config definition and syntax & examples are at [config syntax](https://docs.monai.io/en/latest/config_syntax.html).
-A step-by-step [get started](https://github.com/Project-MONAI/tutorials/blob/main/bundle/get_started.md) tutorial notebook can help users quickly set up a bundle. [[bundle examples](https://github.com/Project-MONAI/tutorials/tree/main/bundle), [model-zoo](https://github.com/Project-MONAI/model-zoo)]
+A step-by-step [get started](https://github.com/Project-MONAI/tutorials/blob/main/bundle/README.md) tutorial notebook can help users quickly set up a bundle. [[bundle examples](https://github.com/Project-MONAI/tutorials/tree/main/bundle), [model-zoo](https://github.com/Project-MONAI/model-zoo)]
 
 ## Federated Learning
````
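The `decollate batch` process mentioned in the hunk above turns one batch-wise dict into per-sample dicts; a minimal pure-Python sketch of the idea (not MONAI's `monai.data.decollate_batch` implementation, which also handles tensors and nested structures):

```python
def decollate_batch(batch: dict) -> list:
    # Split a dict of equal-length sequences into one dict per sample,
    # so post-processing transforms can run independently per item.
    n = len(next(iter(batch.values())))
    return [{key: values[i] for key, values in batch.items()} for i in range(n)]

batch = {"image": ["img0", "img1"], "label": [0, 1]}
print(decollate_batch(batch))
# → [{'image': 'img0', 'label': 0}, {'image': 'img1', 'label': 1}]
```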

## docs/source/whatsnew.rst

Lines changed: 1 addition & 0 deletions

```diff
@@ -6,6 +6,7 @@ What's New
 .. toctree::
 :maxdepth: 1
 
+whatsnew_1_5.md
 whatsnew_1_4.md
 whatsnew_1_3.md
 whatsnew_1_2.md
```

## docs/source/whatsnew_1_4.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-# What's new in 1.4 🎉🎉
+# What's new in 1.4
 
 - MAISI: state-of-the-art 3D Latent Diffusion Model
 - VISTA-3D: interactive foundation model for segmenting and anotating human anatomies
```

## docs/source/whatsnew_1_5.md (new file)

Lines changed: 56 additions & 0 deletions

# What's new in 1.5 🎉🎉

- Support for NumPy 2.x and PyTorch 2.6
- Accelerated MAISI inference
- Bundle storage moved to Hugging Face, with the corresponding APIs updated in core
- Ported remaining generative tutorials and bundles
- New tutorials:
  - [2d_regression/image_restoration.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/2d_regression/image_restoration.ipynb)
  - [generation/2d_diffusion_autoencoder/2d_diffusion_autoencoder_tutorial.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/generation/2d_diffusion_autoencoder/2d_diffusion_autoencoder_tutorial.ipynb)
  - [generation/3d_ddpm/3d_ddpm_tutorial.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/generation/3d_ddpm/3d_ddpm_tutorial.ipynb)
  - [generation/classifier_free_guidance/2d_ddpm_classifier_free_guidance_tutorial.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/generation/classifier_free_guidance/2d_ddpm_classifier_free_guidance_tutorial.ipynb)
  - [hugging_face/finetune_vista3d_for_hugging_face_pipeline.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/hugging_face/finetune_vista3d_for_hugging_face_pipeline.ipynb)
  - [hugging_face/hugging_face_pipeline_for_monai.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/hugging_face/hugging_face_pipeline_for_monai.ipynb)
  - [modules/omniverse/omniverse_integration.ipynb](https://github.com/Project-MONAI/tutorials/blob/main/modules/omniverse/omniverse_integration.ipynb)
- New bundles:
  - [models/cxr_image_synthesis_latent_diffusion_model](https://github.com/Project-MONAI/model-zoo/blob/dev/models/cxr_image_synthesis_latent_diffusion_model)
  - [models/mednist_ddpm](https://github.com/Project-MONAI/model-zoo/blob/dev/models/mednist_ddpm)
  - [models/brain_image_synthesis_latent_diffusion_model](https://github.com/Project-MONAI/model-zoo/blob/dev/models/brain_image_synthesis_latent_diffusion_model)
  - [hf_models/exaonepath-crc-msi-predictor](https://github.com/Project-MONAI/model-zoo/blob/dev/hf_models/exaonepath-crc-msi-predictor)
- All existing bundles are also now [hosted on Hugging Face](https://huggingface.co/MONAI)!

## Supported Dependency Versions

This release adds support for NumPy 2.0 and PyTorch 2.6. We plan to add support for PyTorch 2.7 in an upcoming version once some compatibility issues have been addressed.

As stated in the updated [README.md](https://github.com/Project-MONAI/MONAI/blob/main/README.md) file, MONAI's policy for the support of dependency versions has been updated for clarity.

MONAI will continue to support [currently supported versions of Python](https://devguide.python.org/versions), and for other dependencies the following applies:

* Major releases of MONAI will state the dependency versions they support. The current state of the `dev` branch in this repository is the unreleased development version of MONAI, which typically supports current versions of dependencies and includes updates and bug fixes to do so.
* PyTorch support covers [the current version](https://github.com/pytorch/pytorch/releases) plus the three previous minor versions. If compatibility issues arise between a PyTorch version and other dependencies, support for that version may be delayed until a major release.
* Our support policy for other dependencies adheres for the most part to [SPEC0](https://scientific-python.org/specs/spec-0000), where dependency versions are supported where possible for up to two years. Discovered vulnerabilities or defects may require certain versions to be explicitly unsupported.
* See the `requirements*.txt` files for dependency version information.
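The PyTorch support window stated above ("the current version plus the three previous minor versions") can be made concrete with a small helper; `supported_torch_minors` is an illustrative name, not a MONAI API, and it does not handle windows that cross a major-version boundary:

```python
def supported_torch_minors(current: str, n_prev: int = 3) -> list:
    # "2.6" with three previous minors → 2.6, 2.5, 2.4, 2.3.
    major, minor = (int(part) for part in current.split("."))
    lowest = max(minor - n_prev, 0)
    return [f"{major}.{m}" for m in range(minor, lowest - 1, -1)]

print(supported_torch_minors("2.6"))  # → ['2.6', '2.5', '2.4', '2.3']
```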
## MAISI Update: Introducing MAISI Version maisi3d-rflow

![maisi](../images/maisi_infer.png)

We are excited to announce the release of MAISI version _maisi3d-rflow_. This update brings significant improvements over the previous version, _maisi3d-ddpm_, with a remarkable 33x acceleration in latent diffusion model inference speed. The MAISI VAE remains unchanged. Here are the key differences:

1. Scheduler update:
   * _maisi3d-ddpm_: uses the basic DDPM noise scheduler.
   * _maisi3d-rflow_: introduces the Rectified Flow scheduler, allowing diffusion model inference to be 33 times faster.
2. Training data preparation:
   * _maisi3d-ddpm_: requires training images to be labeled with body regions (specifically "top_region_index" and "bottom_region_index").
   * _maisi3d-rflow_: no such labeling is required, making it easier to prepare the training data.
3. Image quality:
   * For the released model weights, _maisi3d-rflow_ generates better-quality images for head regions and smaller output volumes compared to _maisi3d-ddpm_. For other regions, the image quality is comparable.
4. Modality input:
   * _maisi3d-rflow_ adds a new modality input to the diffusion model, offering flexibility for future extensions to other modalities. Currently, this input is set to always equal 1, as this version supports CT generation exclusively.
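The speedup comes from the geometry of rectified flow: because its probability paths are (near-)straight lines between noise and data, a few Euler steps of the learned velocity field suffice where DDPM needs many denoising steps. A 1-D toy sketch of that sampling loop (not MONAI's scheduler; the "model" here is the exact constant velocity of a straight path):

```python
# Toy rectified-flow sampling: integrate dx/dt = v(x, t) from t=1 (noise)
# back to t=0 (data) with plain Euler steps.
def euler_sample(x_noise: float, velocity, steps: int) -> float:
    x, dt = x_noise, 1.0 / steps
    for i in range(steps):
        t = 1.0 - i * dt
        x = x - dt * velocity(x, t)  # step against the noise->data flow direction
    return x

x_data, x_noise = 3.0, -1.0
velocity = lambda x, t: x_noise - x_data  # straight (rectified) path: constant velocity
print(euler_sample(x_noise, velocity, steps=5))  # ≈ 3.0 even with very few steps
```

For a straight path the Euler integration is exact at any step count, which is why rectified-flow models can trade hundreds of DDPM steps for a handful; real velocity fields are only approximately straight, so a small number of steps is used in practice.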
