Commit 4b1777f
Fix batch size broadcasting bug in GeneralizedWassersteinDiceLoss (#8744)
**Fixes #4650**
### Description
When `batch_size > 1`, `GeneralizedWassersteinDiceLoss` produces
incorrect loss values because of a tensor broadcasting issue in
`_compute_generalized_true_positive` and `_compute_denominator`.
After `torch.gather`, `alpha_extended` has shape `(B, 1, S)` while
`wasserstein_distance_map` has shape `(B, S)`. The element-wise multiply
silently broadcasts to `(B, B, S)`, which mixes values across batch
samples. This means the loss has always been wrong for any training run
with `batch_size > 1`.
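The silent broadcast can be reproduced in isolation. A minimal sketch (shapes only; the tensor names match the description above, the values are random stand-ins):

```python
import torch

B, S = 4, 10  # batch size, flattened spatial size

alpha_extended = torch.rand(B, 1, S)         # shape after torch.gather
wasserstein_distance_map = torch.rand(B, S)  # per-voxel distances

# PyTorch left-pads the rank-2 tensor to (1, B, S), so
# (B, 1, S) * (1, B, S) silently broadcasts to (B, B, S),
# pairing every sample's alpha with every other sample's distances:
mixed = alpha_extended * wasserstein_distance_map
print(mixed.shape)  # torch.Size([4, 4, 10])
```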
The fix follows the [reference
implementation](https://github.com/LucasFidon/GeneralizedWassersteinDiceLoss)
by the original paper's author — squeeze `dim=1` after the gather so
both tensors are `(B, S)`, and reduce with `dim=1` instead of `dim=[1,
2]`.
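In isolation, the corrected shape handling looks like this (a sketch, not the actual MONAI code):

```python
import torch

B, S = 4, 10
alpha_extended = torch.rand(B, 1, S)         # shape after torch.gather
wasserstein_distance_map = torch.rand(B, S)

# Squeeze the singleton dim so both operands are (B, S) ...
alpha = alpha_extended.squeeze(dim=1)

# ... then reduce over dim=1 only, giving one value per batch sample.
true_pos = (alpha * wasserstein_distance_map).sum(dim=1)  # shape (B,)

# Sanity check: each entry matches the same sample computed on its own.
expected = torch.stack(
    [(alpha_extended[i, 0] * wasserstein_distance_map[i]).sum() for i in range(B)]
)
assert torch.allclose(true_pos, expected)
```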
I also noticed that `reduction="none"` was broken (never had test
coverage) — it tried to reshape the per-sample loss `(B,)` into `(B, C,
1, ...)`, but GWDL aggregates over classes internally so the class
dimension doesn't exist in the output. Fixed that as well.
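The shape mismatch is easy to see in isolation (hypothetical shapes; `C` is the class count the old reshape assumed was still present in the output):

```python
import torch

B, C = 4, 3
per_sample_loss = torch.rand(B)  # GWDL output: classes already aggregated

# The old reduction="none" path tried to view the (B,) result with a
# class dimension, which cannot work: 4 elements won't fill (4, 3, 1).
reshape_failed = False
try:
    per_sample_loss.reshape(B, C, 1)
except RuntimeError:
    reshape_failed = True
print(reshape_failed)  # True
```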
### Changes
- `monai/losses/dice.py`: squeeze + dim fix in
`_compute_generalized_true_positive` and `_compute_denominator`; fixed
`reduction="none"` path
- `tests/losses/test_generalized_wasserstein_dice_loss.py`: two new
regression tests for batch consistency
### Tests
All existing tests pass. The new regression tests fail on unpatched code
and pass with the fix.
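The batch-consistency property behind the new regression tests can be sketched generically. The helper below is hypothetical (not the test file's actual code) and is demonstrated with a per-sample MSE stand-in rather than GWDL, so it runs without MONAI installed:

```python
import torch

def batch_consistent(loss_fn, pred, target, atol=1e-6):
    """A loss with reduction="none" should give the same per-sample values
    whether samples are passed together or one at a time."""
    batched = loss_fn(pred, target)  # expected shape: (B,)
    singles = torch.stack(
        [loss_fn(pred[i : i + 1], target[i : i + 1]).squeeze(0)
         for i in range(pred.shape[0])]
    )
    return torch.allclose(batched, singles, atol=atol)

# Stand-in loss: per-sample MSE, one scalar per batch element.
mse = lambda p, t: ((p - t) ** 2).flatten(1).mean(dim=1)
pred, target = torch.rand(4, 3, 8, 8), torch.rand(4, 3, 8, 8)
print(batch_consistent(mse, pred, target))  # True for a batch-safe loss
```

A loss with the broadcasting bug described above would fail this check whenever `B > 1`, which is exactly what makes it a useful regression test.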
---------
Signed-off-by: hongjie-qiu <77599736+hongjie-qiu@users.noreply.github.com>
Signed-off-by: Jeffrey Qiu <hongjie.qiu@gmail.com>
Co-authored-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
File tree (2 files changed: +91, −6 lines)
- monai/losses
- tests/losses