
Commit 59e7942

Remove grouping example
- The plan is to add it back in another PR
1 parent 1933dea commit 59e7942

3 files changed

Lines changed: 0 additions & 179 deletions


docs/source/examples/grouping.rst

Lines changed: 0 additions & 167 deletions
This file was deleted.

docs/source/examples/index.rst

Lines changed: 0 additions & 4 deletions
@@ -29,9 +29,6 @@ This section contains some usage examples for TorchJD.
 - :doc:`PyTorch Lightning Integration <lightning_integration>` showcases how to combine
   TorchJD with PyTorch Lightning, by providing an example implementation of a multi-task
   ``LightningModule`` optimized by Jacobian descent.
-- :doc:`Grouping <grouping>` shows how to apply an aggregator independently per parameter group
-  (e.g. per layer), so that conflict resolution happens at a finer granularity than the full
-  shared parameter vector.
 - :doc:`Automatic Mixed Precision <amp>` shows how to combine mixed precision training with TorchJD.

 .. toctree::
@@ -46,4 +43,3 @@ This section contains some usage examples for TorchJD.
    monitoring.rst
    lightning_integration.rst
    amp.rst
-   grouping.rst

src/torchjd/aggregation/_gradvac.py

Lines changed: 0 additions & 8 deletions
@@ -41,14 +41,6 @@ class GradVac(GramianWeightedAggregator):
         For each task :math:`i`, the order of other tasks :math:`j` is shuffled independently
         using the global PyTorch RNG (``torch.randperm``). Seed it with ``torch.manual_seed`` if
         you need reproducibility.
-
-    .. note::
-        To apply GradVac with per-layer or per-parameter-group granularity, create a separate
-        :class:`GradVac` instance for each group and call
-        :func:`~torchjd.autojac.jac_to_grad` once per group after
-        :func:`~torchjd.autojac.mtl_backward`. Each instance maintains its own EMA state,
-        matching the per-block targets :math:`\hat{\phi}_{ijk}` from the original paper. See
-        the :doc:`Grouping </examples/grouping>` example for details.
     """

     def __init__(self, beta: float = 0.5, eps: float = 1e-8) -> None:
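The note removed above describes calling an aggregator once per parameter group (e.g. per layer) rather than once on the full concatenated Jacobian. As a rough, dependency-free sketch of that blockwise pattern, not TorchJD's actual API: the `aggregate` function below is a simple mean stand-in, whereas real aggregators like GradVac additionally adjust gradients toward EMA cosine-similarity targets and would therefore need one stateful instance per group.

```python
# Hedged sketch of per-group ("blockwise") aggregation. The mean below is a
# stand-in aggregator; it is NOT GradVac, just an illustration of where the
# per-group calls happen.

def aggregate(jacobian):
    """Stand-in aggregator: average the per-task gradient rows of one block."""
    n_tasks = len(jacobian)
    n_params = len(jacobian[0])
    return [sum(row[k] for row in jacobian) / n_tasks for k in range(n_params)]

# Hypothetical per-task gradients for two parameter groups (e.g. two layers).
jac_layer1 = [[1.0, 2.0], [3.0, 4.0]]  # 2 tasks x 2 params
jac_layer2 = [[5.0], [7.0]]            # 2 tasks x 1 param

# Grouping: call the aggregator once per block, so conflict resolution
# happens per layer instead of on the full shared parameter vector. With a
# stateful aggregator such as GradVac, each block would get its own instance
# so that EMA state stays per-group, matching the removed note.
grads = [aggregate(jac) for jac in (jac_layer1, jac_layer2)]
print(grads)  # [[2.0, 3.0], [6.0]]
```

With a mean aggregator the blockwise and full-vector results coincide; the grouping only changes the outcome for aggregators whose output on one coordinate depends on the whole gradient vector, which is exactly why the granularity choice matters for methods like GradVac.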
