Commit 7a83fd6

1 parent 75f3dc6 commit 7a83fd6

File tree

2 files changed (+11, −5 lines)


latest/docs/autojac/jac_to_grad/index.html

Lines changed: 10 additions & 4 deletions
@@ -296,7 +296,16 @@ <h1>jac_to_grad<a class="headerlink" href="#jac-to-grad" title="Link to this hea
 <dl class="py function">
 <dt class="sig sig-object py" id="torchjd.autojac.jac_to_grad">
 <span class="sig-prename descclassname"><span class="pre">torchjd.autojac.</span></span><span class="sig-name descname"><span class="pre">jac_to_grad</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">tensors</span></span></em>, <em class="sig-param"><span class="positional-only-separator o"><abbr title="Positional-only parameter separator (PEP 570)"><span class="pre">/</span></abbr></span></em>, <em class="sig-param"><span class="n"><span class="pre">aggregator</span></span></em>, <em class="sig-param"><span class="keyword-only-separator o"><abbr title="Keyword-only parameters separator (PEP 3102)"><span class="pre">*</span></abbr></span></em>, <em class="sig-param"><span class="n"><span class="pre">retain_jac</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">optimize_gramian_computation</span></span><span class="o"><span class="pre">=</span></span><span class="default_value"><span class="pre">False</span></span></em><span class="sig-paren">)</span><a class="reference external" href="https://github.com/SimplexLab/TorchJD/blob/main/src/torchjd/autojac/_jac_to_grad.py#L47-L146"><span class="viewcode-link"><span class="pre">[source]</span></span></a><a class="headerlink" href="#torchjd.autojac.jac_to_grad" title="Link to this definition"></a></dt>
-<dd><p>Aggregates the Jacobians stored in the <code class="docutils literal notranslate"><span class="pre">.jac</span></code> fields of <code class="docutils literal notranslate"><span class="pre">tensors</span></code> and accumulates the result
+<dd><dl class="field-list simple">
+<dt class="field-odd">Overloads<span class="colon">:</span></dt>
+<dd class="field-odd"><ul class="simple">
+<li><p><strong>tensors</strong> (<span class="sphinx_autodoc_typehints-type">Iterable[Tensor]</span>), <strong>aggregator</strong> (<span class="sphinx_autodoc_typehints-type">GramianWeightedAggregator</span>), <strong>retain_jac</strong> (<span class="sphinx_autodoc_typehints-type">bool</span>), <strong>optimize_gramian_computation</strong> (<span class="sphinx_autodoc_typehints-type">bool</span>) → <span class="sphinx_autodoc_typehints-type">Tensor</span></p></li>
+<li><p><strong>tensors</strong> (<span class="sphinx_autodoc_typehints-type">Iterable[Tensor]</span>), <strong>aggregator</strong> (<span class="sphinx_autodoc_typehints-type">WeightedAggregator</span>), <strong>retain_jac</strong> (<span class="sphinx_autodoc_typehints-type">bool</span>) → <span class="sphinx_autodoc_typehints-type">Tensor</span></p></li>
+<li><p><strong>tensors</strong> (<span class="sphinx_autodoc_typehints-type">Iterable[Tensor]</span>), <strong>aggregator</strong> (<span class="sphinx_autodoc_typehints-type">Aggregator</span>), <strong>retain_jac</strong> (<span class="sphinx_autodoc_typehints-type">bool</span>) → <span class="sphinx_autodoc_typehints-type">None</span></p></li>
+</ul>
+</dd>
+</dl>
+<p>Aggregates the Jacobians stored in the <code class="docutils literal notranslate"><span class="pre">.jac</span></code> fields of <code class="docutils literal notranslate"><span class="pre">tensors</span></code> and accumulates the result
 into their <code class="docutils literal notranslate"><span class="pre">.grad</span></code> fields.</p>
 <dl class="field-list simple">
 <dt class="field-odd">Parameters<span class="colon">:</span></dt>
@@ -315,9 +324,6 @@ <h1>jac_to_grad<a class="headerlink" href="#jac-to-grad" title="Link to this hea
 advise to try this optimization if memory is an issue for you. Defaults to <code class="docutils literal notranslate"><span class="pre">False</span></code>.</p></li>
 </ul>
 </dd>
-<dt class="field-even">Return type<span class="colon">:</span></dt>
-<dd class="field-even"><p><span class="sphinx_autodoc_typehints-type"><a class="reference external" href="https://docs.pytorch.org/docs/stable/tensors.html#torch.Tensor" title="(in PyTorch v2.10)"><code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code></a> | <a class="reference external" href="https://docs.python.org/3/library/constants.html#None" title="(in Python v3.14)"><code class="xref py py-obj docutils literal notranslate"><span class="pre">None</span></code></a></span></p>
-</dd>
 </dl>
 <div class="admonition note">
 <p class="admonition-title">Note</p>
