
Commit 6fdbbf8

Refine README content on algorithmic differentiation
Updated terminology and improved clarity in the README.
1 parent 68586ac

1 file changed

Lines changed: 13 additions & 14 deletions

File tree

README.md

@@ -6,29 +6,28 @@ In an algorithm‑centric world, the “measurement devices” are complex,
 evolving data‑processing codes rather than static laboratory
 instruments. In this setting, the classical [GUM](https://doi.org/10.59161/JCGMGUM-1-2023)
 equations, which assume a fixed analytical model, a fixed data flow,
-and analytical Jacobians, offer limited practical help: the true forward
+and hand‑managed analytical Jacobians, offer limited practical help: the true forward
 map is the current state of the code, and this changes as algorithms,
-implementations, and dependencies evolve. Algorithmic differentiation
-provides a better foundation because it derives local linearizations
-directly from the implementation whenever needed, so sensitivity
-information automatically stays consistent with the code. Combined with
-random sampling methods for strongly nonlinear behaviour, this enables
-uncertainty propagation to be defined in terms of algorithmically
-differentiable programs. This framework treats inputs, outputs, and
-uncertainties as tensor‑valued objects rather than forcing everything
+implementations, and dependencies evolve. Algorithmic differentiation (AD)
+provides a better foundation because it derives local linearizations directly
+from the implementation whenever needed, so sensitivity information automatically
+stays consistent with the code. Combined with random sampling methods for strongly
+nonlinear behaviour, this enables uncertainty propagation to be defined in terms
+of algorithmically differentiable programs. AD frameworks treat inputs, outputs,
+and uncertainties as tensor‑valued objects rather than forcing the data processing
 into a fixed set of closed‑form formulas.
 
 The ideas presented here grew out of earlier project-specific implementations
-of algorithmic-differentiation-based uncertainty propagation for harmonised
-satellite calibration workflows underpinning fundamental climate data records.
+of AD-based uncertainty propagation for harmonised satellite calibration
+workflows underpinning fundamental climate data records.
 
 ## Synopsis
 
 **Uncertaintyx** is a lightweight framework for tensor‑level uncertainty
 propagation, fitting of empirical or physics-informed models, and
 metrology‑aware workflows. It produces uncertainty tensors by combining
-tensor‑valued models with algorithmic (a.k.a. automatic) differentiation
-backends such as [JAX](https://docs.jax.dev/). Conventional [NumPy](https://numpy.org)
+tensor‑valued models with AD backends such as [JAX](https://docs.jax.dev/).
+Conventional [NumPy](https://numpy.org)
 acts as a bidirectional interoperability layer, enabling JAX‑based code
 to interoperate smoothly with existing workflows.
 
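As an editorial illustration of the passage revised above: the propagation scheme the README describes can be sketched in a few lines of JAX. This is a minimal sketch with invented placeholders (`model`, `x`, and `u_x` are not from this repository), not the project's actual API:

```python
import jax
import jax.numpy as jnp

def model(x):
    # Hypothetical tensor-valued forward map standing in for an evolving
    # data-processing code; any JAX-traceable function would do here.
    return jnp.stack([x[0] * x[1], jnp.sin(x[2]), x[0] + x[2] ** 2])

x = jnp.array([1.0, 2.0, 0.5])                   # input estimate
u_x = jnp.diag(jnp.array([0.01, 0.04, 0.0025]))  # input covariance

# AD derives the local linearization (the Jacobian) from the
# implementation itself, so it tracks the current state of the code.
jac = jax.jacfwd(model)(x)

# GUM-style law of propagation of uncertainty: U_y = J U_x J^T.
u_y = jac @ u_x @ jac.T
```

Conventional NumPy arrays can pass into and out of such a computation (`jnp.asarray` / `np.asarray`), which is the bidirectional interoperability the synopsis refers to.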

@@ -47,7 +46,7 @@ with the tensor equation and code further below.
 or Monte Carlo often struggle with scalability for high-dimensional
 tensors, demanding extensive evaluations or approximations that compromise
 fidelity. Frameworks like JAX, facilitating GPUs and TPUs besides CPUs,
-make algorithmic differentiation a game changer, automatically generating
+make differentiation a game changer, automatically generating
 exact derivatives—even for complex, nonlinear models—at machine precision
 to produce Jacobians and Hessians seamlessly. This approach efficiently
 propagates full covariance structures while honouring spatiotemporal
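To illustrate the random-sampling complement the README pairs with AD for strongly nonlinear behaviour, the same setup admits a Monte Carlo cross-check of the linearized covariance. Again a hedged sketch with made-up placeholders (`model`, `x`, `u_x`), not repository code:

```python
import jax
import jax.numpy as jnp

def model(x):
    # Deliberately nonlinear toy map; a placeholder, not the project's code.
    return jnp.array([jnp.exp(x[0]) * x[1], x[1] ** 3])

x = jnp.array([0.2, 1.5])
u_x = jnp.array([[0.02, 0.005], [0.005, 0.03]])

# Linearized propagation from the AD Jacobian.
jac = jax.jacfwd(model)(x)
u_y_lin = jac @ u_x @ jac.T

# Random-sampling propagation for the strongly nonlinear regime.
key = jax.random.PRNGKey(0)
samples = jax.random.multivariate_normal(key, x, u_x, shape=(100_000,))
u_y_mc = jnp.cov(jax.vmap(model)(samples), rowvar=False)
```

The two estimates agree where the model is locally linear and diverge where curvature matters, which is exactly when the sampling route earns its keep.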
