Commit 511fce3 (1 parent: 1f85a8c)

Revise README for clarity and add GUM reference

Updated the README to include a reference to GUM and improved clarity on algorithmic differentiation and uncertainty propagation.

1 file changed: README.md (9 additions & 9 deletions)
@@ -4,15 +4,15 @@
 
 In an algorithm‑centric world, the “measurement devices” are complex,
 evolving data‑processing codes rather than static laboratory
-instruments. In this setting, the classical GUM equations, which assume
-a fixed analytical model, a fixed data flow, and analytical Jacobians,
-offer limited practical help: the true forward map is the current state
-of the code, and this changes as algorithms, implementations,
-and dependencies evolve. Algorithmic differentiation provides a better
-foundation because it derives local linearizations directly from the
-implementation whenever needed, so sensitivity information automatically
-stays consistent with the code. Combined with random sampling and
-related numerical methods for strongly nonlinear behaviour, this enables
+instruments. In this setting, the classical [GUM](https://doi.org/10.59161/JCGMGUM-1-2023)
+equations, which assume a fixed analytical model, a fixed data flow,
+and analytical Jacobians, offer limited practical help: the true forward
+map is the current state of the code, and this changes as algorithms,
+implementations, and dependencies evolve. Algorithmic differentiation
+provides a better foundation because it derives local linearizations
+directly from the implementation whenever needed, so sensitivity
+information automatically stays consistent with the code. Combined with
+random sampling methods for strongly nonlinear behaviour, this enables
 uncertainty propagation to be defined in terms of algorithmically
 differentiable programs. This framework treats inputs, outputs, and
 uncertainties as tensor‑valued objects rather than forcing everything
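The propagation scheme the revised paragraph describes can be sketched in a few lines of numpy. This is a minimal illustration, not code from the repository: the measurement function `f`, the input covariance, and the helper `jacobian_fd` are all hypothetical, and central finite differences stand in for algorithmic differentiation so the sketch stays dependency-free. The idea is the same: the local linearization is derived from the current implementation of the code, and a Monte Carlo pass covers strongly nonlinear behaviour.

```python
import numpy as np

# Toy "measurement algorithm": any program mapping input tensors to
# output tensors. Here: magnitude and phase from two noisy readings.
def f(x):
    return np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])

def jacobian_fd(func, x, eps=1e-6):
    """Local linearization of the *current* implementation of func,
    estimated by central finite differences (a stand-in for AD)."""
    x = np.asarray(x, dtype=float)
    cols = []
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        cols.append((func(x + dx) - func(x - dx)) / (2 * eps))
    return np.stack(cols, axis=1)

x_mean = np.array([3.0, 4.0])
x_cov = np.diag([0.01, 0.04])      # input uncertainty as a covariance

# GUM-style linear propagation: U_y = J U_x J^T, with J taken from the
# code itself rather than from a fixed analytical model.
J = jacobian_fd(f, x_mean)
y_cov_lin = J @ x_cov @ J.T

# Random-sampling propagation for strongly nonlinear behaviour.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(x_mean, x_cov, size=100_000)
y_cov_mc = np.cov(np.apply_along_axis(f, 1, samples), rowvar=False)
```

Because `J` is recomputed from `f` on every call, the sensitivity information stays consistent with the code as it evolves; for this mildly nonlinear `f`, the linear and Monte Carlo covariances agree closely.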
