In an algorithm‑centric world, the “measurement devices” are complex,
evolving data‑processing codes rather than static laboratory
instruments. In this setting, the classical [GUM](https://doi.org/10.59161/JCGMGUM-1-2023)
equations, which assume a fixed analytical model, a fixed data flow,
and analytical Jacobians, offer limited practical help: the true forward
map is the current state of the code, and this changes as algorithms,
implementations, and dependencies evolve. Algorithmic differentiation
provides a better foundation because it derives local linearizations
directly from the implementation whenever needed, so sensitivity
information automatically stays consistent with the code. Combined with
random sampling methods for strongly nonlinear behaviour, this enables
uncertainty propagation to be defined in terms of algorithmically
differentiable programs. This framework treats inputs, outputs, and
uncertainties as tensor‑valued objects rather than forcing everything
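As a minimal sketch of this idea (the forward map `f`, the input covariance, and the sample count below are invented for illustration, not taken from any particular framework), JAX can derive the local linearization directly from the code and combine it with random sampling:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Stand-in for a data-processing pipeline: the "model" is just code.
    return jnp.array([x[0] * x[1], jnp.sin(x[0]) + x[1] ** 2])

x = jnp.array([1.0, 2.0])                   # best estimates of the inputs
cov_x = jnp.diag(jnp.array([0.01, 0.04]))   # assumed input covariance

# Linear (GUM-style) propagation J Cov J^T, with the Jacobian J obtained
# from the current implementation of f rather than a hand-derived model.
J = jax.jacfwd(f)(x)
cov_y_linear = J @ cov_x @ J.T

# Monte Carlo propagation as a cross-check for strongly nonlinear behaviour.
key = jax.random.PRNGKey(0)
samples = jax.random.multivariate_normal(key, x, cov_x, shape=(20_000,))
cov_y_mc = jnp.cov(jax.vmap(f)(samples).T)
```

If `f` changes, `jax.jacfwd(f)` automatically reflects the new code, which is the consistency property the text describes; the Monte Carlo estimate reveals where the linearization stops being adequate.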