1 parent ac79141 commit f81780a
1 file changed
README.md
@@ -12,8 +12,8 @@ algorithms, implementations, and dependencies evolve. Algorithmic
 differentiation (AD) provides a better foundation because it derives local
 linearizations directly from the implementation whenever needed, so sensitivity
 information automatically stays consistent with the code. Combined with random
-sampling methods for strongly nonlinear behaviour, this defines uncertainty
-propagation in terms of algorithmically differentiable programs. AD frameworks
+sampling methods for strongly nonlinear behaviour, uncertainty propagation
+is defined in terms of algorithmically differentiable programs. AD frameworks
 treat inputs, outputs, sensitivities, and uncertainties as dynamic tensor‑valued
 objects rather than forcing the uncertainty calculus into a fixed set of
 closed‑form formulas.
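The approach described in this paragraph — local linearizations taken from the implementation by AD, with random sampling as a check for strongly nonlinear behaviour — can be sketched as follows. This is a minimal illustration, not code from this repository: the model `f`, the input covariance, and the choice of JAX as the AD framework are all assumptions.

```python
import jax
import jax.numpy as jnp

# Hypothetical model (not from this repository): maps two inputs to two outputs.
def f(x):
    return jnp.array([x[0] * x[1], jnp.sin(x[0])])

x = jnp.array([1.0, 2.0])                       # nominal input values
cov_x = jnp.diag(jnp.array([0.01, 0.04]))       # input covariance (assumed)

# First-order uncertainty propagation: Sigma_y = J Sigma_x J^T, where the
# Jacobian J is derived directly from the implementation of f by AD.
J = jax.jacobian(f)(x)
cov_y = J @ cov_x @ J.T

# Random sampling as a cross-check for strongly nonlinear behaviour:
# draw inputs from the assumed distribution and estimate the output covariance.
key = jax.random.PRNGKey(0)
samples = jax.random.multivariate_normal(key, x, cov_x, shape=(20000,))
cov_mc = jnp.cov(jax.vmap(f)(samples), rowvar=False)
```

For small input variances the sampled covariance `cov_mc` should approximately agree with the linearized `cov_y`; a large discrepancy signals that the first-order approximation is breaking down and the sampling estimate should be preferred.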