1 parent f81780a commit b5be280
1 file changed
README.md
@@ -13,7 +13,7 @@ differentiation (AD) provides a better foundation because it derives local
 linearizations directly from the implementation whenever needed, so sensitivity
 information automatically stays consistent with the code. Combined with random
 sampling methods for strongly nonlinear behaviour, uncertainty propagation
-is defined in terms of algorithmically differentiable programs. AD frameworks
+can be defined in terms of algorithmically differentiable programs. AD frameworks
 treat inputs, outputs, sensitivities, and uncertainties as dynamic tensor‑valued
 objects rather than forcing the uncertainty calculus into a fixed set of
 closed‑form formulas.
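The idea in the changed paragraph, that AD derives local linearizations directly from the implementation and so first-order uncertainty propagation needs no hand-written sensitivity formulas, can be sketched with a minimal forward-mode AD dual-number class. This is an illustrative sketch only, not the project's actual API; the names `Dual`, `propagate`, and `model` are hypothetical.

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and df/dx together."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule gives the derivative of the product
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def propagate(f, x, sigma_x):
    """First-order uncertainty propagation: sigma_y = |f'(x)| * sigma_x,
    with f'(x) obtained by AD rather than a hand-derived formula."""
    y = f(Dual(x, 1.0))  # seed the input derivative with 1
    return y.val, abs(y.der) * sigma_x

# the model is ordinary code; AD linearizes it wherever it is evaluated
def model(x):
    return x * x + 3 * x

y, sigma_y = propagate(model, 2.0, 0.1)
print(y, sigma_y)  # y ≈ 10.0, sigma_y ≈ 0.7 (since f'(2) = 7)
```

Because the linearization is recomputed from the code at each evaluation point, the sensitivity stays consistent with the implementation even as the model changes, which is the property the paragraph above emphasizes.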