
Commit 0601bd7

Enhance paper with derivative approaches and references
Added references to existing derivative approaches and their efficiency in linear algebra routines.
1 parent fdab3b2 commit 0601bd7

1 file changed

Lines changed: 4 additions & 2 deletions

File tree

paper/paper.md

@@ -42,13 +42,15 @@ By providing efficient and accurate derivatives of linear algebra operations, `diffblas`
 
 Linear algebra routines such as those in LAPACK are widely used in scientific computing, optimization, and machine learning. However, they do not provide derivatives, which are often required for gradient-based algorithms.
 
-Existing approaches rely on hand-coded derivatives or generic automatic differentiation applied to high-level code, which can be inefficient or error-prone [@jonasson2020].
-
 `diffblas` addresses this gap by providing algorithmically differentiated BLAS routines derived directly from reference LAPACK implementations, following the relevant differentiation rules [@giles2008].
 
 # State of the field
 
 Automatic source-to-source differentiation tools, such as Tapenade [@tapenade], ADOL-C [@ADOLC], or TAF [@TAF], provide general mechanisms to compute derivatives of code.
+
+[@jonasson2020] derives scalar reverse-mode derivative formulae for BLAS routines and provides their Fortran code. Such an approach is generally more efficient than differentiating through the BLAS routines with an AD tool [@jonasson2020]. It does not, however, cover the forward mode or vector modes.
+
+Derivatives of linear algebra routines are available in Python packages such as JAX [@jax] and PyTorch [@pytorch], but these implementations cannot be used outside their respective frameworks. Enzyme and Enzyme.jl perform optimized BLAS and cuBLAS differentiation at the MLIR level [@Enzyme], but the derivative code is not available externally.
 ...
 
 # Research impact statement
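To illustrate the kind of differentiation rule the paper refers to [@giles2008], here is a minimal NumPy sketch (hypothetical, not `diffblas` code) of the reverse-mode rule for matrix multiplication C = A @ B, namely Abar = Cbar @ B.T and Bbar = A.T @ Cbar, checked against a finite-difference approximation:

```python
# Hypothetical sketch (not diffblas code): Giles-style reverse-mode
# rule for C = A @ B, verified by central finite differences.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

def loss(A, B):
    # Scalar function of C = A @ B so gradients are well defined.
    return np.sum((A @ B) ** 2)

# Reverse-mode rules: with Cbar = dL/dC,
#   Abar = Cbar @ B.T,  Bbar = A.T @ Cbar.
Cbar = 2.0 * (A @ B)   # dL/dC for L = sum(C**2)
Abar = Cbar @ B.T
Bbar = A.T @ Cbar

# Finite-difference check of one entry of Abar.
eps = 1e-6
E = np.zeros_like(A)
E[1, 2] = 1.0
fd = (loss(A + eps * E, B) - loss(A - eps * E, B)) / (2 * eps)
assert abs(fd - Abar[1, 2]) < 1e-4
```

Hand-deriving such rules once, rather than differentiating through the triple loop of a GEMM implementation, is what makes the approaches of [@giles2008] and [@jonasson2020] efficient.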
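The forward (tangent) mode that [@jonasson2020] does not cover can be sketched the same way (again a hypothetical NumPy illustration, not `diffblas` code): the tangent of C = A @ B is Cdot = Adot @ B + A @ Bdot, by the product rule.

```python
# Hypothetical sketch: forward-mode (tangent) rule for C = A @ B,
# Cdot = Adot @ B + A @ Bdot, verified by central finite differences.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
Adot = rng.standard_normal((3, 4))   # tangent (direction) for A
Bdot = rng.standard_normal((4, 2))   # tangent (direction) for B

Cdot = Adot @ B + A @ Bdot           # forward-mode derivative

# Central difference along the tangent direction; the O(eps**2)
# terms cancel exactly for a bilinear map like matmul.
eps = 1e-6
fd = ((A + eps * Adot) @ (B + eps * Bdot)
      - (A - eps * Adot) @ (B - eps * Bdot)) / (2 * eps)
assert np.allclose(Cdot, fd, atol=1e-4)
```

Vector modes amount to propagating several such tangent (or adjoint) directions at once, which is why they batch naturally over BLAS calls.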
