---
title: 'diffblas: algorithmically differentiated BLAS routines'
tags:
  - BLAS
  - automatic differentiation
  - numerical linear algebra
  - scientific computing
authors:
  - name: Sri Hari Krishna Narayanan^[corresponding author]
    orcid: 0000-0003-0388-5943
    affiliation: 1
  - name: Alexis Montoison
    orcid: 0000-0002-3403-5450
    affiliation: 1
  - name: Jean-Luc Bouchot
    orcid: 0000-0003-4523-3986
    affiliation: 2
affiliations:
  - name: Mathematics and Computer Science Division, Argonne National Laboratory, USA
    index: 1
  - name: Inria de Saclay, Palaiseau, France
    index: 2
date: 22 February 2026
bibliography: paper.bib
---

# Summary

`diffblas` is a library that provides algorithmically differentiated [@griewank2008] BLAS routines, generated from their reference implementations in [LAPACK](https://github.com/Reference-LAPACK/lapack) with the automatic differentiation tool Tapenade [@tapenade].
It supports four modes: forward (`_d`), vector forward (`_dv`), reverse (`_b`), and vector reverse (`_bv`).

In addition to differentiating the standard Fortran-style `BLAS` interface, `diffblas` also provides differentiated `CBLAS` routines, facilitating interoperability with C and other languages.
Its API mirrors BLAS/CBLAS, with additional arguments specifying the differentiation variables, making it straightforward to integrate into existing workflows.

`diffblas` calls the underlying standard `BLAS` implementation and is agnostic to the backend (OpenBLAS, BLIS, MKL, Apple Accelerate), ensuring both performance and portability.

By providing accurate and efficient derivatives of linear algebra operations, `diffblas` facilitates gradient-based optimization, sensitivity analysis, and a wide range of scientific computing applications.

# Statement of need

Linear algebra routines such as those in BLAS and LAPACK are fundamental to scientific computing, optimization, and machine learning.
However, they do not provide derivatives, which are often required by gradient-based algorithms.
Current approaches rely either on hand-coded derivatives or on generic automatic differentiation applied to high-level code, which can be inefficient or error-prone.
`diffblas` addresses this gap by providing algorithmically differentiated BLAS routines generated directly from the reference LAPACK implementations, following the relevant differentiation rules [@giles2008].
This enables accurate and efficient computation of derivatives while preserving compatibility with existing BLAS-based codes.
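
As an illustration of such rules, for the matrix product $C = AB$ the identities in [@giles2008] give, in forward and reverse mode respectively,
$$\dot{C} = \dot{A}B + A\dot{B}, \qquad \bar{A} = \bar{C}B^{\top}, \quad \bar{B} = A^{\top}\bar{C},$$
where dots denote input tangents and bars denote adjoints.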

# State of the field

Automatic differentiation (AD) tools such as Tapenade [@tapenade], ADOL-C [@ADOLC], or TAF provide general mechanisms for computing derivatives of code.
However, applying AD naively to low-level BLAS or LAPACK routines can be inefficient because of loops, memory layout, and caching issues [@jonasson2020].
Specialized libraries like `diffblas` that generate differentiated routines directly from the reference implementations combine the reliability of LAPACK with the efficiency of AD, bridging a gap in current scientific computing workflows.

# Research impact statement

This work was inspired in part by the need to differentiate a Fortran code [@HFBTHO] that uses BLAS and LAPACK routines, and to use the differentiated application for gradient-based optimization.

Providing both the standard and CBLAS interfaces ensures that `diffblas` can be adopted across different programming environments, facilitating derivative computations in diverse scientific computing projects.
Precompiled artifacts on GitHub further simplify integration, enabling rapid deployment in multiple languages and projects.

# Acknowledgements

This work was supported in part by the Applied Mathematics activity within the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research Applied Mathematics, and Office of Nuclear Physics SciDAC program under Contract No. DE-AC02-06CH11357. This work was supported in part by NSF CSSI grant 2104068.

# AI usage disclosure

ChatGPT was used to check the spelling, grammar, and clarity of the English text in this paper.

# References