
Commit cafa91a

Merge pull request #75 from acgetchell/release/v0.4.0
chore(release): release v0.4.0
2 parents 3951280 + 843bb3c commit cafa91a

8 files changed

Lines changed: 101 additions & 99 deletions


CHANGELOG.md

Lines changed: 4 additions & 2 deletions
@@ -5,7 +5,7 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
-## [Unreleased]
+## [0.4.0] - 2026-04-11
 
 ### Added
 
@@ -18,6 +18,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Refine benchmark comparison reporting and documentation [`e1b5955`](https://github.com/acgetchell/la-stack/commit/e1b5955fb5024232e34e9df9701bb24fb98efa15)
 - Expand test coverage for benchmark comparison edge cases [`bced7d9`](https://github.com/acgetchell/la-stack/commit/bced7d988bd2f42cb6bb5af9c54dabdcf787a5fc)
 - Update documentation and tests for integer-only Bareiss [`2ee3f05`](https://github.com/acgetchell/la-stack/commit/2ee3f05caecfdf1a23b61257a7465b3bb6d63614)
+- Restrict benchmark baselines to main and improve reporting [`9a7caa2`](https://github.com/acgetchell/la-stack/commit/9a7caa241b5f476c2659772f3189468b10fcba2e)
 
 ### Maintenance
 
@@ -33,6 +34,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - Bump codecov/codecov-action from 5.5.3 to 6.0.0 [`b3e1380`](https://github.com/acgetchell/la-stack/commit/b3e1380e0b8df85478648036eeb1d9b2c79aaac5)
 - Bump taiki-e/install-action from 2.70.1 to 2.73.0 [`7de720d`](https://github.com/acgetchell/la-stack/commit/7de720dfb8328d01843cfabd15d23086ee98832b)
 - Bump astral-sh/setup-uv from 7.6.0 to 8.0.0 [`af40753`](https://github.com/acgetchell/la-stack/commit/af40753130fac56d48da7fce2f18b11dc391ebe6)
+- Add performance regression detection for exact-arithmetic benchmarks [`44bce99`](https://github.com/acgetchell/la-stack/commit/44bce99bcae8f852dbf7500e71fa182730e08bca)
 
 ### Performance
 
@@ -226,7 +228,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 - Add tarpaulin coverage upload [`7486dfd`](https://github.com/acgetchell/la-stack/commit/7486dfd54e16a6dbde41575c3f35a1acb65f57d2)
 
-[unreleased]: https://github.com/acgetchell/la-stack/compare/v0.3.0..HEAD
+[0.4.0]: https://github.com/acgetchell/la-stack/compare/v0.3.0..v0.4.0
 [0.3.0]: https://github.com/acgetchell/la-stack/compare/v0.2.2..v0.3.0
 [0.2.2]: https://github.com/acgetchell/la-stack/compare/v0.2.1..v0.2.2
 [0.2.1]: https://github.com/acgetchell/la-stack/compare/v0.2.0..v0.2.1

CITATION.cff

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@ cff-version: 1.2.0
 message: "If you use this software, please cite it as below."
 type: software
 title: "la-stack: Fast, stack-allocated linear algebra for fixed dimensions in Rust"
-version: 0.3.0
+version: 0.4.0
 url: "https://github.com/acgetchell/la-stack"
 repository-code: "https://github.com/acgetchell/la-stack"
 identifiers:
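The version bump above has to land consistently in Cargo.toml, CITATION.cff, and the README in the same commit. A minimal sketch of a release-hygiene check (a hypothetical helper, not part of the la-stack repo; file contents are inlined here for illustration):

```python
# Hypothetical release-hygiene check: confirm the version string agrees
# across the files a release commit touches. Not repo code; the inputs
# are inlined stand-ins for Cargo.toml and CITATION.cff.
import re

cargo_toml = '''[package]
name = "la-stack"
version = "0.4.0"
'''

citation_cff = '''cff-version: 1.2.0
type: software
version: 0.4.0
'''

def extract_version(text: str) -> str:
    """Return the first `version = "X.Y.Z"` or `version: X.Y.Z` declaration."""
    # ^version anchors at line start, so keys like cff-version or
    # rust-version are ignored.
    m = re.search(r'^version\s*[:=]\s*"?(\d+\.\d+\.\d+)"?', text, re.MULTILINE)
    if m is None:
        raise ValueError("no version declaration found")
    return m.group(1)

versions = {extract_version(cargo_toml), extract_version(citation_cff)}
print(versions)  # {'0.4.0'} -- a single element means the bump is consistent
```

A real check would read the files from disk and also grep the README's dependency snippets.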

Cargo.lock

Lines changed: 1 addition & 1 deletion
Cargo.lock is a generated file; its diff is not rendered by default.

Cargo.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [package]
 name = "la-stack"
-version = "0.3.0"
+version = "0.4.0"
 edition = "2024"
 rust-version = "1.94"
 license = "BSD-3-Clause"

README.md

Lines changed: 10 additions & 10 deletions
@@ -57,7 +57,7 @@ Add this to your `Cargo.toml`:
 
 ```toml
 [dependencies]
-la-stack = "0.3.0"
+la-stack = "0.4.0"
 ```
 
 Solve a 5×5 system via LU:
@@ -143,7 +143,7 @@ rationals (this pulls in `num-bigint`, `num-rational`, and `num-traits` for
 
 ```toml
 [dependencies]
-la-stack = { version = "0.3.0", features = ["exact"] }
+la-stack = { version = "0.4.0", features = ["exact"] }
 ```
 
 **Determinants:**
@@ -289,14 +289,14 @@ Summary (median time; lower is better). The “la-stack vs nalgebra/faer” colu
 <!-- BENCH_TABLE:lu_solve:median:new:BEGIN -->
 | D | la-stack median (ns) | nalgebra median (ns) | faer median (ns) | la-stack vs nalgebra | la-stack vs faer |
 |---:|--------------------:|--------------------:|----------------:|---------------------:|----------------:|
-| 2 | 2.026 | 4.476 | 142.364 | +54.7% | +98.6% |
-| 3 | 15.718 | 23.857 | 191.028 | +34.1% | +91.8% |
-| 4 | 28.171 | 53.516 | 213.492 | +47.4% | +86.8% |
-| 5 | 47.595 | 72.861 | 287.763 | +34.7% | +83.5% |
-| 8 | 137.876 | 163.720 | 365.792 | +15.8% | +62.3% |
-| 16 | 609.456 | 594.194 | 910.985 | -2.6% | +33.1% |
-| 32 | 2,719.556 | 2,812.766 | 2,921.820 | +3.3% | +6.9% |
-| 64 | 17,776.557 | 14,083.938 | 12,541.345 | -26.2% | -41.7% |
+| 2 | 2.309 | 4.365 | 140.156 | +47.1% | +98.4% |
+| 3 | 18.331 | 22.706 | 181.074 | +19.3% | +89.9% |
+| 4 | 27.430 | 51.372 | 210.451 | +46.6% | +87.0% |
+| 5 | 53.819 | 70.722 | 276.064 | +23.9% | +80.5% |
+| 8 | 143.611 | 160.309 | 356.960 | +10.4% | +59.8% |
+| 16 | 611.393 | 580.793 | 871.704 | -5.3% | +29.9% |
+| 32 | 2,631.241 | 2,733.946 | 2,832.816 | +3.8% | +7.1% |
+| 64 | 17,233.345 | 14,112.678 | 12,164.571 | -22.1% | -41.7% |
 <!-- BENCH_TABLE:lu_solve:median:new:END -->
 
 ## 📄 License
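The "la-stack vs" columns in the table above appear to be the relative gap between medians, positive when la-stack is faster. A small sketch reproducing two entries from the updated data (formula inferred from the published numbers, not taken from the repo's tooling):

```python
# Relative-gap column as it appears to be derived in the README table:
# positive when la-stack's median is lower (faster), negative when higher.
# Inferred from the table values; not the repo's actual script.
def vs_pct(la_stack_ns: float, other_ns: float) -> float:
    return (other_ns - la_stack_ns) / other_ns * 100.0

# D=2, la-stack vs nalgebra: medians 2.309 ns and 4.365 ns
print(f"{vs_pct(2.309, 4.365):+.1f}%")          # +47.1%
# D=64, la-stack vs faer: medians 17233.345 ns and 12164.571 ns
print(f"{vs_pct(17233.345, 12164.571):+.1f}%")  # -41.7%
```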

docs/BENCHMARKING.md

Lines changed: 2 additions & 2 deletions
@@ -25,10 +25,10 @@ just bench-vs-linalg
 just bench-exact
 
 # Save an exact baseline (e.g., before optimising)
-just bench-save-baseline v0.3.0
+just bench-save-baseline v0.4.0
 
 # Compare current code against a saved baseline
-just bench-compare v0.3.0
+just bench-compare v0.4.0
 
 # Generate a snapshot without comparison
 just bench-compare
Lines changed: 8 additions & 8 deletions
@@ -1,9 +1,9 @@
 D,la_stack,la_lo,la_hi,nalgebra,na_lo,na_hi,faer,fa_lo,fa_hi
-2,2.0258758718048133,2.014534795046898,2.0337088481055914,4.476274388530518,4.468773998658985,4.486731305906561,142.36367198250207,141.82715200931858,143.21787638322655
-3,15.717669371675765,15.648950929023092,15.867428712733322,23.857261749124255,23.694320255654922,24.088176511317513,191.02830230753565,189.6244976294505,192.02929880197323
-4,28.171349365792665,28.1324478448672,28.229709543836528,53.515728177148674,53.34916975176522,53.77942386273234,213.4924047917039,212.38023954262556,214.5020685579196
-5,47.59503548589087,47.38801322788408,47.79285706741585,72.86136096441832,72.71078395288922,73.06408244387191,287.76335568245673,286.6382644567461,290.2384197600831
-8,137.87610457779525,137.2567887868679,138.23988300398284,163.72019468107788,162.28590020683725,165.54345017655672,365.79150403431754,364.6016750345311,366.73479988263375
-16,609.4564776739355,607.5162485653386,612.3944602465123,594.1937570340338,593.0004570514468,596.1405861684806,910.9849864426476,908.2138488534533,915.6392660912747
-32,2719.555966413325,2713.00270671728,2725.4250499879836,2812.7660040983606,2808.242909028018,2819.1458333333335,2921.819801830421,2919.528526414588,2928.77003109304
-64,17776.557416267944,17736.584051724138,17819.410263618192,14083.937823189666,14046.268052019077,14108.37641453246,12541.345378151262,12524.776339285716,12565.923475609756
+2,2.309084094214292,2.303508072051772,2.320556696513248,4.364518872374484,4.358154936085015,4.36964789592874,140.15575262034162,139.53334588117198,140.4495844508752
+3,18.33097673394614,18.273359518783604,18.361403406211608,22.705897034612143,22.564093728463128,22.91072188569287,181.0737171931479,180.38682460200042,181.70129206459293
+4,27.430461775038097,27.41326480258085,27.462758331001304,51.37175309721522,51.29468685146327,51.477167992692316,210.45129217481627,210.11591462124417,211.12191423331922
+5,53.8187502446859,53.70850909928479,53.85944288329581,70.72209914126024,70.58461080060759,70.80642561266758,276.0642099442515,275.26718428973214,277.08908382769255
+8,143.61107101616628,143.08566824669077,144.23973111184426,160.3091383596412,157.74277868119285,162.4765631219522,356.9597107252403,356.09333193008604,357.7579293922214
+16,611.393474121212,607.0554516008083,614.0988895347266,580.7933953429945,578.7449824562907,581.0957836912207,871.7039621195331,869.4837470449172,873.6488504064271
+32,2631.241121426466,2626.669908635426,2635.1795337149238,2733.9455793946936,2732.3185346416067,2737.926103179753,2832.8157108843534,2829.2559523809523,2837.7406015037595
+64,17233.344659711875,17191.217013729976,17341.914285714287,14112.678414368063,13965.669014084508,14162.695618153364,12164.571092017737,12137.462174452254,12181.512446567765
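Each CSV row carries a point estimate plus lower/upper bounds per library (`la_lo`/`la_hi`, and so on). The changelog mentions performance regression detection for benchmarks; a rough sketch of how a gate over data in this shape might look (assumed semantics and a made-up tolerance, not the repo's actual script):

```python
# Sketch of a benchmark regression gate. Assumptions: la_lo/la_hi bracket
# the la_stack median, and a run "regresses" when the new median exceeds
# the baseline's upper bound by more than a tolerance fraction. This only
# loosely mirrors the changelog's "performance regression detection".
import csv
import io

new_csv = """D,la_stack,la_lo,la_hi,nalgebra,na_lo,na_hi,faer,fa_lo,fa_hi
2,2.309,2.304,2.321,4.365,4.358,4.370,140.156,139.533,140.449
"""

# Index rows by dimension D, converting the remaining fields to floats.
rows = {int(r["D"]): {k: float(v) for k, v in r.items() if k != "D"}
        for r in csv.DictReader(io.StringIO(new_csv))}

def regressed(new_median: float, baseline_hi: float, tol: float = 0.05) -> bool:
    """Flag when the new median sits more than `tol` above the baseline's hi bound."""
    return new_median > baseline_hi * (1.0 + tol)

baseline_hi_d2 = 2.034  # la_hi for D=2 in the old (v0.3.0) data above
print(regressed(rows[2]["la_stack"], baseline_hi_d2))  # True: 2.309 > 2.034 * 1.05
```

Under these assumptions the D=2 change above (2.026 ns to 2.309 ns) would trip the gate, which is consistent with the release notes calling out benchmark-comparison work.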

0 commit comments
