### How to use autograd with Gauss–Newton (Enzyme.jl)
You do not need to hand-derive gradients or Hessians to use Gauss–Newton. With `Enzyme.jl`, you can obtain both automatically and plug them into the same in-place API shown above. In practice, this is typically faster and yields more numerically stable results than naïve hand-written derivatives. `Enzyme.jl` has some sharp edges; please consult the [Enzyme.jl documentation](https://enzymead.github.io/Enzyme.jl/stable/) before use.
```@example gaussnewton
using Enzyme
using BenchmarkTools
using Statistics  # for `mean` below
# 10) Define the mean log-likelihood for logistic regression
function obj(β::AbstractVector, X::AbstractMatrix, y::AbstractVector)
    Xβ = X * β
    return mean(y .* Xβ .- log.(1 .+ exp.(Xβ)))
end
# Reverse-mode gradient via Enzyme: accumulates ∇obj into `g` in place
function grad!(g, β, X, y)
    fill!(g, 0)
    Enzyme.autodiff(Reverse, obj, Active, Duplicated(β, g), Const(X), Const(y))
    return g
end

# Forward-over-reverse Hessian, assembled column-by-column from
# Hessian–vector products (see the Enzyme.jl docs on `hvp`)
function hess!(H, β, X, y)
    for j in axes(H, 2)
        v = zero(β)
        v[j] = 1
        H[:, j] .= Enzyme.hvp(b -> obj(b, X, y), β, v)
    end
    return H
end
```
On typical runs we observe a substantial speedup (often around 10×) for Enzyme while maintaining the same result.
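The comparison can be reproduced along the following lines. This is a sketch: the data sizes, the simulated data, and the hand-derived `grad_manual` baseline are illustrative assumptions, not the document's exact benchmark setup.

```julia
using Enzyme, BenchmarkTools, Statistics

# Mean log-likelihood for logistic regression (same form as `obj` above)
obj(β, X, y) = mean(y .* (X * β) .- log.(1 .+ exp.(X * β)))

# Hand-derived gradient for comparison: ∇ = X' * (y - σ(Xβ)) / n
grad_manual(β, X, y) = X' * (y .- 1 ./ (1 .+ exp.(-(X * β)))) ./ length(y)

# Simulated logistic-regression data (illustrative sizes)
X = randn(1_000, 5)
βtrue = randn(5)
y = Float64.(rand(1_000) .< 1 ./ (1 .+ exp.(-(X * βtrue))))

# Enzyme reverse-mode gradient, accumulated into `g`
β = zeros(5)
g = zero(β)
Enzyme.autodiff(Reverse, obj, Active, Duplicated(β, g), Const(X), Const(y))

# The two gradients should agree to machine precision
maximum(abs.(g .- grad_manual(β, X, y)))
```

To time the two, use `@btime` from BenchmarkTools, interpolating globals with `$` (e.g. `@btime grad_manual($β, $X, $y)`) so the timings are not dominated by global-variable overhead.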
### Projection with samples
The projection can also be performed from a set of samples instead of from the function directly. For example, let's project a set of samples onto a Beta distribution:
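For intuition, a sample-based fit to a Beta distribution can be sketched with moment matching. The `beta_from_samples` helper below is a hypothetical stand-in for illustration, not the package's projection routine:

```julia
using Statistics

# Method-of-moments fit of Beta(α, β) to samples in (0, 1):
# from m = α/(α+β) and v = αβ/((α+β)²(α+β+1)), one gets
# α+β = m(1-m)/v - 1, then α = m(α+β) and β = (1-m)(α+β).
function beta_from_samples(x::AbstractVector)
    m, v = mean(x), var(x)
    c = m * (1 - m) / v - 1
    return m * c, (1 - m) * c  # (α, β)
end

# Samples from U² with U ~ Uniform(0,1), which is exactly Beta(1/2, 1)
x = rand(100_000) .^ 2
α, β = beta_from_samples(x)  # should be close to (0.5, 1.0)
```

With enough samples the recovered parameters approach the true `(0.5, 1.0)`; moment matching is a quick sanity check, while the projection machinery described here minimizes a divergence instead.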