feat: add/fix PETSc KSP solver wrapper #894
Conversation
@ChrisRackauckas I think this is ready for review, but nothing urgent. Everything works for dense and sparse matrices with all the solvers and preconditioners, and with support for null spaces. I dropped the MPI support for now because the PR was getting too big to review, so it only works with 1 process for now; I am not sure LinearSolve.jl is a good fit for that anyway. The big limitation currently is that PETSc expects sparse matrices in CSR format, whereas Julia uses CSC. So, when reusing the cache for large-scale problems, the time is spent converting from CSC to CSR format. That is also what is currently done in the PETSc.jl package when initialising the matrix for the first time (see here). I could not come up with something cleaner, but happy to take suggestions. That currently makes it hard to use as a standalone backend for a nonlinear solver where we want to reuse the same cache, and people should just use SNES for that I guess. The PR needs JuliaParallel/PETSc.jl#227, so we need at least 0.4.6 for PETSc.jl.
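For reference, the conversion being discussed looks roughly like the sketch below. This is a generic way to obtain CSR data from a Julia `SparseMatrixCSC` (the CSC storage of the transpose is exactly the CSR storage of the original matrix); it is not necessarily the exact code path used by PETSc.jl or this PR:

```julia
using SparseArrays, LinearAlgebra

A  = sprand(1_000, 1_000, 0.01) + I   # Julia stores this in CSC
At = sparse(transpose(A))             # CSC of Aᵀ holds A's values row by row

# These three arrays are the (1-based) CSR representation of A:
rowptr = At.colptr   # row pointers
colval = At.rowval   # column indices within each row
nzval  = At.nzval    # nonzero values in row-major order
```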
It is. We have support in HYPRE https://github.com/SciML/LinearSolve.jl/blob/main/test/hypretests_mpi.jl, but agreed it probably works better as a separate PR as this is getting big.
Is there no SparseMatrixCSR that a user could define? I guess not. For now, let's get this in for correctness first. But yeah for benchmarking we probably want to have both versions of A and benchmark using the preferred form.
SNES benchmarks pretty poorly though 😅 hence why we really wanted the linear solvers directly.
Yeah oops haha, I saw "oh all the tests passed, wild" but I guess that's because I forgot to add a test group for it 😅 😅 😅 😅 😅 😅 😅 😅 😅 😅 😅 😅 😅 😅
Alright, that explains how the original PR was merged haha
There is no default SparseMatrixCSR, no. There is https://github.com/gridap/SparseMatricesCSR.jl, which I believe was developed specifically to work with PETSc for https://github.com/gridap/GridapPETSc.jl. Maybe we could force users to use that one, or at least dispatch on it to allow its usage.
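For illustration, constructing a CSR matrix with that package would look something like the sketch below (based on SparseMatricesCSR.jl's `sparsecsr` / `SparseMatrixCSR` constructors; whether LinearSolve.jl would dispatch on it is exactly what is being discussed here):

```julia
using SparseArrays, SparseMatricesCSR

I = [1, 2, 3, 3]
J = [1, 2, 1, 3]
V = [4.0, 5.0, 6.0, 7.0]

A_csr = sparsecsr(I, J, V, 3, 3)           # CSR analogue of sparse(I, J, V, 3, 3)

# Or convert an existing CSC matrix via its transpose:
A_csc = sparse(I, J, V, 3, 3)
B_csr = SparseMatrixCSR(transpose(A_csc))  # row-major storage, as PETSc expects
```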
We can at least start here.
It might at least be good to support that as an input, so the benchmarking is fairer.
Add ksp() support from PETSc.jl with the new version of the wrapper (0.4.x, JuliaParallel/PETSc.jl#215). Closes #8. I believe Claude was mostly hallucinating, because PR #873 was not working. On master, this is failing:
but works with this PR.
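(The failing snippet from the original description is not reproduced here.) For context, the kind of usage this wrapper enables looks roughly like the following sketch; the algorithm name PETScFactorization is assumed for illustration, so check the extension for the actual exported type:

```julia
using LinearSolve, SparseArrays, LinearAlgebra
using PETSc   # loading PETSc should activate the LinearSolve extension

A = sprand(200, 200, 0.05) + 10I
b = rand(200)

prob = LinearProblem(A, b)
sol  = solve(prob, PETScFactorization())   # algorithm name assumed
sol.u                                      # the solution vector
```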
What is currently supported
- transposed = true
- ksp_options::NamedTuple (for instance ksp_options = (ksp_monitor = "", ksp_view = "")); see the sketch below.
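As a rough illustration of how these options might be passed (the constructor keywords shown are assumed from the description above, not verified against the final API, and PETScFactorization is an assumed algorithm name):

```julia
using LinearSolve, SparseArrays, LinearAlgebra, PETSc

A = sprand(500, 500, 0.01) + 10I
b = rand(500)

# ksp_monitor / ksp_view are standard PETSc option names; empty strings enable them.
alg = PETScFactorization(ksp_options = (ksp_monitor = "", ksp_view = ""))
sol = solve(LinearProblem(A, b), alg)

# Solving with the transpose of A instead (keyword as listed above):
alg_t = PETScFactorization(transposed = true)
sol_t = solve(LinearProblem(A, b), alg_t)
```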
What I am not sure / not happy with:
- PETScExt.cleanup_petsc_cache!(cache) can be called to clear the PETSc cache after usage (works on a solve() or integ() object); a usage sketch follows after this list. Most people will clear the memory by closing the Julia session, but I think it makes sense to expose such a function.
- ~~Currently, the MPI implementation is a bit clunky/naive because every rank holds the full Julia matrix in memory and inserts its locally owned rows into PETSc. I could not come up with a better way because of the way LinearSolve.jl expects the input from the users. I am happy to remove the MPI support if that is not satisfying enough. Also needs some proper benchmarks.~~ I have removed the MPI support for more than 1 process because the PR was getting too big.
- The use of PETSc in general is mostly based on sparse matrices for large systems. Because of that, ~~the handling of dense matrices is a bit clunky and I could not make the reuse of ksp() work with how it is communicating with PETSc, and it currently has to rebuild it every time solve!() is called again~~ this works well for dense matrices now. However, this works correctly for sparse matrices. Could potentially be improved in another PR.
- ~~Not sure where to put the test files. There are petsctests_mpi.jl and petsctests.jl right now.~~ Removed petsctests_mpi.jl for now.
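A minimal sketch of the cache cleanup described above, assuming the algorithm type is called PETScFactorization and the extension module is named LinearSolvePETScExt (both names are assumptions; cleanup_petsc_cache! is the function mentioned in this PR):

```julia
using LinearSolve, SparseArrays, LinearAlgebra
using PETSc   # loads the LinearSolve PETSc extension

prob  = LinearProblem(sprand(100, 100, 0.05) + 10I, rand(100))
cache = init(prob, PETScFactorization())   # algorithm name assumed
sol   = solve!(cache)

# Free the PETSc-side objects held by the cache once we are done with it.
# The extension module name is assumed; adjust to the actual extension.
ext = Base.get_extension(LinearSolve, :LinearSolvePETScExt)
ext.cleanup_petsc_cache!(cache)
```

Checklist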
The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
AI Disclosure: Claude was used to help write the code and the comments + docstrings.