Hi guys, thanks for this library. I've been using it myself and ran into a bug. I'm not sure whether this is PETSc.jl's fault, so I had Claude draft me an issue; I'm sure you'll be able to determine whether this is really PETSc.jl's fault or whether I am to blame!
PETSc module initialization occurs before MPI can be initialized, causing segfault
Description
When using PETSc.jl in an MPI environment, the module's `__init__()` function runs during `using PETSc`, which occurs before user code can call `MPI.Init()`. This causes segfaults because PETSc's initialization tries to access MPI functionality before MPI is properly initialized.
This is particularly problematic when PETSc.jl is loaded as a dependency of another package, as users have no control over when `using PETSc` executes.
Environment
- Julia version: 1.11.7
- PETSc.jl version: 0.3.1
- MPI.jl version: 0.20.23
- OS: macOS (Darwin 25.1.0)
Minimal Reproduction
File: `test_petsc_segfault.jl`

```julia
using MPI

# This is the problematic sequence:
# 1. Load PETSc BEFORE calling MPI.Init()
using PETSc  # <-- PETSc.__init__() runs here, tries to access MPI

# 2. Initialize MPI (too late!)
if !MPI.Initialized()
    MPI.Init()
end

# 3. Initialize PETSc
if !PETSc.initialized(PETSc.petsclibs[1])
    PETSc.initialize()
end

println("If we got here, it worked!")
```

Run with:

```sh
mpiexec -n 2 julia test_petsc_segfault.jl
```
Expected result: Program runs without error
Actual result: Segmentation fault
Root Cause
The execution sequence is:
1. `mpiexec` launches Julia → the MPI environment is set up by `mpiexec`
2. User code: `using MPI`
3. User code: `using PETSc`; PETSc.jl's `__init__()` runs here, potentially accessing MPI before it's initialized (a quick check below confirms the ordering)
4. User code: `MPI.Init()` is called (too late!)
5. User code: `PETSc.initialize()` is called
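The ordering is easy to verify with MPI.jl's documented `MPI.Initialized()` query; the snippet below is just a diagnostic sketch:

```julia
using MPI

# Under mpiexec the launcher has set up the environment, but MPI_Init
# has not yet been called from Julia, so this prints false:
@show MPI.Initialized()

# PETSc.__init__() therefore runs while MPI is still uninitialized:
using PETSc
```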
Current Workaround
The only workaround is to ensure `MPI.Init()` is called before `using PETSc`:

File: `test_petsc_workaround.jl`

```julia
using MPI

# WORKAROUND: Initialize MPI BEFORE loading PETSc
if !MPI.Initialized()
    MPI.Init()
end

# Now safe to load PETSc
using PETSc

# Initialize PETSc
if !PETSc.initialized(PETSc.petsclibs[1])
    PETSc.initialize()
end

println("Success with workaround!")
```
Impact
This issue makes it nearly impossible to use PETSc.jl as a dependency in other packages, because:
- When a user does `using MyPackage` (a package that depends on PETSc.jl, sketched below), the `using PETSc` happens during module loading
- This occurs before the user's code can call `MPI.Init()`
- Result: an immediate segfault
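For concreteness, here is a minimal sketch of such a downstream package; `MyPackage` is hypothetical and purely illustrative:

```julia
# Hypothetical downstream package, for illustration only.
module MyPackage

# PETSc.__init__() runs the moment MyPackage is loaded; the end user
# never gets a chance to call MPI.Init() before this line executes.
using PETSc

end # module
```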
Suggested Fix
PETSc.jl's `__init__()` function should avoid accessing MPI functionality, or should check whether MPI is initialized first. The actual PETSc initialization should be deferred to an explicit call to `PETSc.initialize()`, which users already call after `MPI.Init()`.
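As a rough sketch of the first option (illustrative only, not PETSc.jl's actual internals), `__init__()` could guard any MPI-dependent setup:

```julia
# Sketch only, not PETSc.jl's actual __init__.
function __init__()
    if MPI.Initialized()
        # MPI is already up, so MPI-dependent setup is safe here.
    end
    # Otherwise do nothing: defer all real PETSc setup to an explicit
    # PETSc.initialize() call, made by the user after MPI.Init().
end
```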
Alternatively, document that PETSc.jl may only be loaded after `MPI.Init()` has been called, though this would force significant restructuring of downstream code that depends on PETSc.jl, since package authors cannot control when `using PETSc` runs.