In a discussion with my friend @skieninger about the speed of sparse linear solvers in Matlab versus Python for solving a large system of equations (around 1e6 unknowns), we compared Matlab's backslash solver with scipy's SuperLU and observed a huge difference (Matlab being much faster). This is because SuiteSparse and its multifrontal solver (UMFPACK) are not installed by default with scipy. A package is available here, but it is a pain to install on Windows and even on WSL (or at least I had difficulties getting it to work with all the recent changes in numpy and scipy). When it works, though, it makes a huge difference. PyPardiso did not work for me either, failing with system and memory errors. I also noticed that switching between the different existing scipy solvers can make a significant difference.
The problem is very well documented here.
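As a quick illustration of the two knobs mentioned above, here is a minimal sketch using scipy's `spsolve` on a toy tridiagonal system (a stand-in for the real FV matrix): `permc_spec` selects the fill-reducing ordering used by SuperLU, and `use_umfpack=True` dispatches to UMFPACK when scikit-umfpack is installed, silently falling back to SuperLU otherwise.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Toy tridiagonal system standing in for the large FV matrix
n = 1000
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Force the SuperLU path; permc_spec changes the fill-reducing ordering
# (options: 'NATURAL', 'MMD_ATA', 'MMD_AT_PLUS_A', 'COLAMD')
x_superlu = spsolve(A, b, permc_spec="MMD_AT_PLUS_A", use_umfpack=False)

# Dispatch to UMFPACK if scikit-umfpack is available,
# otherwise this silently falls back to SuperLU
x_umf = spsolve(A, b, use_umfpack=True)
```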
To address this problem, I suggest the following list of tasks/objectives:
- A notebook introducing some of the solvers suitable for large problems
- A description of a workflow for iterative solvers and preconditioners
- Possibly adding more sparse solvers that are more efficient than scipy's default (provided as optional packages)
Here's the script I am using to test the solvers:
import pyfvtool as pf
import numpy as np
import time
start_time = time.time()
# Solving a diffusion equation on a 3D grid with a fixed
# concentration at the left boundary and closed (no-flux)
# boundaries on the other sides
# Calculation parameters
Nx = 20 # number of finite volume cells per direction
Lx = 1.0 # [m] length of the domain
c_left = 1.0 # left boundary concentration
c_init = 0.0 # initial concentration
D_val = 1e-5 # diffusion coefficient (gas phase)
t_simulation = 120.0 # [s] simulation time
dt = 60.0 # [s] time step
# Define mesh
mesh = pf.Grid3D(Nx, Nx, Nx, Lx, Lx, Lx)
# Create a cell variable with initial concentration
# By default, 'no flux' boundary conditions are applied
c = pf.CellVariable(mesh, c_init)
# Switch the left boundary to Dirichlet: fixed concentration
c.BCs.left.a[:] = 0.0
c.BCs.left.b[:] = 1.0
c.BCs.left.c[:] = c_left
c.apply_BCs()
# Assign diffusivity to cells
D_cell = pf.CellVariable(mesh, D_val)
D_face = pf.geometricMean(D_cell) # average value of diffusivity at the interfaces between cells
Mt, RHSt = pf.transientTerm(c, dt, 1.0)
Mbc, RHSbc = pf.boundaryConditionsTerm(c.BCs)
Md = pf.diffusionTerm(D_face)
M = Mt - Md + Mbc
RHS = RHSt + RHSbc
c_new = pf.solveMatrixPDE(mesh, M, RHS)
# Print the execution time
print("--- %s seconds ---" % (time.time() - start_time))
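As a complement to the pyfvtool script, the same kind of timing can be done on a bare scipy system, which makes it easier to compare solvers in isolation. A sketch (the matrix is a placeholder, and the solver list is just the ones discussed above):

```python
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def benchmark(A, b):
    """Time a few scipy solvers on the same sparse system (sketch)."""
    results = {}

    t0 = time.perf_counter()
    spla.spsolve(A, b)                      # direct solve via SuperLU
    results["spsolve (SuperLU)"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    lu = spla.splu(A)                       # factorize once...
    lu.solve(b)                             # ...then back-substitute
    results["splu + solve"] = time.perf_counter() - t0

    t0 = time.perf_counter()
    x_it, info = spla.bicgstab(A, b)        # unpreconditioned iterative solve
    results["bicgstab"] = time.perf_counter() - t0

    return results

# Small 2D Laplacian as a placeholder for the FV matrix
n = 50
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(A.shape[0])

results = benchmark(A, b)
print(results)
```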