FEniCS.jl
get_array should return sparse matrix
At the moment, the get_array function returns a full Array{Float64,2}, which is not really a reasonable option for any practical use beyond toy problems. Can the get_array function be modified to return a SparseMatrixCSC{Float64,Int64} instead?
For reference, here's how I do the same on the FEniCS/SciPy interface:
```python
import scipy.sparse as sp

parameters['linear_algebra_backend'] = 'Eigen'

def assemble_csr(a, bc=None):
    A = assemble(a)
    if bc is not None:
        bc.apply(A)
    row, col, val = as_backend_type(A).data()
    return sp.csr_matrix((val, col, row))
```
I believe the same should work for FEniCS-to-Julia.
Yeah this is a more than reasonable request. Here it is in the code:
https://github.com/JuliaDiffEq/FEniCS.jl/blob/3c351b278eef5a710ce8fee3a3ba48508fce5119/src/jsolve.jl#L101-L119
I think it would be as simple as calling sparse on the PyCall results, similar to what you show, at this line:
https://github.com/JuliaDiffEq/FEniCS.jl/blob/3c351b278eef5a710ce8fee3a3ba48508fce5119/src/jsolve.jl#L103
Hello! Yes, it should be sparse. This is due to https://github.com/JuliaPy/PyCall.jl/issues/204 . I have code locally that can convert it to sparse (by converting the PETSc backend matrices to Scipy and then to Julia's CSC format); I can open a PR with it in a bit.
Exactly, the Julia equivalent to the last line is sparse(row,col,val). The trick is getting at the Eigen backend data using PyCall (note the updated snippet, I forgot that this doesn't work with the PETSc backend) -- I'm not familiar enough with the wrapping to comment on that. (I think this would be preferable to going via the Scipy route, which would add an unnecessary dependency.)
For PETSc matrices (assuming you do not want to (provide the option to) switch the backend to Eigen nor make use of the C++ PETSc interface), you could use a loop over all rows (A.size(1)) and use A.getrow(i) to build the row,col,val arrays. (See https://bitbucket.org/fenics-project/dolfin/pull-requests/45/implement-the-data-method-for-petscmatrix for the -- declined pull request that implements this in C++.)
Ideally, one should
- wrap size and getrow for the PETScMatrix
- implement the loop on the Julia side to build row,col,val
- call sparse
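The loop in the middle step could look roughly like this (shown in Python for brevity; `getrow` here is a stand-in for dolfin's `GenericMatrix.getrow`, which returns the column indices and values of one row -- the Julia version would end by calling `sparse(row, col, val)`):

```python
# Sketch of the getrow loop: build (row, col, val) triplets from a matrix
# that only exposes per-row access, as dolfin's GenericMatrix does.

def getrow(dense, i):
    """Stand-in for dolfin's A.getrow(i): (columns, values) of nonzeros in row i."""
    cols = [j for j, v in enumerate(dense[i]) if v != 0]
    vals = [dense[i][j] for j in cols]
    return cols, vals

def to_triplets(dense, nrows):
    row, col, val = [], [], []
    for i in range(nrows):
        cols, vals = getrow(dense, i)
        row.extend([i] * len(cols))  # same row index for every nonzero in the row
        col.extend(cols)
        val.extend(vals)
    return row, col, val

A = [[2.0, 0.0], [1.0, 3.0]]
print(to_triplets(A, 2))  # ([0, 1, 1], [0, 0, 1], [2.0, 1.0, 3.0])
```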
@clason thanks for that suggestion! I am currently doing the conversion in a very verbose way, as I wasn't too familiar with the different CSR/CSC formats being used. I'll try to implement your suggestion. For reference, the code currently looks as follows:
```julia
@pyimport scipy.sparse as sc
i, c, v = getValuesCSR(Pmat)
scsparse = sc.csr_matrix((v, c, i))
I, J, V = sc.find(scsparse)
return sparse(Int[i+1 for i in I], Int[j+1 for j in J], V) # 0-based Python to 1-based Julia
```
I shall investigate how to efficiently set the backend to Eigen using PyCall, and keep you informed. Edit: This turned out to be a one-line command even in Julia, so I can change it.
Is it difficult to implement this via the dolfin.PETScMatrix.mat method, which exposes the petsc4py object on which getValuesCSR can be called?
@MiroK I was not aware of the mat() function! Yes, if A is a dolfin.cpp.la.Matrix with PETSc backend,
```python
row, col, val = as_backend_type(A).mat().getValuesCSR()
A_csr = sp.csr_matrix((val, col, row))
```
works nicely in Python. This should be the easiest way. Thanks!
@MiroK that was how the values were accessed in the snippet of code: Pmat was an initialized PETSc matrix with the .mat method called. It still remains for me to see how to do the conversion from the CSR to the CSC format efficiently without using Scipy if possible, so I'll have to read up on both implementations.
Ah, I see the problem -- unlike the Eigen data(), PETSc's getValuesCSR does not return the (row,col,val) structure but the actual ind,indptr,val structure, which Julia's sparse doesn't understand (yet -- I seem to remember a long discussion in some issue). I'm sure it's possible (and not too difficult, cf. https://github.com/scipy/scipy/blob/v1.1.0/scipy/sparse/csc.py#L176 and https://github.com/scipy/scipy/blob/92c0eccb0650f7f68b7b909a7b62e9656efaa4ac/scipy/sparse/sparsetools/csr.h#L66) to implement the csc to coo conversion in Julia, but here's how you can do it via scipy:
```julia
using SparseArrays, PyCall
# create a demo matrix
@pyimport fenics as fe
mesh = fe.UnitSquareMesh(64, 64)
V = fe.FunctionSpace(mesh, "CG", 1)
u, v = fe.TrialFunction(V), fe.TestFunction(V)
a = fe.dot(fe.grad(u), fe.grad(v))*fe.dx
PETScMat = fe.assemble(a)
# convert PETSc to Python
indptr, indices, vals = fe.as_backend_type(PETScMat)[:mat]()[:getValuesCSR]()
@pyimport scipy.sparse as sp
PyMat = sp.csr_matrix((vals, indices, indptr))[:tocoo]() # yields COO matrix
# convert Python to Julia
JuliaMat = sparse(PyMat[:row] .+ 1, PyMat[:col] .+ 1, PyMat[:data])
```
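For completeness, the expansion scipy's tocoo() performs here is just unrolling indptr: the entries of row i sit in the slice indptr[i]:indptr[i+1], so the row index of each stored value can be recovered with a simple loop. A pure-Python sketch (not FEniCS.jl code; the same loop ports directly to Julia, with the 1-based shift applied afterwards):

```python
# Expand CSR (indptr, indices, data) into COO (row, col, val) by hand:
# row i owns the entries data[indptr[i]:indptr[i+1]].

def csr_to_coo(indptr, indices, data):
    row = []
    for i in range(len(indptr) - 1):
        row.extend([i] * (indptr[i + 1] - indptr[i]))  # repeat i once per stored entry
    return row, list(indices), list(data)

# 2x2 example: [[2, 1], [0, 3]]
indptr, indices, data = [0, 2, 3], [0, 1, 1], [2.0, 1.0, 3.0]
print(csr_to_coo(indptr, indices, data))  # ([0, 0, 1], [0, 1, 1], [2.0, 1.0, 3.0])
```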
I stand corrected; it's not in the docs, but SparseMatrixCSC does have a constructor for the standard structure. So you don't need SciPy at all and can instead just do
```julia
using SparseArrays, PyCall
# create a demo matrix
@pyimport fenics as fe
mesh = fe.UnitSquareMesh(64, 64)
V = fe.FunctionSpace(mesh, "CG", 1)
u, v = fe.TrialFunction(V), fe.TestFunction(V)
a = fe.dot(fe.grad(u), fe.grad(v))*fe.dx + fe.Dx(u, 1)*fe.dx # non-symmetric form
A_fe = fe.assemble(a) # wrapper for dolfin matrix
# convert to PETSc matrix, accessible via petsc4py
A_py = fe.as_backend_type(A_fe)[:mat]()
# convert to Julia matrix -- we feed the CSR structure of the m-by-n matrix to a
# CSC constructor, which yields the n-by-m transpose (hence the swapped sizes)
m, n = A_py[:getSize]()
indptr, indices, data = A_py[:getValuesCSR]()
A_jl = SparseMatrixCSC(n, m, indptr .+ 1, indices .+ 1, data)
# transpose to get the correct structure (note: the SparseMatrixCSC constructor
# is much more efficient than sparse)
A_jl = SparseMatrixCSC(A_jl')
```
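To see why the transpose trick is correct: the CSR arrays of A, read as CSC arrays, describe exactly A^T, so one transpose recovers A. A tiny pure-Python check (a sketch, independent of FEniCS/PETSc) that builds a dense matrix from CSC arrays:

```python
# Read the CSR arrays of A as CSC arrays and confirm the result is A^T.

def csc_to_dense(nrows, ncols, colptr, rowind, data):
    M = [[0.0] * ncols for _ in range(nrows)]
    for j in range(ncols):
        for k in range(colptr[j], colptr[j + 1]):  # column j owns this slice
            M[rowind[k]][j] = data[k]
    return M

# CSR arrays of A = [[1, 2], [0, 3]] (0-based, as PETSc returns them)
indptr, indices, data = [0, 2, 3], [0, 1, 1], [1.0, 2.0, 3.0]

# Interpreted as CSC, they give A^T; transposing once recovers A
At = csc_to_dense(2, 2, indptr, indices, data)
A = [list(r) for r in zip(*At)]
print(A)  # [[1.0, 2.0], [0.0, 3.0]]
```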
Problem solved 😊
Update: Tested and works for non-symmetric matrices now. @ChrisRackauckas I can create a PR to change the get_array (or create a new get_sparray method if you wish).