Add support for pydata/sparse arrays in NumpyMatrixOperator

Open pmli opened this issue 4 years ago • 8 comments

Currently, NumpyMatrixOperator supports NumPy arrays and SciPy sparse matrices. It would be good to also add support for pydata/sparse arrays, as SciPy did in version 1.4.0.
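
For illustration, here is a minimal sketch (not proposed pyMOR API; the as_scipy_matrix helper is purely hypothetical) of how a 2-D pydata/sparse array relates to the SciPy sparse matrices NumpyMatrixOperator already handles:

```python
import scipy.sparse as sps
import sparse  # pydata/sparse

A_pydata = sparse.random((1000, 1000), density=0.001)  # 2-D COO array

def as_scipy_matrix(A):
    """Hypothetical helper: convert a 2-D pydata/sparse array to SciPy CSR."""
    if isinstance(A, sparse.COO):
        return sps.csr_matrix(A.to_scipy_sparse())
    return A

A_csr = as_scipy_matrix(A_pydata)  # usable with the existing SciPy-based code paths
```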

pmli avatar Aug 07 '20 13:08 pmli

Not sure how useful that would be. That library only seems to support the COO and DOK formats, so to use any linear solver, the matrices would have to be converted first anyway. Do you know anyone who uses this library, @pmli?

sdrave avatar Aug 17 '20 13:08 sdrave

Not sure how useful that would be. That library only seems to support the COO and DOK formats, so to use any linear solver, the matrices would have to be converted first anyway.

You're right: scipy.sparse.linalg.spsolve first converts a pydata/sparse matrix to a SciPy sparse matrix before solving.
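
A small sketch of that behaviour (assuming SciPy >= 1.4.0, where scipy.sparse.linalg accepts pydata/sparse inputs; the conversion happens internally before the actual solve):

```python
import numpy as np
import scipy.sparse.linalg as spla
import sparse  # pydata/sparse

n = 100
A = sparse.eye(n) + 0.1 * sparse.random((n, n), density=0.05)  # COO array
b = np.ones(n)

# spsolve accepts the pydata/sparse matrix, but converts it to a SciPy
# sparse matrix internally before running the sparse direct solver
x = spla.spsolve(A, b)
```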

Do you know anyone who uses this library, @pmli?

I expect we will use it for tensors in quadratic operators (#304).
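
For illustration only (hypothetical shapes and names, not the design from #304): evaluating a quadratic term u -> H(u, u) with a sparse third-order tensor H via pydata/sparse could look like this:

```python
import numpy as np
import sparse  # pydata/sparse

n = 200
H = sparse.random((n, n, n), density=1e-4)  # sparse third-order tensor
u = np.random.rand(n)

# contract the last two modes of H with u: result_i = sum_{j,k} H_ijk u_j u_k
Hu = sparse.tensordot(H, u, axes=([2], [0]))            # shape (n, n)
Hu = Hu.todense() if isinstance(Hu, sparse.SparseArray) else Hu
quad = Hu @ u                                            # shape (n,)
```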

pmli avatar Aug 17 '20 14:08 pmli

The documentation does not really sound like this library is geared toward high-performance numerical linear algebra. Something like PyTorch might be more useful when dealing with sparse tensors. (For something like BlockOperator, pydata/sparse would be fine, I assume.)

sdrave avatar Aug 17 '20 16:08 sdrave

Using PyTorch for sparse tensors sounds a bit like overkill to me... From the documentation of torch.sparse, it seems PyTorch only supports the COO format. Also, there is a warning that the API might change.
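
For reference, a minimal sketch of the COO layout torch.sparse provides (using torch.sparse_coo_tensor and torch.sparse.mm; as noted above, the API was still marked as subject to change):

```python
import torch

# a 3x3 sparse matrix in COO format: indices are a (2, nnz) tensor
indices = torch.tensor([[0, 1, 2],   # row indices
                        [2, 0, 1]])  # column indices
values = torch.tensor([3.0, 4.0, 5.0])
A = torch.sparse_coo_tensor(indices, values, size=(3, 3))

x = torch.ones(3, 1)
y = torch.sparse.mm(A, x)  # sparse-dense matrix product, dense (3, 1) result
```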

pmli avatar Aug 19 '20 10:08 pmli

Using PyTorch for sparse tensors sounds a bit like overkill to me...

Why? As far as I understand it, PyTorch/TensorFlow are basically tensor libraries with some support for automatic differentiation. We already have PyTorch as an optional dependency.

From the documentation of torch.sparse, it seems PyTorch only supports the COO format.

As does pydata/sparse (along with DOK). What sparse format would you have in mind? (We are speaking of higher-order tensors, right?)

In any case, what I was trying to say is that pydata/sparse is probably not the right library for our applications.

sdrave avatar Aug 19 '20 12:08 sdrave

Using PyTorch for sparse tensors sounds a bit like overkill to me...

Why? As far as I understand it, PyTorch/TensorFlow are basically tensor libraries with some support for automatic differentiation. We already have PyTorch as an optional dependency.

Hm, my impression is that PyTorch and TensorFlow are much bigger dependencies, e.g., installing PyTorch takes significantly more time (and memory) than NumPy or SciPy.

From the documentation of torch.sparse, it seems PyTorch only supports the COO format.

As does pydata/sparse (along with DOK). What sparse format would you have in mind? (We are speaking of higher-order tensors, right?)

In any case, what I was trying to say is that pydata/sparse is probably not the right library for our applications.

What I was trying to ask is: why do you think torch.sparse is better, if it doesn't support any better formats (e.g., some generalization of CSR/CSC) than pydata/sparse does?

pmli avatar Aug 19 '20 15:08 pmli

What I was trying to ask is: why do you think torch.sparse is better, if it doesn't support any better formats (e.g., some generalization of CSR/CSC) than pydata/sparse does?

I am just assuming that PyTorch's tensors are more optimized towards linear algebra operations. My impression was, without really having looked into it, that pydata/sparse is more for 'analysis of sparse data'.

sdrave avatar Aug 26 '20 12:08 sdrave

I don't think we want to work on this any time soon, so I tagged it with the 'future' milestone.

sdrave avatar Oct 30 '20 08:10 sdrave