DiffEqOperators.jl

Rename the package?

Open ChrisRackauckas opened this issue 3 years ago • 12 comments

This package is no longer (and hasn't for a long time been) about defining a DiffEqOperators interface. It's a package for finite difference discretizations of PDEs. What's a good name? It could be descriptive, like PDEFiniteDiff.jl, or fancy, like Fornberg.jl.

ChrisRackauckas avatar Mar 28 '21 11:03 ChrisRackauckas

I like PDEFiniteDiff.jl better but is the jacvec stuff limited to finite diff or more general?

valentinsulzer avatar Mar 28 '21 16:03 valentinsulzer

The jacvec stuff should move to DiffEqBase. This should be a solely finite difference PDE package once that's done.

ChrisRackauckas avatar Mar 28 '21 16:03 ChrisRackauckas

The jacvec stuff should move to DiffEqBase. This should be a solely finite difference PDE package once that's done.

I wonder if it wouldn't be cooler to include more operators later, e.g. global representations of functions such as spectral differential operators (Fourier-based or Chebyshev differentiation matrices). That way it wouldn't remain limited to finite differences?

yewalenikhil65 avatar Apr 28 '21 13:04 yewalenikhil65

That would just be a different package. This package is so fully focused on finite difference stuff that those pieces wouldn't make sense here.

ChrisRackauckas avatar Apr 28 '21 18:04 ChrisRackauckas

Curious about the status and plans here. I'm getting the impression that this package is somewhat superseded, perhaps bordering on deprecated (and not just incomplete, as the warning in the documentation states; btw., it would be great to have a similar warning in the readme). However, this seems to be the package that comes closest to what I need at the moment, which is not at all related to PDEs but needs lazy composition/combination of both array-based and function-based operators for the jacobian. In particular, I have terms that are very low rank (but not sparse) and should be specified as, say, an N x 2 times 2 x N composition. LinearMaps works in a pinch, but it doesn't have built-in support for state dependence/coefficient updating, and there's not even a workaround if you want to update the scalar in a ScaledMap.

What's still missing for my use case is a non-square jacvec operator (for the 2 x N factor), ref. JuliaDiff/SparseDiffTools.jl#184

danielwe avatar May 17 '22 17:05 danielwe

This being superseded is a somewhat new phenomenon, and it has kind of crept up on us. I still think that the tools here have some merit, but its scope may need to be reduced to be more maintainable. It was mentioned above that the jacvec stuff would be moved to DiffEqBase; is that still planned, @ChrisRackauckas?

xtalax avatar May 17 '22 20:05 xtalax

Weakly held opinions:

  • Push to have the community (and not just SciML) consolidate around one canonical package for lazy linear map composition. To me, LinearMaps looks like the best candidate, but it needs to support the update mechanism that DiffEqOperators offers, i.e., registering update functions on the leaf operators and calling them recursively from the root. It would also be useful if function maps allowed extra arguments stored on the object (closures break updatability; requiring that the user define a callable type is fine but less friendly).
  • Upstream jacvec and hesvec functions to ForwardDiff, based on the ones in SparseDiffTools.
  • DiffEqBase could then define a JacVec operator that's just a LinearMaps.FunctionMap wrapping the jacvec function from ForwardDiff (and a VecJac that wraps a Zygote pullback). But this isn't crucial if the pieces already exist and work well in their respective packages. Simply having a tutorial example that shows how it can be done may be sufficient.
  • I don't know anything about the PDE discretization stuff but sounds like MethodOfLines is the final destination for that.
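
The update mechanism described in the first bullet could look roughly like this. This is a pure-Julia mockup with made-up type and function names, not any package's actual API: leaf operators carry an update function, and a composed operator recursively updates its children from the root.

```julia
# Hypothetical sketch (illustrative names only, not a real package API).
mutable struct UpdatableMatrixOp
    A::Matrix{Float64}
    update_func!::Function        # (A, u, p, t) -> mutates A in place
end

struct ComposedOp
    ops::Vector{UpdatableMatrixOp} # applied right-to-left, like A1 * A2 * ...
end

update!(op::UpdatableMatrixOp, u, p, t) = op.update_func!(op.A, u, p, t)
update!(op::ComposedOp, u, p, t) = foreach(o -> update!(o, u, p, t), op.ops)

apply(op::ComposedOp, v) = foldr((o, x) -> o.A * x, op.ops; init = v)

# Example: a time-dependent scaling composed with a static swap matrix.
scale = UpdatableMatrixOp(zeros(2, 2), (A, u, p, t) -> (A .= t .* [1.0 0.0; 0.0 1.0]))
swap  = UpdatableMatrixOp([0.0 1.0; 1.0 0.0], (A, u, p, t) -> nothing)  # static leaf
L = ComposedOp([scale, swap])

update!(L, nothing, nothing, 3.0)  # one recursive update call from the root
apply(L, [1.0, 2.0])               # == 3.0 * [2.0, 1.0] == [6.0, 3.0]
```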

danielwe avatar May 17 '22 21:05 danielwe

As for point 1, the operators defined here have locally acting stencils, and as such can be lowered to multi-dimensional GPU convolutions leveraging NNlib. That is the key feature of merit this package implements; I don't know whether LinearMaps also does this.
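
To illustrate the point: a finite-difference stencil is just a convolution kernel, so applying it is a convolution, which NNlib can do batched and on the GPU. Here is a minimal pure-Julia sketch (not this package's implementation) using the standard second-derivative stencil [1, -2, 1] / dx^2:

```julia
# A stencil application is a 1-D convolution over the interior points.
function apply_stencil(u, stencil, dx)
    n, k = length(u), length(stencil)
    half = k ÷ 2
    out = zeros(n - 2half)                # interior points only
    for i in eachindex(out)
        out[i] = sum(stencil[j] * u[i + j - 1] for j in 1:k) / dx^2
    end
    out
end

u = collect(0:0.1:1) .^ 2                 # u(x) = x^2, so u'' = 2 everywhere
apply_stencil(u, [1.0, -2.0, 1.0], 0.1)   # ≈ fill(2.0, 9)
```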

As for the rest, I'm not particularly versed in VecJac/JacVec so can't really comment.

xtalax avatar May 17 '22 21:05 xtalax

I see! I don't know exactly what that means but it sounds plausible that LinearMaps won't support that any time soon.

I guess I'm at the other end of the spectrum: I landed here while searching for updatable but otherwise straightforward lazy operator compositions and a non-square in-place jacvec operator. There are a couple of combinations of packages that get me 85% of the way, and I'm happy to contribute to getting all the way there, but I'm unsure where my efforts are best spent.

danielwe avatar May 17 '22 21:05 danielwe

Well, there's a lot to say here. Good points, good opinions.

Push to have the community (and not just SciML) consolidate around one canonical package for lazy linear map composition. To me, LinearMaps looks like the best candidate, but it needs to support the update mechanism that DiffEqOperators offers, i.e., registering update functions on the leaf operators and calling them recursively from the root. It would also be useful if function maps allowed extra arguments stored on the object (closures break updatability; requiring that the user define a callable type is fine but less friendly).

Indeed, without the update functionality, LinearMaps just cannot be used by the solvers. So the AbstractSciMLOperator interface is the one to use. We would need any other interface to be at least a superset of that, and nothing is, so that's the one we're going to use and push for.

With that, all of the interface is in SciMLBase, and the basic operators that are defined only using standard library tools should all be in SciMLBase. That's almost all completed, but there are two last operators to migrate:

https://github.com/SciML/DiffEqOperators.jl/blob/master/src/matrixfree_operators.jl
https://github.com/SciML/DiffEqOperators.jl/blob/master/src/composite_operators.jl

Then we should define the operator interface here:

https://scimlbase.sciml.ai/dev/

Or, we should make a SciMLOperator package which is like LinearMaps in that it just defines the AbstractSciMLOperator for other things (such as SciMLBase) to use. Then it would have its own docs. I don't quite have the time for this right now, but if anyone wants to take up the charge this shouldn't take more than a few hours. I would be happy to do the domain for the docs, setup the CI, etc. and tag team it with someone though.

Upstream jacvec and hesvec functions to ForwardDiff, based on the ones in SparseDiffTools.

Indeed, that would be good to do. Or, they should be in AbstractDifferentiation.jl, and every differentiation package should just extend that method set. That's our ultimate goal, but I wouldn't be surprised if it takes a year to jiggle the interfaces around.

DiffEqBase could then define a JacVec operator that's just a LinearMaps.FunctionMap wrapping the jacvec function from ForwardDiff (and a VecJac that wraps a Zygote pullback). But this isn't crucial if the pieces already exist and work well in their respective packages. Simply having a tutorial example that shows how it can be done may be sufficient.

Note that the JacVec operator is now using SparseDiffTools, and if a Krylov method is used it's done automatically. So the one in DiffEqOperators is pretty much deprecated (like most things here), and there aren't many use cases where a user should need to directly use it themselves (though it would still be nice to expose, indeed at the DiffEqBase level).
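
For reference, the operation such a JacVec operator provides is J(u) * v without ever materializing J. A minimal illustration via finite differences (SparseDiffTools propagates forward-mode dual numbers instead, which is exact; this sketch is only meant to show the semantics):

```julia
using LinearAlgebra

# Directional-derivative approximation of J(u) * v; the step size h is a
# standard heuristic, not a tuned choice.
function fd_jacvec(f, u, v; h = sqrt(eps(Float64)) * (1 + norm(u)))
    (f(u .+ h .* v) .- f(u)) ./ h
end

f(u) = [u[1]^2 + u[2], u[1] * u[2]]
# J(u) = [2u[1] 1; u[2] u[1]], so at u = [1, 2], v = [1, 0]: J*v ≈ [2, 2]
fd_jacvec(f, [1.0, 2.0], [1.0, 0.0])
```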

I don't know anything about the PDE discretization stuff but sounds like MethodOfLines is the final destination for that.

It definitely is.

So then what is this repo? Pretty much, a dying corpse. Most operators should not have any dependencies and should be in a repo for people to extend. The other ones are for PDE stuff, but for PDE stuff it's more efficient to not use the operator form and instead fuse loops. So this repo is dying a slow death, and we should try to accelerate that death if we can.

ChrisRackauckas avatar May 20 '22 04:05 ChrisRackauckas

Thanks for the thorough rundown, super helpful! Gives me a better idea of how to get my stuff up and running in the present state. Hope to come back and help out with the transition once I've got things working.

I understand that you want most users to not have to interact directly with the operator facilities, but they are a fantastic tool to have available sometimes, so it would be great to have them exposed and explained in user-facing documentation eventually.

there aren't many use cases where a user should need to directly use it themselves

My example is a term like B * h(u, p, t) where B is a state-independent matrix (typically tall and narrow; the low rank is why this factorization of the term makes sense; think of it as a feedback control term for a high-dim system with only a few control channels). Using ForwardDiff to turn the entire term into a JVP-operator means propagating dual numbers through the matmul, which means using a generic implementation and doing redundant flops. Specifying the jacobian prototype by hand means you can use a non-square JVP operator for the h(u, p, t)-factor only, and compose it with B. Now you have a float matmul that hits BLAS. Of course, this is not strictly needed, it's just an optimization, but a pretty quick and straightforward one if a JVP operator and operator composition are easily accessible (and it's the reason I've been talking about the need for non-square JVP functions and operators).
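
A sketch of the optimization just described, with illustrative names and finite differences standing in for AD: the non-square JVP is formed only for h(u, p, t), and B stays outside, so the matmul is a plain BLAS call on Float64s with no dual numbers involved.

```julia
using LinearAlgebra

N = 100
B = ones(N, 2)                         # tall, state-independent low-rank factor
h(u, p, t) = [sum(u), u[1] - u[end]]   # R^N -> R^2 (linear here, for simplicity)

# (B * J_h(u)) * v computed as B * (J_h * v):
function term_jacvec(u, p, t, v; ϵ = 1e-6)
    jhv = (h(u .+ ϵ .* v, p, t) .- h(u, p, t)) ./ ϵ   # non-square JVP, a 2-vector
    B * jhv                                            # BLAS matmul, no duals
end

# J_h * v = [sum(v), v[1] - v[end]], so for v = ones(N) this is B * [N, 0]
term_jacvec(collect(1.0:N), nothing, 0.0, ones(N))
```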

(Not sure there's any need for a separate JVP operator type, though? If a JVP function and a matrix-free operator type both exist, it seems like all you'd need is a helper function to wrap the latter around the former.)

danielwe avatar May 20 '22 05:05 danielwe

https://github.com/SciML/SciMLOperators.jl is chugging along.

ChrisRackauckas avatar Jun 03 '22 01:06 ChrisRackauckas