
Plans for supporting Laplacian operator?

Open stefabat opened this issue 4 years ago • 2 comments

Is there any plan to support the Laplacian out of the box? I like the simplicity of ForwardDiff.jl, but right now I am defining the Laplacian operator as

using ForwardDiff, LinearAlgebra

laplacian(f::Function, r) = sum(diag(ForwardDiff.hessian(f, r)))

which is certainly not optimal.
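For reference, the same quantity can be written with `tr`, which sums the Hessian diagonal without allocating the intermediate vector that `diag` produces (the name `laplacian_tr` is mine; the full Hessian is still computed, so this remains O(n²) work):

```julia
using ForwardDiff, LinearAlgebra

# Trace form of the Laplacian: tr sums the Hessian diagonal directly,
# avoiding the temporary vector created by diag.
laplacian_tr(f, r) = tr(ForwardDiff.hessian(f, r))

f(x) = sum(abs2, x)        # Δf = 2n
laplacian_tr(f, ones(3))   # == 6.0
```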

stefabat avatar Jan 15 '21 19:01 stefabat

I recently started playing with hand-coding Laplacians for chains. Very painful... I think this would be a fantastic addition to ForwardDiff.

Closely related, of course, is the divergence operator. In fact, an optimized divergence might already yield an optimized Laplacian? It would also give us the option to compute the Laplacian from a gradient obtained via reverse-mode AD?
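To sketch the connection (the names `divergence` and `laplacian_via_div` are mine, not a ForwardDiff API): the divergence of a vector field is the trace of its Jacobian, and composing it with a gradient gives the Laplacian. The inner gradient here could in principle be supplied by a reverse-mode package instead:

```julia
using ForwardDiff, LinearAlgebra

# Divergence of a vector field g: Rⁿ → Rⁿ as the trace of its Jacobian.
divergence(g, x) = tr(ForwardDiff.jacobian(g, x))

# Laplacian = divergence of the gradient.
laplacian_via_div(f, x) = divergence(y -> ForwardDiff.gradient(f, y), x)

f(x) = sum(abs2, x)
laplacian_via_div(f, ones(3))   # Δf = 2n = 6.0
```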

cortner avatar Apr 20 '23 17:04 cortner

After teaching myself about Dual and HyperDual numbers, I wrote this little test:

using ForwardDiff, LinearAlgebra, HyperDualNumbers


F(xx) = sum( (xx[i] - xx[j])^2 / (1 + xx[k]^2)
             for i in 1:length(xx)
             for j in i+1:length(xx)
             for k in 1:length(xx) )

function laplacian(F, xx::AbstractVector{T}) where {T}
   # assume output type of F is also T
   Δ = zero(T)
   for i = 1:length(xx)
      # seed both first-order perturbations in component i, so the
      # epsilon12 part of F(hxx) carries ∂²F/∂xᵢ²
      hxx = [ Hyper(xx[j], j==i, j==i, 0) for j = 1:length(xx) ]
      Δ += F(hxx).epsilon12
   end
   return Δ
end

N = 5
xx = randn(N)

L1 = laplacian(F, xx)
L2 = tr(ForwardDiff.hessian(F, xx))

@show L1 - L2

Is this the recommended way to do it (modulo performance improvements, of course)? Is there an advantage to using ForwardDiff over HyperDualNumbers, or vice versa? Performance? How would I even write this using ForwardDiff? I got a bit confused. Any comments and/or help would be appreciated.
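One possible way to do the same thing with ForwardDiff alone, as a sketch (the name `laplacian_fd` is mine, not a ForwardDiff API): nest `ForwardDiff.derivative` calls, so the outer pass differentiates the inner one. The resulting Duals-of-Duals play the same role as the `Hyper` seeding above.

```julia
using ForwardDiff

# Laplacian by summing second directional derivatives along each
# coordinate axis, computed with nested ForwardDiff.derivative calls.
function laplacian_fd(F, xx::AbstractVector{T}) where {T}
    Δ = zero(T)
    for i in eachindex(xx)
        e = [T(j == i) for j in eachindex(xx)]   # unit vector eᵢ
        Δ += ForwardDiff.derivative(
                 t -> ForwardDiff.derivative(s -> F(xx .+ s .* e), t),
                 zero(T))
    end
    return Δ
end
```

For a quick sanity check, `laplacian_fd(F, xx)` should agree with `tr(ForwardDiff.hessian(F, xx))`.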

I'm also open to starting a PR to ForwardDiff if I can get some guidance.

cortner avatar May 19 '23 22:05 cortner