FiniteDifferences.jl
Cross derivatives of a multivariate function?
Hello,
I hope all is well. I have a question, as in the title. I am interested in computing an $M \times N \times N$ three-way array of derivatives:
$\frac{\partial^{2}f_{i}}{\partial x_{j}\partial x_{k}}$
where $f$ is an $M$-vector of functions $f_i \ (i=1,\ldots,M)$, and each $f_i$ takes $x \in \mathbb{R}^N$ as its argument. What is the best way to compute this object in FiniteDifferences.jl?
This is clearly not a Jacobian, but I tried jacobian(central_fdm(5, 2), myfun, myval) just to see what it gives. It did not give an error, but it returned an $M \times N$ array rather than an $M \times N \times N$ array. So what is it supposed to compute?
Best, Daisuke
Hey @daisukeadachi,
If jacobian receives a matrix-valued function, it will vectorise the output before computing the Jacobian, which will then be a matrix. I think this example should be what you're after:
using LinearAlgebra
using FiniteDifferences
m = central_fdm(5, 1)
# Don't use adaptation for the inner method.
m_no_adapt = central_fdm(5, 1, adapt=0)
f(x) = [dot(x, x), 2dot(x, x)]
x = randn(3)  # point at which to evaluate the second derivatives
d = jacobian(m, x -> jacobian(m_no_adapt, f, x)[1], x)[1]
d = reshape(d, 2, 3, 3)  # Undo the vectorisation (column-major, as in vec): the two outputs come first, then the two differentiation indices.
# Hessian for the first output:
@assert isapprox(d[1, :, :], 2I)
# Hessian for the second output:
@assert isapprox(d[2, :, :], 4I)
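If you need this for a general $M$-vector of functions of $N$ variables, a small wrapper along these lines might be convenient (a minimal sketch reusing the imports above; second_derivatives is a hypothetical name, not something FiniteDifferences.jl exports):
function second_derivatives(f, x; p=5)
    m = central_fdm(p, 1)
    m_no_adapt = central_fdm(p, 1, adapt=0)  # no adaptation for the inner method
    M, N = length(f(x)), length(x)
    d = jacobian(m, y -> jacobian(m_no_adapt, f, y)[1], x)[1]
    return reshape(d, M, N, N)  # d[i, j, k] ≈ ∂²fᵢ / (∂xⱼ ∂xₖ)
end
# e.g. for the f above: @assert isapprox(second_derivatives(f, randn(3))[1, :, :], 2I)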
Does this generalise straightforwardly to higher dimensions? E.g., I have a 5x5x5 array where each entry depends on 5 parameters, and I want to differentiate it w.r.t. those 5 parameters to get a 5x5x5x5 array, where e.g. the derivative w.r.t. the first parameter is stored in the first sub-array:
d[:, :, :, 1]
etc.
I would assume this is achieved in the same way, i.e.:
fun  # function of 5 parameters with a 5x5x5 array as output
d = jacobian(central_fdm(5, 1), fun, val)[1]  # val = vector of the 5 parameter values
d = reshape(d, 5, 5, 5, 5)
I am unsure because I fear that the indices could get all messed up in the process of vectorising and reshaping at higher dimensions.
Hey @paulc-git! I believe this should readily generalise to higher dimensions in the way you describe. You're right that it's not obvious that the indexing just works out, but it should: jacobian vectorises the output column-major (like vec), and reshape undoes that in the same order, leaving the parameter index as the trailing dimension.
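If you want to convince yourself of the ordering, a quick check along these lines should do it (a minimal sketch; fun and analytic are hypothetical illustrations, with fun(x)[i, j, k] = x[i] * x[j] * x[k] so the derivative is known in closed form):
using FiniteDifferences
fun(x) = [x[i] * x[j] * x[k] for i in 1:5, j in 1:5, k in 1:5]
x = randn(5)
d = jacobian(central_fdm(5, 1), fun, x)[1]  # 125×5 matrix: rows follow vec(fun(x))
d = reshape(d, 5, 5, 5, 5)                  # trailing index is the parameter index
# Analytic derivative of fun(x)[i, j, k] with respect to x[l]:
analytic(x, l) = [(i == l)*x[j]*x[k] + x[i]*(j == l)*x[k] + x[i]*x[j]*(k == l) for i in 1:5, j in 1:5, k in 1:5]
@assert all(isapprox(d[:, :, :, l], analytic(x, l); atol=1e-8) for l in 1:5)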