ForwardDiff.jl
type instabilities when passing a function to jacobian!
Hi, I am encountering type instabilities when calling jacobian!. As I use this in the inner loop of a finite volume operator assembly, this is a serious performance hit.
A remedy could be to type-annotate the callback f!. Here is an MWE: myjacobian! just modifies jacobian! in this respect (see
https://github.com/JuliaDiff/ForwardDiff.jl/blob/909976d719fdbd5fec91a159c6e2d808c45a770f/src/jacobian.jl#L75).
using ForwardDiff
using DiffResults
# The only change vs. ForwardDiff.jacobian!: the callback is annotated as f!::F,
# which forces specialization on the concrete function type.
function myjacobian!(result::Union{AbstractArray,DiffResults.DiffResult}, f!::F, y::AbstractArray, x::AbstractArray,
                     cfg::ForwardDiff.JacobianConfig{T} = ForwardDiff.JacobianConfig(f!, y, x),
                     ::Val{CHK} = Val{true}()) where {F,T,CHK}
    CHK && ForwardDiff.checktag(T, f!, x)
    if ForwardDiff.chunksize(cfg) == length(x)
        ForwardDiff.vector_mode_jacobian!(result, f!, y, x, cfg)
    else
        ForwardDiff.chunk_mode_jacobian!(result, f!, y, x, cfg)
    end
    return result
end
function test()
    u = [1, 2.0]
    function f(y, u)
        y[1] = 2 * u[1] - u[2]
        y[2] = u[2]
    end
    result = DiffResults.JacobianResult(u)
    y = similar(u)
    cfg = ForwardDiff.JacobianConfig(f, y, u)
    @time ForwardDiff.jacobian!(result, f, y, u, cfg)
    @time myjacobian!(result, f, y, u, cfg)
    DiffResults.jacobian(result)
end
The output of a second run (after compilation) is:
0.000010 seconds (1 allocation: 64 bytes)
0.000003 seconds
2×2 Matrix{Float64}:
2.0 -1.0
0.0 1.0
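For reference, the extra allocation can also be checked without @time noise using Base's @allocated (a sketch, assuming f, y, u, result, and cfg from test() above are in scope):

# Warm up both paths first so compilation is excluded:
ForwardDiff.jacobian!(result, f, y, u, cfg)
myjacobian!(result, f, y, u, cfg)

@allocated ForwardDiff.jacobian!(result, f, y, u, cfg)  # expect 64 bytes, as in the @time output above
@allocated myjacobian!(result, f, y, u, cfg)            # expect 0 bytes: f! is specialized via ::F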
vector_mode_jacobian! already has this type annotation (and I use it at the moment as a drop-in replacement, though it is not exported).
Maybe this needs to be changed for all similar API functions. I am not sure, however, where this could have unintended consequences.
Bumping this: I am now also running into the need for occasional dispatch to chunk_mode_jacobian!.
A workaround for this is to create the JacobianConfig in a type-stable manner:
cfg = ForwardDiff.JacobianConfig(f, y, u, ForwardDiff.Chunk{2}())
julia> test()
0.000002 seconds
0.000001 seconds
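Note that the chunk size must be a compile-time constant for this to work. When the system size is only known at runtime, a function barrier achieves the same effect: construct the config once outside the hot loop and hand it to a kernel function, which then gets compiled for the concrete config type. A minimal sketch (assemble! and assemble_kernel! are hypothetical names, not part of any API):

function assemble!(f, y, u)
    # The concrete type of cfg is only known at runtime here ...
    cfg = ForwardDiff.JacobianConfig(f, y, u)
    result = DiffResults.JacobianResult(u)
    # ... but the kernel below is compiled for that concrete type,
    # so the loop body itself is type stable.
    return assemble_kernel!(result, f, y, u, cfg)
end

function assemble_kernel!(result, f::F, y, u, cfg) where {F}
    for _ in 1:10_000   # stand-in for the finite volume cell loop
        myjacobian!(result, f, y, u, cfg)
    end
    return result
end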
I thought I encountered a similar problem with gradient().
However, in my case I don't construct the GradientConfig explicitly, and j-fu's hotfix (applied to gradient()) does not fix it.
Shouldn't a simple gradient(f,x) be inferrable as well?
I worked around it by using x::StaticVector as the argument; then everything gets inferred correctly.
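For completeness, a minimal sketch of that workaround (g is just an illustrative function, not code from this issue):

using StaticArrays
using ForwardDiff

g(x) = x[1]^2 + 2 * x[2]

x = SVector(1.0, 2.0)
# The length (and hence the chunk size) is encoded in the SVector type,
# so the whole call is inferred:
ForwardDiff.gradient(g, x)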
Looking into gradient.jl, however, I noticed that gradient!() already specializes on f::F (but only the mutating form), so there is some inconsistency here.
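By analogy with myjacobian! above, a sketch of what the annotated non-mutating version could look like (vector_mode_gradient and chunk_mode_gradient are unexported internals from the same source tree):

function mygradient(f::F, x::AbstractArray,
                    cfg::ForwardDiff.GradientConfig{T} = ForwardDiff.GradientConfig(f, x),
                    ::Val{CHK} = Val{true}()) where {F,T,CHK}
    CHK && ForwardDiff.checktag(T, f, x)
    if ForwardDiff.chunksize(cfg) == length(x)
        return ForwardDiff.vector_mode_gradient(f, x, cfg)
    else
        return ForwardDiff.chunk_mode_gradient(f, x, cfg)
    end
end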