ReverseDiff.jl
mean BigFloat precision
I observed a loss of precision using ReverseDiff compared to ForwardDiff on the following simple example:
julia> testprecision()
type precision eps(T) = 1.7272337110188889250772703725600799142232e-77
ForwardDiff error = 0.0000000000000000000000000000000000000000e+00
ReverseDiff error = 8.6560832651618551196119417711416155844212e-19
Is this expected? Thanks!
using Statistics, LinearAlgebra
using ForwardDiff, ReverseDiff, Printf

setprecision(2 ^ 8)  # 256-bit BigFloat mantissa

function testprecision(n::Int64 = 1_000, T::DataType = BigFloat)
    x = rand(T, n)
    f(x) = mean(x .^ 3)
    gf = ForwardDiff.gradient(f, x)
    gr = ReverseDiff.gradient(f, x)
    gth = 3 * x .^ 2 / n  # analytic gradient of mean(x .^ 3)
    @printf "type precision eps(T) = %4.40e \n\n" eps(T)
    @printf "ForwardDiff error = %4.40e \n" norm(gf - gth)
    @printf "ReverseDiff error = %4.40e \n" norm(gr - gth)
    nothing
end
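For what it's worth, one way to narrow this down might be to repeat the comparison with sum instead of mean, which removes the 1/n scaling that mean performs. The sketch below is only a variant of the reproducer above (testprecision_sum is a hypothetical name, not part of either package); it uses the same ForwardDiff.gradient / ReverseDiff.gradient calls.

using Statistics, LinearAlgebra
using ForwardDiff, ReverseDiff, Printf

setprecision(2 ^ 8)  # 256-bit BigFloat mantissa, as above

# Hypothetical variant of the reproducer: same comparison for sum(x .^ 3),
# whose analytic gradient is 3 * x .^ 2 (no 1/n factor).
function testprecision_sum(n::Int64 = 1_000, T::DataType = BigFloat)
    x = rand(T, n)
    f(x) = sum(x .^ 3)
    gf = ForwardDiff.gradient(f, x)
    gr = ReverseDiff.gradient(f, x)
    gth = 3 * x .^ 2
    @printf "ForwardDiff error = %4.40e \n" norm(gf - gth)
    @printf "ReverseDiff error = %4.40e \n" norm(gr - gth)
    nothing
end

If the ReverseDiff error disappears in this variant, the discrepancy would point at how the 1/n scaling is accumulated on the reverse pass; if it persists, it is more likely in the elementwise cube rule itself.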