DoubleFloats.jl
Float64, Double64 and BigFloat behave differently in dot products
The following results are occasionally obtained for randomly generated vectors:
julia> using DoubleFloats
julia> n = 10;
julia> Ty = Double64; c = rand(Ty,n)+im*rand(Ty,n); d = rand(Ty,n)+im*rand(Ty,n); c'*d-conj(d'*c)
0.0 - 1.5407439555097887e-33im
julia> Ty = Double64; c = rand(Ty,n)+im*rand(Ty,n); d = rand(Ty,n)+im*rand(Ty,n); c'*d-conj(d'*c)
-4.930380657631324e-32 + 3.0814879110195774e-33im
julia> Ty = Double64; c = rand(Ty,n); d = rand(Ty,n); c'*d-d'*c
-2.465190328815662e-32
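Since the discrepancy shows up only occasionally, it can help to seed the global RNG so that a run which does exhibit it can be repeated. This is a minimal sketch, assuming DoubleFloats.jl draws from the default global RNG; the seed value 1234 is arbitrary and not taken from the runs above.

using Random, DoubleFloats

Random.seed!(1234)   # arbitrary seed, only to make a particular run repeatable
n = 10
c = rand(Double64, n) + im*rand(Double64, n)
d = rand(Double64, n) + im*rand(Double64, n)

# the same inner product, evaluated in the two orders compared above
c'*d - conj(d'*c)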
The discrepancies never occur for other types of floating-point data:
julia> Ty = Float64; c = rand(Ty,n)+im*rand(Ty,n); d = rand(Ty,n)+im*rand(Ty,n); c'*d-conj(d'*c)
0.0 + 0.0im
julia> Ty = BigFloat; c = rand(Ty,n)+im*rand(Ty,n); d = rand(Ty,n)+im*rand(Ty,n); c'*d-conj(d'*c)
0.0 + 0.0im
julia> Ty = BigFloat; c = rand(Ty,n); d = rand(Ty,n); c'*d-d'*c
0.0
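To quantify the contrast between "occasionally" for Double64 and "never" for the other types, the following minimal sketch counts, over repeated random trials, how often the two evaluation orders disagree; count_mismatches is an ad-hoc helper (not part of DoubleFloats.jl), and the vector length and trial count are arbitrary.

using DoubleFloats

# count how often c'*d and conj(d'*c) differ for random complex vectors with entries of type T
function count_mismatches(::Type{T}, n, trials) where {T}
    cnt = 0
    for _ in 1:trials
        c = rand(T, n) + im*rand(T, n)
        d = rand(T, n) + im*rand(T, n)
        cnt += (c'*d != conj(d'*c))
    end
    return cnt
end

for T in (Float64, Double64, BigFloat)
    println(T, ": ", count_mismatches(T, 10, 1000), " mismatches in 1000 trials")
end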
I wonder whether this behaviour is expected for DoubleFloat data?
And even for a single product:
julia> Ty = Double64; c = rand(Ty,1) ; d =rand(Ty,1); c[1]*d[1]-d[1]*c[1]
1.5407439555097887e-33
Does this mean that multiplication is not commutative?
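To probe that question directly, here is a minimal sketch that multiplies many random Double64 pairs in both orders and counts disagreements; count_noncommutative is an ad-hoc helper and the sample size is arbitrary.

using DoubleFloats

# count how many random Double64 pairs satisfy x*y != y*x
function count_noncommutative(trials)
    cnt = 0
    for _ in 1:trials
        x, y = rand(Double64), rand(Double64)
        cnt += (x*y != y*x)
    end
    return cnt
end

count_noncommutative(100_000)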