TensorCast.jl
Faster capture macro?
This `@capture` replacement is 100x quicker, but fixing one pattern doesn't improve the package startup speed much.

A better test times both package loading and the first run of the macro. On master, Julia 1.5, best of a few runs:
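To illustrate the idea (a hedged sketch, not TensorCast's actual code): instead of calling the general-purpose `MacroTools.@capture` machinery, a hand-rolled matcher for the one pattern needed, `A_[inds__]`, can be a few lines of plain Julia. The function name `match_ref` here is hypothetical.

```julia
# Sketch of a specialised @capture replacement: match only an
# indexing expression `A[i,j,...]` and return the name and indices.
function match_ref(ex)
    if ex isa Expr && ex.head == :ref
        return ex.args[1], ex.args[2:end]   # array name, list of index symbols
    end
    return nothing   # not an indexing expression
end

match_ref(:( Z[i,k] ))   # returns (:Z, [:i, :k])
match_ref(:( f(x) ))     # returns nothing
```

Dispatching on `ex.head` directly avoids the pattern-compilation overhead of `@capture`, which is where the 100x comes from on this one pattern.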
```
EscBook:TensorCast me$ time julia -e '@time (using TensorCast; TensorCast._macro(:( Z[i,k][j] := fun(A[i,:], B[j])[k] + C[k]^2 )))'
  8.587048 seconds (26.63 M allocations: 1.457 GiB, 4.16% gc time)
real    0m9.496s
```
versus the `capture` branch:
```
EscBook:TensorCast me$ time julia -e '@time (using TensorCast; TensorCast._macro(:( Z[i,k][j] := fun(A[i,:], B[j])[k] + C[k]^2 )))'
  7.843518 seconds (26.06 M allocations: 1.428 GiB, 4.86% gc time)
real    0m8.775s
```
Also with lazy.jl (which loads LazyArrays) commented out:
```
EscBook:TensorCast me$ time julia -e '@time (using TensorCast; TensorCast._macro(:( Z[i,k][j] := fun(A[i,:], B[j])[k] + C[k]^2 )))'
  3.471463 seconds (7.83 M allocations: 402.050 MiB, 2.25% gc time)
real    0m3.858s
```
and with static.jl commented out as well:
```
EscBook:TensorCast me$ time julia -e '@time (using TensorCast; TensorCast._macro(:( Z[i,k][j] := fun(A[i,:], B[j])[k] + C[k]^2 )))'
  3.474532 seconds (7.83 M allocations: 402.089 MiB, 2.07% gc time)
real    0m3.854s
```
Time to bring back Requires.jl, perhaps, so that these dependencies only load on demand.
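The Requires.jl approach would look roughly like this (a sketch, not the package's actual layout; the UUID string must be the real one from LazyArrays' Project.toml, elided here):

```julia
# Sketch: defer loading lazy.jl until the user does `using LazyArrays`.
using Requires

function __init__()
    # @require runs the block only once the named package is loaded,
    # so LazyArrays stops contributing to TensorCast's own load time.
    @require LazyArrays = "…uuid from LazyArrays Project.toml…" include("lazy.jl")
end
```

The trade-off is that `@cast ... lazy` would then error (or silently fall back) unless the user has loaded LazyArrays themselves, which is the usual Requires.jl caveat.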