Analysis of time spent precompiling
With the new profiling tools in Julia, it is now easier to get a good look at what happens during precompilation. Since OrdinaryDiffEq has quite a large compilation time, I did a bit of analysis.
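The trace below comes from Julia's built-in Tracy integration, which needs a tracing-enabled build. A rougher per-method view of inference time can also be reproduced with SnoopCompile; a minimal sketch, where the workload and solver choice are placeholders:

```julia
using OrdinaryDiffEq, SnoopCompileCore

# Representative workload; the first call triggers inference and compilation.
prob = ODEProblem((u, p, t) -> 1.01u, 0.5, (0.0, 1.0))
tinf = @snoopi_deep solve(prob, Tsit5())

using SnoopCompile
flatten(tinf)  # per-MethodInstance inference timings, sorted by inference time
```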
The profile trace can be viewed online at https://topolarity.github.io/trace-viewer/?trace=https%3A%2F%2Fraw.githubusercontent.com%2FKristofferC%2Ftracy-traces%2Fmain%2Forddiffeq_precompile.tracy&size=9821725.
The top sections where time is spent are:
So in summary, we are compiling a lot of different methods, each of which has to be inferred and then compiled. Not so strange.
These are the inference times of the methods that take the most time:
One perhaps interesting note is that we seem to be compiling 28 versions of `solve_call` with a `CompositeAlgorithm`. That seems like a lot? Each such `solve_call` takes around 500 ms to infer:
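One way to double-check that count is to enumerate the compiled specializations with MethodAnalysis.jl. A hedged sketch: this assumes `solve_call` is the one defined in DiffEqBase, reachable through OrdinaryDiffEq's namespace:

```julia
using MethodAnalysis, OrdinaryDiffEq

# Collect every compiled MethodInstance of solve_call, then keep those
# whose signature mentions CompositeAlgorithm.
mis = methodinstances(OrdinaryDiffEq.DiffEqBase.solve_call)
composite = filter(mi -> occursin("CompositeAlgorithm", string(mi.specTypes)), mis)
println(length(composite), " specializations involve a CompositeAlgorithm")
```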
Some possible ways to improve things:
- Precompile fewer solvers by default and leave it up to users of the package to precompile their specific workloads (a sketch of a user-side workload follows this list).
- Maybe look into why 28 versions of `CompositeAlgorithm` are precompiled.
- Try to make fewer things specialize on the concrete algorithm so that compiling each algorithm is cheaper (see the `@nospecialize` sketch below).
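For the first option, here is a minimal sketch of what a user-side precompile workload could look like with PrecompileTools.jl. The module name and workload are placeholders; the point is that only the solvers a project actually uses get compiled:

```julia
module MyODEPrecompile

using PrecompileTools, OrdinaryDiffEq

@setup_workload begin
    f(u, p, t) = 1.01u
    prob = ODEProblem(f, 0.5, (0.0, 1.0))
    @compile_workload begin
        # Compile only the solver this project actually uses.
        solve(prob, Tsit5())
    end
end

end
```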
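And for the last option, the standard tool is `@nospecialize`. A hypothetical helper (`alg_name` is made up for illustration) that compiles a single MethodInstance shared by all algorithm types, trading some dynamic dispatch inside the body for fewer compiled specializations:

```julia
using OrdinaryDiffEq

# One MethodInstance serves every algorithm subtype; without @nospecialize,
# each concrete algorithm type would get its own compiled specialization.
function alg_name(@nospecialize(alg::OrdinaryDiffEq.OrdinaryDiffEqAlgorithm))
    return string(nameof(typeof(alg)))
end

alg_name(Tsit5())   # "Tsit5"
alg_name(Rodas5())  # "Rodas5"
```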