pymoo
Using JAX instead of AutoGrad.
AutoGrad is no longer actively developed, and it is recommended to use JAX instead. JAX seems like a drop-in replacement, and I'm curious whether a shift is worth considering. Moreover, the comparisons between JAX and AutoGrad seem very promising, as JAX supersedes AutoGrad in terms of performance. What do you think?
Yes, very good comment. I have not really found the time yet to look into JAX, but I am aware that autograd is deprecated. If you are working on a method with gradients and would like to integrate JAX, let me know. I can help you get started, and I would be happy to integrate it as a standard into pymoo.
@julesy89 JAX does seem to be a drop-in replacement. However, I have noticed that pymoo tends to have specific APIs that use autograd at their core. I might have to study some of the translation APIs for that. I am keen to help out with using JAX under the hood, but let it be known: JAX isn't necessarily faster than native numpy at all times. Hence, do we intend to support all three (numpy, jax, and autograd), or do we stick to just one?
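To illustrate the "drop-in" claim above, here is a minimal sketch of how `jax.grad` mirrors `autograd`'s `grad` API on a simple scalar-valued function (the function `f` is just an example, not pymoo code):

```python
import jax
import jax.numpy as jnp

# With autograd this would have been:
#   import autograd.numpy as np
#   from autograd import grad
#   df = grad(f)
def f(x):
    # scalar-valued function: sum of squares
    return jnp.sum(x ** 2)

# jax.grad takes the same role as autograd.grad
df = jax.grad(f)
print(df(jnp.array([1.0, 2.0, 3.0])))  # → [2. 4. 6.]
```

The main code change is typically swapping `autograd.numpy` for `jax.numpy`; the gradient-transform call itself is nearly identical, which is why JAX is often described as a near drop-in replacement.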
We could consider making the backend an optional requirement, similar to how keras supports multiple DL backends.
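A minimal sketch of what such an optional-backend selection could look like; all names here (e.g. `get_gradient_backend`) are hypothetical and not existing pymoo API:

```python
def get_gradient_backend(name="numpy"):
    """Return a grad(f) factory for the requested backend (hypothetical sketch)."""
    if name == "jax":
        import jax  # optional dependency, imported lazily
        return jax.grad
    if name == "autograd":
        from autograd import grad  # optional dependency, imported lazily
        return grad
    if name == "numpy":
        import numpy as np

        # gradient-free fallback via central finite differences
        def numerical_grad(f, eps=1e-6):
            def df(x):
                x = np.asarray(x, dtype=float)
                g = np.zeros_like(x)
                for i in range(x.size):
                    e = np.zeros_like(x)
                    e[i] = eps
                    g[i] = (f(x + e) - f(x - e)) / (2 * eps)
                return g
            return df

        return numerical_grad
    raise ValueError(f"unknown backend: {name}")
```

The lazy imports keep jax and autograd optional at install time, which is the same pattern keras uses for its DL backends.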
Much of the raw development for replacing autograd with jax is currently happening here (https://github.com/achillesrasquinha/pymoo/tree/jax), after which I will attempt to open a Pull Request.