GradDFT
GradDFT is a JAX-based library enabling differentiable design of, and experimentation with, exchange-correlation functionals using machine learning techniques.
On an HPC cluster, each term in a mean-squared-error loss can be calculated with embarrassingly parallel logic. Unfortunately, the native way of doing this in `jax` (using `jax.vmap` and...
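A minimal sketch of the `jax.vmap` route, where `predict_energy` and the batch layout are hypothetical stand-ins rather than Grad DFT's actual API:

```python
import jax
import jax.numpy as jnp

def predict_energy(params, system):
    # stand-in for a neural-functional energy prediction (hypothetical)
    return jnp.dot(params, system["features"])

def loss_term(params, system):
    # squared error for a single training system
    return (predict_energy(params, system) - system["target_energy"]) ** 2

# vmap maps the per-system term over a stacked batch; each term is
# independent, so in principle the work is embarrassingly parallel
batched_terms = jax.vmap(loss_term, in_axes=(None, 0))

def mse_loss(params, dataset):
    return jnp.mean(batched_terms(params, dataset))

params = jnp.ones(4)
dataset = {"features": jnp.ones((8, 4)), "target_energy": jnp.zeros(8)}
print(mse_loss(params, dataset), jax.grad(mse_loss)(params, dataset))
```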
The advantage of implementing stresses is similar to that of forces (see #96), but applied to periodic systems. Much like forces, these can be implemented via autodiff or...
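For what it's worth, a hedged sketch of the autodiff route: differentiate the total energy with respect to a homogeneous strain applied to both the cell and the positions, then divide by the cell volume. `total_energy` here is a hypothetical stand-in, not anything in Grad DFT:

```python
import jax
import jax.numpy as jnp

def total_energy(positions, cell):
    # stand-in for a periodic DFT total energy (hypothetical)
    return jnp.sum(positions ** 2) + jnp.abs(jnp.linalg.det(cell))

def strained_energy(strain, positions, cell):
    # apply the homogeneous deformation (1 + strain) to positions and cell
    deform = jnp.eye(3) + strain
    return total_energy(positions @ deform.T, cell @ deform.T)

def stress(positions, cell):
    # stress tensor: derivative of energy w.r.t. strain, per unit volume
    volume = jnp.abs(jnp.linalg.det(cell))
    zero_strain = jnp.zeros((3, 3))
    return jax.grad(strained_energy)(zero_strain, positions, cell) / volume

cell = jnp.eye(3) * 5.0
positions = jnp.ones((2, 3))
print(stress(positions, cell))
```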
Implementing ionic forces in Grad DFT would be useful, as this provides information beyond the total energy and density for training neural functionals. There is potential here to strongly improve...
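A minimal sketch of the autodiff route, with `total_energy` a hypothetical stand-in for a converged SCF energy as a function of nuclear positions:

```python
import jax
import jax.numpy as jnp

def total_energy(positions):
    # stand-in for the converged SCF total energy (hypothetical)
    return jnp.sum(positions ** 2)

# ionic forces are minus the gradient of the energy w.r.t. nuclear positions
positions = jnp.ones((3, 3))
forces = -jax.grad(total_energy)(positions)
print(forces)
```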
[Jax-XC](https://github.com/sail-sg/jax_xc) makes it possible to convert LibXC functionals to JAX. I think it would be cool to allow integration with our library.
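If I'm reading its README right, usage is roughly as below; treat the exact API (functional constructors returning an energy-density function of `rho` and `r`) as an assumption:

```python
import jax
import jax.numpy as jnp
import jax_xc

# a toy density: an isotropic 3D Gaussian
def rho(r):
    return jnp.prod(jax.scipy.stats.norm.pdf(r, loc=0.0, scale=1.0))

# build the PBE exchange functional converted from LibXC
epsilon_x = jax_xc.gga_x_pbe(polarized=False)

# energy density at a grid point; the functional differentiates rho internally
r = jnp.array([0.1, 0.2, 0.3])
print(epsilon_x(rho, r))
```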
Grad DFT **is not** intended as a general-purpose, high-performance DFT code. Its domain of applicability is training neural functionals. Accordingly, if we wish to perform production-level...
Right now Grad DFT implements the original expression for the LYP functional, eq. 22 of Lee, Yang & Parr [C. Lee, W. Yang, and R. G. Parr, Phys. Rev. B...
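For reference, I believe the closed-shell form of that equation reads (worth double-checking against the paper):

$$
E_c = -a \int \frac{1}{1 + d\,\rho^{-1/3}} \left\{ \rho + b\,\rho^{-2/3} \left[ C_F\,\rho^{5/3} - 2 t_W + \left( \tfrac{1}{9} t_W + \tfrac{1}{18} \nabla^2 \rho \right) \right] e^{-c\,\rho^{-1/3}} \right\} \mathrm{d}\mathbf{r}
$$

with $C_F = \tfrac{3}{10}(3\pi^2)^{2/3}$, $t_W = \tfrac{1}{8}\left( |\nabla\rho|^2/\rho - \nabla^2\rho \right)$, and fitted parameters $a = 0.04918$, $b = 0.132$, $c = 0.2533$, $d = 0.349$. Note the explicit density Laplacian $\nabla^2\rho$ in this original form.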
Add sharding following https://jax.readthedocs.io/en/latest/notebooks/Distributed_arrays_and_automatic_parallelization.html
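A minimal sketch of the pattern from that tutorial (assumes the sharded axis size divides the device count):

```python
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# lay the available devices out in a 1D mesh with a named 'batch' axis
mesh = Mesh(np.array(jax.devices()), axis_names=("batch",))
sharding = NamedSharding(mesh, P("batch"))

# shard the leading (batch) axis of the data across devices
x = jnp.arange(32.0).reshape(8, 4)
x = jax.device_put(x, sharding)

# jit-compiled computation is automatically parallelized over the shards
y = jax.jit(lambda a: jnp.sin(a).sum(axis=1))(x)
print(y.sharding)
```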
Should we delete the unused branches, @jackbaker1001?
I've noticed that for a few basis sets, NaN gradients are appearing again when training with the DIIS SCF loops but not the linear-mixing loops. I think this is...
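One classic source of NaN gradients in JAX, for reference, is differentiating through a masked primitive (e.g. a fractional power of a near-zero density): the NaN leaks in through the untaken `jnp.where` branch. A sketch of the standard "double where" guard:

```python
import jax
import jax.numpy as jnp

def safe_sqrt(x):
    # keep the problematic input out of sqrt entirely, so its NaN
    # derivative never enters the backward pass
    masked = jnp.where(x > 0.0, x, 1.0)
    return jnp.where(x > 0.0, jnp.sqrt(masked), 0.0)

print(jax.grad(safe_sqrt)(0.0))                              # 0.0, not NaN
print(jax.grad(lambda x: jnp.where(x > 0.0, jnp.sqrt(x), 0.0))(0.0))  # NaN
```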