Yixiao Chen

11 comments by Yixiao Chen

Thank you for the reply! Unfortunately, I need x64 precision in my application. For now I have managed to work around it by using a custom complex multiplication as in the...
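The comment does not show the workaround itself, but a custom complex multiplication typically means carrying the real and imaginary parts as separate float64 arrays and combining them by hand. A minimal sketch of what that might look like (the name `cmul` and the split-argument layout are illustrative, not from the original issue):

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # keep float64 throughout

def cmul(a_re, a_im, b_re, b_im):
    # (a_re + i*a_im) * (b_re + i*b_im), carried out in real arithmetic
    # so every intermediate stays in float64
    return a_re * b_re - a_im * b_im, a_re * b_im + a_im * b_re
```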

I think this problem is gone in recent jax versions. I will close the issue.

It's fantastic to see jax developers working on this as well! Please allow me to (shamelessly) mention my previous effort here (https://github.com/y1xiaoc/fwdlap), which follows a similar idea of writing a new...
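fwdlap's actual interpreter propagates the Laplacian in a single forward pass; as a rough illustration of the underlying forward-mode idea using only stock `jax.jvp` (this is not fwdlap's implementation, just the naive nested-JVP version it improves on):

```python
import jax
import jax.numpy as jnp

def laplacian(f):
    # Trace of the Hessian of a scalar function f, in pure forward mode:
    # each coordinate basis direction v contributes v^T H v = H_ii.
    def lap(x):
        def quad(v):
            # inner jvp: directional derivative g(y) = v . grad f(y)
            # outer jvp along v again gives v^T H(x) v
            g = lambda y: jax.jvp(f, (y,), (v,))[1]
            return jax.jvp(g, (x,), (v,))[1]
        basis = jnp.eye(x.shape[0], dtype=x.dtype)
        return jnp.sum(jax.vmap(quad)(basis))
    return lap

# sanity check: the Laplacian of |x|^2 in R^3 is 6
f = lambda x: jnp.sum(x ** 2)
print(laplacian(f)(jnp.arange(3.0)))  # 6.0
```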

@mattjj Thanks a lot! Yes, I would definitely be more than happy to collaborate on this! Please let me know what you think would be the best way to do...

@n-gao Thanks for introducing folx! Maintaining sparsity in a custom interpreter is fantastic and indeed very important! I think the paper author's code was also trying to do...

For me the most straightforward thing would be to have a (semi-)public API for symbolic zeros. I think this could benefit a lot of applications that require tweaking around...
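One place symbolic zeros already surface semi-publicly is the `symbolic_zeros=True` option on `jax.custom_jvp.defjvp`, where unperturbed inputs arrive as `SymbolicZero` objects instead of materialized zero arrays. A small sketch (the function `f` here is purely illustrative):

```python
from functools import partial

import jax
import jax.numpy as jnp
from jax.custom_derivatives import SymbolicZero

@jax.custom_jvp
def f(x, y):
    return jnp.sin(x) * y

@partial(f.defjvp, symbolic_zeros=True)
def f_jvp(primals, tangents):
    x, y = primals
    x_dot, y_dot = tangents  # SymbolicZero for unperturbed inputs
    out = jnp.sin(x) * y
    tangent_out = jnp.zeros_like(out)
    # skip the work (and the dense zeros) for unperturbed arguments
    if not isinstance(x_dot, SymbolicZero):
        tangent_out = tangent_out + jnp.cos(x) * y * x_dot
    if not isinstance(y_dot, SymbolicZero):
        tangent_out = tangent_out + jnp.sin(x) * y_dot
    return out, tangent_out

# y is held constant here, so its tangent arrives as a SymbolicZero
print(jax.grad(f, argnums=0)(1.0, 2.0))  # cos(1.0) * 2.0
```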

Thanks for the question. This happens occasionally when the first iteration is trained so "hard" that it overfits the data. I would suggest reducing the number of training epochs for...

To clarify, you may want to reduce the training length for the _init_ train (`iter.init`), which is controlled by parameters in `init_train`, like [here](https://github.com/deepmodeling/deepks-kit/blob/7978b398a5edd688e3fffc3c29f839d3c7b99602/examples/water_cluster/args.yaml#L188) in the example. I think the...
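For concreteness, the knob lives under the `init_train` section of args.yaml; an illustrative excerpt (the key names and value below are assumptions based on the linked example, so check your own args.yaml for the exact schema):

```yaml
init_train:
  train_args:
    n_epoch: 5000  # lower this if the init model overfits the data
```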

You may need to use the args.yaml. You can check the `iter.init` folder to see whether the training has been redone with the reduced epochs.

Good catch! Do you want to write a PR for this?