Nathan Lambert

Results: 148 comments by Nathan Lambert

Can you be more specific? This was not the intent. Are you looking at this [file](https://github.com/huggingface/diffusers/blob/main/examples/train_unconditional.py) or something different?

Yes, that is on the longer-term roadmap. In the medium term we want to make the schedulers interface well with `numpy` (which then lets people use their own non-torch models), but...
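Purely as illustration (none of these names are the real diffusers API; this is a hypothetical sketch of the idea), a scheduler that keeps all of its state and math in plain `numpy` can be driven by any framework's model output once it is converted to an ndarray:

```python
import numpy as np

class NumpyScheduler:
    """Hypothetical framework-agnostic scheduler: all state is numpy."""

    def __init__(self, num_steps=1000, beta_start=1e-4, beta_end=0.02):
        self.betas = np.linspace(beta_start, beta_end, num_steps)
        self.alphas_cumprod = np.cumprod(1.0 - self.betas)

    def step(self, model_output, t, sample):
        # Simplified deterministic (DDIM-style) update: recover x_0 from
        # the predicted noise, then re-noise to the previous timestep.
        a_t = self.alphas_cumprod[t]
        a_prev = self.alphas_cumprod[t - 1] if t > 0 else 1.0
        x0 = (sample - np.sqrt(1.0 - a_t) * model_output) / np.sqrt(a_t)
        return np.sqrt(a_prev) * x0 + np.sqrt(1.0 - a_prev) * model_output
```

Because `step` only touches ndarrays, a torch, JAX, or pure-numpy model can feed it after a single array conversion.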

FYI: JAX precision makes the following test fail by rounding a very small number to zero:

```python
def test_betas(self):
    self.check_over_configs(beta_start=0.01, beta_end=0.2)
```
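For illustration (using numpy stand-ins rather than the actual scheduler internals), the kind of underflow at play looks like this: JAX computes in 32-bit floats by default (unless `jax_enable_x64` is set), so a value that is perfectly representable in float64 can round to exactly zero in float32.

```python
import numpy as np

tiny = 1e-46  # below the smallest float32 subnormal (~1.4e-45)

print(np.float64(tiny))  # still nonzero in 64-bit
print(np.float32(tiny))  # 0.0 -- underflows in 32-bit
```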

I agree, this is a proof of concept. Eventually a lot of work will be needed to make the tests torch-, numpy-, and jax-independent. This can be referenced...

Doing more digging, the model architecture is based on [GIN](https://paperswithcode.com/method/gin) and [SchNet](https://paperswithcode.com/method/schnet) (common graph neural networks). There are local and global parameters of the molecule, and different components use different...
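As a rough sketch of the GIN side (illustrative only; `mlp` here is a placeholder for the learned update network), a single GIN message-passing step sums neighbor features into a weighted self-feature before applying the MLP:

```python
import numpy as np

def gin_layer(h, adj, mlp, eps=0.0):
    """One GIN update: h_v' = MLP((1 + eps) * h_v + sum over neighbors of h_u)."""
    return mlp((1.0 + eps) * h + adj @ h)

# Toy path graph 0-1-2 with one-hot node features and an identity "MLP".
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
h = np.eye(3)
out = gin_layer(h, adj, mlp=lambda x: x)
# Node 1 aggregates itself plus both neighbors: [1., 1., 1.]
```

The global/local split in the comment would correspond to some components reading per-node features like `h` and others reading pooled, molecule-level summaries.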

I'll work through your comments soon @patrickvonplaten. CC the original author @MinkaiXu in case he has any time to look.

@georgosgeorgos and @MinkaiXu, sorry for the delay on merging. We got a little distracted by Stable Diffusion. I'll plan on merging the updates on main to this + the notebook...

Fighting the stale bot! We haven't forgotten about this, and actually moved it up the priority list today. Soon!

Alright, I kind of messed up that rebase, so there are some repeated commits in the log. The changes are all here. This is much closer to release now. Unfortunately,...

@MinkaiXu, @patrickvonplaten moved fast and removed it [here](https://github.com/huggingface/diffusers/commit/b35bac4d3b1af7e2389809f96e8ada11da6cc503). An interesting little difference I was not aware of :)