Patrick von Platen
Ok for me to rename the function arguments. I'd prefer to go over a deprecation cycle though that would still allow `allow_regex` and `ignore_regex` to be passed, but would throw...
From a first look I'm not much in favor of this, but it's very likely that I don't grasp all of the problems and the proposed solution. Some questions...
Okay, this sounds much better then! 1.) Commit messages are very important though IMO, *e.g.* when uploading checkpoints every "n" steps during training with `.push_to_hub(...)`, it's very important to see...
Hey @taki0112, I think `use_conv=False` is correct here since the integration tests for those models all pass: https://huggingface.co/models?arxiv=arxiv:2011.13456 Closing this for now. Please ping me or re-open if you find...
Hey @vvvm23, It's set to False because we don't want to train those parameters. I followed the implementation of the original model here: https://github.com/yang-song/score_sde_pytorch/blob/1618ddea340f3e4a2ed7852a0694a809775cf8d0/models/layerspp.py#L37 Does this make sense?
Hey @vvvm23, sinusoidal position features like `GaussianFourierProjection` don't need training because every embedding already has a distinctly different vector that the model can use as a "cue" to know what time...
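For illustration, here is a minimal sketch of such a fixed Fourier projection, loosely following the score_sde implementation linked above (class and argument names here are illustrative, not the exact `diffusers` API). The key point is `requires_grad=False`: the random frequencies are sampled once and never updated by the optimizer.

```python
import math
import torch
import torch.nn as nn

class GaussianFourierProjection(nn.Module):
    """Fixed Gaussian Fourier features for timestep embeddings (sketch).

    The random frequencies `W` are registered with requires_grad=False,
    so they stay constant during training - each timestep still maps to
    a distinct embedding vector the model can condition on.
    """

    def __init__(self, embedding_size: int = 256, scale: float = 1.0):
        super().__init__()
        self.W = nn.Parameter(
            torch.randn(embedding_size) * scale, requires_grad=False
        )

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # Project scalar timesteps onto the fixed random frequencies,
        # then concatenate sin/cos to get a smooth, unique embedding.
        t_proj = t[:, None] * self.W[None, :] * 2 * math.pi
        return torch.cat([torch.sin(t_proj), torch.cos(t_proj)], dim=-1)

emb = GaussianFourierProjection(embedding_size=8)
out = emb(torch.tensor([0.1, 0.5]))
assert out.shape == (2, 16)  # sin and cos halves concatenated
```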
Hey @hysts, In my experience it's fine to do `generator = torch.manual_seed(0)`. You are correct that this sets the seed globally but the current generator state is kept in `generator`....
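To make the point concrete: `torch.manual_seed(0)` seeds the global RNG but also *returns* the default `torch.Generator`, so the handle can be passed around and re-seeding reproduces the same draws. A small sketch:

```python
import torch

# torch.manual_seed returns the default Generator, so the result can be
# passed anywhere a `generator` kwarg is accepted.
generator = torch.manual_seed(0)
a = torch.randn(3, generator=generator)

# Re-seeding resets the state and reproduces the same values.
generator = torch.manual_seed(0)
b = torch.randn(3, generator=generator)

assert torch.equal(a, b)
```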
Hey @hysts, Thanks a lot for the great write-up! This makes sense to me and I think we should change it - @anton-l could you take a look here?
Additionally, we can also better set the device using: ```py generator = torch.Generator(device=torch_device).manual_seed(0) ```
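A quick usage sketch of the device-scoped variant (assuming a `torch_device` string as in the tests): unlike `torch.manual_seed`, this creates a fresh `Generator` on the target device without touching the global RNG state.

```python
import torch

# Pick whatever device is available; in the test suite this would be
# the existing `torch_device` variable.
torch_device = "cuda" if torch.cuda.is_available() else "cpu"

# A device-scoped generator: seeded locally, global RNG state untouched.
generator = torch.Generator(device=torch_device).manual_seed(0)

x = torch.randn(4, device=torch_device, generator=generator)
assert x.shape == (4,)
```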
Let's maybe fix this everywhere