Nathan Morgan
@davidegraff Can you check if maintainers are allowed to push to your branch? We have merged in the CI changes and there are two small merge conflicts that I have...
> If you set `pad=not pad` for this featurizer ... Could you clarify what you mean here David? The `MultiHotAtomFeaturizer` doesn't take `pad` as an argument.
Yesterday Oscar and I had discussions about this PR. I'll summarize a bit here and @oscarwumit/@kevingreenman can correct me if needed.

### Defaults

We first focused on what the...
Agreed that resolving those CGR tests will take some time. So we can plan to include this in the 2.0 formal release and not the release candidate.
This issue goes beyond the CLI and is an important question wherever we use `lightning.pytorch.Trainer` in Chemprop. I think it is worth summarizing what the problem/question is. The same problem...
We could remove `MolGraphDataLoader`, but I've viewed it more as a helper wrapper around the native torch `DataLoader`. It helps automate the sampler so that users can get deterministic shuffling/sampling...
Correction: if we always pass a sampler, then the data are always shuffled. To give the user the option to not shuffle, we should keep the `sampler = None` logic. But...
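A minimal sketch of the `sampler = None` logic described above, using plain `torch.utils.data` (the wrapper name `make_loader` and the toy dataset are hypothetical, not Chemprop's actual implementation): when no shuffling is requested, the sampler is left as `None` so the `DataLoader` iterates in order; when shuffling is requested, a `RandomSampler` with its own seeded generator gives deterministic shuffling.

```python
import torch
from torch.utils.data import DataLoader, Dataset, RandomSampler


class ToyDataset(Dataset):
    """Hypothetical stand-in for a MolGraph-style dataset."""

    def __init__(self, n):
        self.data = list(range(n))

    def __len__(self):
        return len(self.data)

    def __getitem__(self, i):
        return self.data[i]


def make_loader(dataset, shuffle=False, seed=None):
    # Keep sampler=None when the user does not want shuffling;
    # always passing a sampler would force a shuffled order.
    if shuffle:
        generator = torch.Generator()
        if seed is not None:
            generator.manual_seed(seed)
        sampler = RandomSampler(dataset, generator=generator)
    else:
        sampler = None
    # shuffle must stay False when an explicit sampler is supplied.
    return DataLoader(dataset, sampler=sampler, shuffle=False)
```

With `shuffle=False` the loader yields items in dataset order; with the same `seed`, two loaders produce the same shuffled order.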
#755 is also about making Chemprop deterministic. I found that we don't have to set the seed for `MolGraphDataLoader` because it pulls randomness from the PyTorch RNG backend. If we...
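To illustrate the point about the RNG backend: when `shuffle=True` and no generator is supplied, the default `RandomSampler` draws from PyTorch's global RNG, so seeding that RNG once is enough to reproduce the shuffle order. A small sketch (the helper name `shuffled_order` is hypothetical):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def shuffled_order(seed):
    # Seed the global PyTorch RNG; the DataLoader's default
    # RandomSampler draws from this backend when shuffle=True
    # and no explicit generator is passed.
    torch.manual_seed(seed)
    ds = TensorDataset(torch.arange(8))
    loader = DataLoader(ds, shuffle=True)
    return [x.item() for (x,) in loader]
```

Calling `shuffled_order` twice with the same seed yields the same order without the loader ever being seeded directly.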
Using `with torch.inference_mode()` was a patch until Lightning could fix some dependency issues. These have been fixed now, and `inference_mode=True` is the default. I'll put in a PR...
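For context, a sketch of what the explicit patch did in plain PyTorch (the `predict` helper below is hypothetical): with `Trainer(inference_mode=True)` now the default, Lightning runs prediction under `torch.inference_mode()` itself, so wrapping forward passes by hand is no longer needed.

```python
import torch


def predict(model, x):
    # What the manual patch amounted to: run the forward pass under
    # torch.inference_mode(), which disables autograd more aggressively
    # than torch.no_grad() (outputs are inference tensors).
    with torch.inference_mode():
        return model(x)


model = torch.nn.Linear(4, 2)
out = predict(model, torch.randn(1, 4))
```

The output tensor carries no gradient state, which is exactly what Lightning's default now provides without the explicit context manager.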