torchmd-net
Print warnings when incompatible parameters are used
The OP in https://github.com/openmm/openmm-torch/issues/133 was led to believe the Graph Network is equivariant because TorchMD-Net exposes the "equivariant_group" parameter (which is only used by TensorNet). This happens a lot in the current parameter handling: "irrelevant" parameters are silently ignored, sometimes in subtle ways.
For instance, if a user sets y_weight=1 but the dataset does not provide energies, the code just ignores y_weight.
I am opening this issue to discuss some mechanism for these things to result in, at least, a warning.
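To make the idea concrete, here is a minimal sketch of such a mechanism. The `ACCEPTED_PARAMS` table and the model/parameter names are purely illustrative, not TorchMD-Net's actual API:

```python
import warnings

# Hypothetical table mapping each architecture to the parameters it reads.
ACCEPTED_PARAMS = {
    "graph-network": {"hidden_channels", "num_layers", "cutoff"},
    "tensornet": {"hidden_channels", "num_layers", "equivariance_group"},
}


def warn_on_unused(model: str, user_args: dict) -> None:
    """Warn for every user-supplied parameter the chosen model ignores."""
    unused = set(user_args) - ACCEPTED_PARAMS[model]
    for name in sorted(unused):
        warnings.warn(
            f"Parameter '{name}' is ignored by model '{model}'",
            UserWarning,
        )


# Warns that 'equivariance_group' is ignored by the graph network.
warn_on_unused("graph-network", {"hidden_channels": 128, "equivariance_group": "O(3)"})
```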
Arguably, a better way to handle these architecture-specific parameters would be to treat them like the `dataset_args` parameter, so that one would have to write:

```yaml
model: "graph-network"
model_args:
  hidden_channels: 128
  ...
```
So that it is clear that:
- The params are for the model.
- Some parameters work for some models but not for others.
However, this change would break old parameter files. Perhaps there could be a notion of versions in the parameter files?
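As a rough sketch of how a version notion could work, assuming a hypothetical `version` key in the parameter file (here `conf` stands in for the result of `yaml.safe_load` on the proposed file):

```python
# Illustrative only: the "version" key and the cutoff value 2 are assumptions.
conf = {
    "version": 2,
    "model": "graph-network",
    "model_args": {"hidden_channels": 128},
}

if conf.get("version", 1) >= 2:
    # New-style files nest model parameters under model_args.
    model_args = conf["model_args"]
else:
    # Version-1 files keep model parameters at the top level.
    model_args = {k: v for k, v in conf.items() if k not in ("model", "version")}
```

A file without a `version` key would default to the old flat layout, so existing parameter files would keep working.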
I think the problem is more about when the input.yaml is written. Perhaps it's enough to modify this function to exclude all the hparams that are not used by the model, or in general to avoid saving the entire `argparse.Namespace`.
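For instance, one could filter the namespace against the model constructor's signature before saving. A minimal sketch, where `GraphNetwork` is a stand-in class rather than the real TorchMD-Net model:

```python
import argparse
import inspect


class GraphNetwork:
    """Stand-in for a real model class; only the signature matters here."""

    def __init__(self, hidden_channels=128, num_layers=6):
        self.hidden_channels = hidden_channels
        self.num_layers = num_layers


def relevant_hparams(model_cls, namespace: argparse.Namespace) -> dict:
    """Keep only the hparams the model's constructor actually accepts."""
    accepted = set(inspect.signature(model_cls.__init__).parameters) - {"self"}
    return {k: v for k, v in vars(namespace).items() if k in accepted}


args = argparse.Namespace(hidden_channels=64, num_layers=4, equivariance_group="O(3)")
# 'equivariance_group' is dropped before writing input.yaml.
saved = relevant_hparams(GraphNetwork, args)
```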
How about this syntax:

```yaml
model: graph-network
hidden_channels: 128
```
It's even simpler, and it matches the syntax for priors. We can maintain backward compatibility by continuing to support the old syntax for a while. If it doesn't find a parameter it's looking for under the model, it will look for it at the top level but print a warning telling the user that syntax is deprecated.
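A sketch of that fallback logic, assuming the new syntax nests parameters under the model entry (the exact nesting and the `get_param` helper are assumptions for illustration):

```python
import warnings


def get_param(conf: dict, name: str, default=None):
    """Look under the model entry first; fall back to the deprecated top level."""
    model = conf.get("model")
    if isinstance(model, dict):
        # Assumed new syntax: model: {graph-network: {hidden_channels: 128}}
        (params,) = model.values()
        if name in params:
            return params[name]
    if name in conf:
        warnings.warn(
            f"Top-level '{name}' is deprecated; nest it under 'model'",
            DeprecationWarning,
        )
        return conf[name]
    return default


new_style = {"model": {"graph-network": {"hidden_channels": 128}}}
old_style = {"model": "graph-network", "hidden_channels": 128}
get_param(new_style, "hidden_channels")  # returns 128, no warning
get_param(old_style, "hidden_channels")  # returns 128 and warns about deprecation
```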
The reason we didn't use this structured input in the past was to be able to set parameters from the command line, but I am not sure that we still use the command line at all.
Peter's suggestion is the way to go IMO. In principle this functionality would be compatible with the changes introduced by #239 (if/when), since implementing this one means writing saner parameter handling. Thanks guys.