
Print warnings when incompatible parameters are used

Open RaulPPelaez opened this issue 1 year ago • 4 comments

The OP in https://github.com/openmm/openmm-torch/issues/133 was led to believe the Graph Network is equivariant because TorchMD-Net exposes the "equivariant_group" parameter (which is only used by TensorNet). This happens often with the current parameter handling: "irrelevant" parameters are silently ignored, sometimes in subtle ways.

For instance, if a user sets y_weight=1 but the dataset does not provide energies, the code just ignores y_weight.

I am opening this issue to discuss some mechanism for these things to result in, at least, a warning.
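A minimal sketch of what such a check could look like, assuming a hypothetical `RELEVANT_ARGS` mapping from model name to the parameters that model actually consumes (the names below are illustrative, not TorchMD-Net's real API):

```python
import warnings

# Hypothetical mapping from model name to the parameters it actually uses.
RELEVANT_ARGS = {
    "tensornet": {"hidden_channels", "equivariant_group"},
    "graph-network": {"hidden_channels", "num_layers"},
}

def warn_on_ignored(model_name, user_args):
    """Warn about every user-supplied parameter the chosen model ignores."""
    relevant = RELEVANT_ARGS.get(model_name, set())
    ignored = set(user_args) - relevant
    for name in sorted(ignored):
        warnings.warn(
            f"Parameter '{name}' is ignored by model '{model_name}'",
            UserWarning,
        )
    return ignored
```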

Arguably, the better way to handle these architecture-specific parameters would be to treat them similarly to the dataset_args parameter, so that one would have to write:

model: "graph-network"
model_args:
    hidden_channels: 128
    ...

So that it is clear that:

  1. The params are for the model.
  2. Some parameters work for some models but not for others.

However, this change would break old parameter files. Perhaps there could be a notion of versions in the parameter files?
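One possible shape for a versioned parameter file (purely illustrative; no `version` key exists in TorchMD-Net today):

```yaml
version: 2              # hypothetical schema version of the parameter file
model: graph-network
model_args:
    hidden_channels: 128
```

A loader could then dispatch on `version`, accepting the old flat layout for files that omit the key.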

RaulPPelaez avatar Feb 05 '24 17:02 RaulPPelaez

I think the problem is more about when the input.yaml is written. Perhaps it's enough to modify this function to exclude all the hparams that are not used by the model, or in general to avoid saving the entire argparse.Namespace().
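A minimal sketch of that filtering idea, assuming the model class is at hand when the config is written (`GraphNetwork` below is a stand-in, not the actual TorchMD-Net class):

```python
import inspect

class GraphNetwork:  # stand-in for the real model class
    def __init__(self, hidden_channels=128, num_layers=6):
        self.hidden_channels = hidden_channels
        self.num_layers = num_layers

def filter_hparams(model_cls, hparams):
    """Keep only the hparams that the model's __init__ actually accepts."""
    accepted = set(inspect.signature(model_cls.__init__).parameters) - {"self"}
    return {k: v for k, v in hparams.items() if k in accepted}
```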

AntonioMirarchi avatar Feb 05 '24 18:02 AntonioMirarchi

How about this syntax:

model: graph-network
  hidden_channels: 128

It's even simpler, and it matches the syntax for priors. We can maintain backward compatibility by continuing to support the old syntax for a while. If it doesn't find a parameter it's looking for under the model, it will look for it at the top level but print a warning telling the user that syntax is deprecated.
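The fallback lookup described above could be sketched as follows, assuming the parsed config is a plain dict (function and key names are illustrative):

```python
import warnings

def get_param(config, model_key, name, default=None):
    """Look up a parameter under the model section first; fall back to the
    deprecated top-level location with a warning. Illustrative sketch only."""
    model_section = config.get(model_key) or {}
    if isinstance(model_section, dict) and name in model_section:
        return model_section[name]
    if name in config:
        warnings.warn(
            f"Top-level parameter '{name}' is deprecated; "
            f"put it under '{model_key}' instead.",
            DeprecationWarning,
        )
        return config[name]
    return default
```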

peastman avatar Feb 05 '24 23:02 peastman

The reason we didn't use this structured input in the past was to be able to set parameters from the command line, but I am not sure we still use the command line at all.

giadefa avatar Feb 06 '24 07:02 giadefa

Peter's is the way to go IMO. In principle this functionality would be compatible with the changes introduced by #239 (if/when they land), since implementing this one means writing saner parameter handling. Thanks guys.

RaulPPelaez avatar Feb 07 '24 10:02 RaulPPelaez