[dev] Config Registration Functions in __init__.py
Implements config registration for optim via `__init__.py`.
If this looks good, I'll broadcast the same strategy across the rest of the configs we currently have.
Partially Addressing #53
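For reference, a `register()` hook in `hydra_configs/torch/optim/__init__.py` could look roughly like the sketch below. This is only an illustration: the `optimizer` group name and the `AdamConf`/`SGDConf` imports are assumptions, not necessarily the exact contents of this PR.

```python
# hydra_configs/torch/optim/__init__.py -- illustrative sketch, not the PR's exact code.
from hydra.core.config_store import ConfigStore

# Assumed auto-generated config dataclasses; module/class names are hypothetical.
from .adam import AdamConf
from .sgd import SGDConf


def register() -> None:
    """Store the optimizer configs in Hydra's ConfigStore so they can be composed."""
    cs = ConfigStore.instance()
    cs.store(group="optimizer", name="adam", node=AdamConf)
    cs.store(group="optimizer", name="sgd", node=SGDConf)
```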
Before merging, I just want to point out that because of the way PyTorch structures its files (and `__init__.py` for certain modules), the optimizer configs can be registered like this:

```python
hydra_configs.torch.optim.register()
```

however, the loss and data-related configs must be registered like this:

```python
hydra_configs.torch.nn.modules.register()
hydra_configs.torch.utils.data.register()
```
For now it's not possible to do:

```python
hydra_configs.torch.nn.modules.loss.register()
```

because `loss.py` is an auto-generated file and we aren't putting logic in auto-generated files.
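So from a user's perspective, registration today would look something like the following. The Hydra entry-point details (`config_path`, `config_name`) are placeholders for illustration:

```python
# Illustrative usage in a training script; config_path/config_name are placeholders.
import hydra
from omegaconf import DictConfig, OmegaConf

import hydra_configs.torch.optim
import hydra_configs.torch.nn.modules
import hydra_configs.torch.utils.data

# Register the generated configs with Hydra's ConfigStore before composition.
hydra_configs.torch.optim.register()          # optimizers
hydra_configs.torch.nn.modules.register()     # losses (no loss.register() yet)
hydra_configs.torch.utils.data.register()     # data-related configs


@hydra.main(config_path="conf", config_name="config")
def main(cfg: DictConfig) -> None:
    print(OmegaConf.to_yaml(cfg))


if __name__ == "__main__":
    main()
```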
Semantically, registering optimizers with `optim.register()` is nicer than the loss case, which gets registered via `modules.register()`. I'm open to ideas for ameliorating this semantic inconsistency.