
Feature request: Log optimizer and LR Scheduler hyperparams when set by LightningCLI

Open lodo1995 opened this issue 3 years ago • 5 comments

Discussed in https://github.com/Lightning-AI/lightning/discussions/11258

Originally posted by slinnarsson, December 25, 2021:

I'm using LightningCLI and setting the optimizer and learning rate with args like --optimizer AdamW --optimizer.lr 0.01. When running multiple runs with varying optimizers and learning rates, I would like these hyperparameters to show up in TensorBoard, but I don't know how to make that happen. Using save_hyperparameters() in the __init__() of my module doesn't work, because the optimizer and learning rate are not parameters of __init__(). As I understand it, the optimizer is patched into the module class by LightningCLI. Can I make LightningCLI log these as hyperparameters?

Btw, the settings are saved correctly in config.yaml; they just don't show up in hparams.yaml.

Update: I found a workaround; I added optimizer and lr as arguments to __init__(), even though I don't use them there. They get saved by save_hyperparameters() and show up in the TensorBoard logs.
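
A minimal sketch of that workaround (the class and argument names below are only placeholders):

import torch
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # optimizer and lr are accepted here only so that save_hyperparameters()
    # records them; LightningCLI still builds the actual optimizer itself.
    def __init__(self, hidden_dim: int = 64, optimizer: str = "AdamW", lr: float = 0.01):
        super().__init__()
        self.save_hyperparameters()  # captures hidden_dim, optimizer and lr
        self.layer = torch.nn.Linear(32, hidden_dim)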


It would be nice if LightningCLI could automate this, without the need for "hacks" in the model class. In my use case, I specify the optimizer class_path and its init_args directly in a YAML configuration file that I pass to the CLI.

cc @borda @carmocca @mauvilsa @akihironitta

lodo1995 avatar Jul 08 '22 10:07 lodo1995

Hi!

The save_hyperparameters functionality is completely unrelated to the LightningCLI; in fact, the CLI config already includes this information, which makes save_hyperparameters redundant.

There are no plans to unify them at the moment. The suggested workaround looks valid to me if that behavior is desired.

However, if we want these arguments to appear in the loggers' views, it should be done by having the loggers read the config files instead.

carmocca avatar Jul 11 '22 21:07 carmocca

@carmocca this is unfortunate. Currently, using save_hyperparameters from within the LightningDataModule or LightningModule doesn't preserve any of the structure from the config file, which neatly separates the parameters for each relevant part of pytorch-lightning. What alternative is there, then, for logging parameters to loggers such as WandB? The combined raw variable names, stripped of context, mean that variables aren't directly identifiable with respect to the part they apply to. Worse, they might collide when the LightningDataModule and LightningModule happen to have similarly named parameters.

Perhaps related: is there a way to prepend something to the variable name when calling save_hyperparameters()? Something like a context variable that would yield context/my_variable and allow WandB to neatly structure all parameters within the same context? There is a frame argument for save_hyperparameters(), but there is no explanation of what it is or how to use it.

EDIT: @lodo1995 If you set learning_rate as an internal parameter of your LightningModule (think self.learning_rate, which can be set at LightningModule instantiation), it gets recorded in the config.yaml file when using LightningCLI. It's an alternative to your workaround. Such parameters are also logged when using save_hyperparameters(), which you already do.

kotchin avatar Aug 09 '22 15:08 kotchin

cc @awaelchli for the specific save_hyperparameters questions

carmocca avatar Aug 09 '22 15:08 carmocca

A big limitation of save_hyperparameters is that it only has what goes in __init__. However, it is very convenient to use dependency injection (which LightningCLI supports), in which case __init__ receives instances of objects and it is not known which parameters were used to instantiate them. With LightningCLI it might make sense to extend SaveConfigCallback to do the logging of the hyperparameters.
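
For illustration, a dependency-injected module might look like the sketch below (the names are hypothetical); save_hyperparameters() only sees the backbone instance, not the arguments it was constructed with:

import torch
import pytorch_lightning as pl

class InjectedModel(pl.LightningModule):
    # The backbone arrives as an already-built instance (LightningCLI can
    # construct it from a class_path/init_args entry in the config), so
    # save_hyperparameters() cannot record how it was instantiated.
    def __init__(self, backbone: torch.nn.Module, lr: float = 0.001):
        super().__init__()
        self.save_hyperparameters(ignore=["backbone"])  # only lr is saved
        self.backbone = backbone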

mauvilsa avatar Aug 09 '22 16:08 mauvilsa

With LightningCLI it might make sense to extend SaveConfigCallback to do the logging of the hyperparameters.

Yes, we could add it to the end of its setup hook:

for logger in trainer.loggers:
    logger.log_hyperparams(self.config)
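
For illustration, a user-side subclass along those lines might look like the following sketch (not an official API; the exact import path, setup signature, and config conversion may differ between versions):

from pytorch_lightning.cli import SaveConfigCallback

class LoggingSaveConfigCallback(SaveConfigCallback):
    def setup(self, trainer, pl_module, stage=None):
        super().setup(trainer, pl_module, stage)  # still writes config.yaml
        # self.config is a jsonargparse Namespace; assuming as_dict() is
        # available to turn it into something the loggers accept.
        hparams = self.config.as_dict()
        for logger in trainer.loggers:
            logger.log_hyperparams(hparams)

Such a subclass could then be passed to the CLI through the save_config_callback argument of LightningCLI.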

But I want to reiterate that this would be completely unrelated to the save_hyperparameters functionality. And I'm not sure what the loggers would do if hyperparameters get logged twice, when both the LightningCLI and save_hyperparameters are used with this addition.

carmocca avatar Aug 09 '22 16:08 carmocca

A big limitation of save_hyperparameters is that it only has what goes in __init__

I wouldn't call this a limitation. You wouldn't say that a big limitation of a car is that it only has four wheels. save_hyperparameters is designed to save exactly the parameters that get passed to __init__, no more and no less. This way, given the class and the collection of these parameters, you can always reconstruct the object, which is what LightningModule.load_from_checkpoint does.
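
For instance (a toy sketch; the names are placeholders), the saved hyperparameters are exactly what is needed to rebuild the module later:

import torch
import pytorch_lightning as pl

class ToyModel(pl.LightningModule):
    def __init__(self, hidden_dim: int = 64, lr: float = 0.001):
        super().__init__()
        self.save_hyperparameters()  # stores hidden_dim and lr in self.hparams
        self.layer = torch.nn.Linear(32, self.hparams.hidden_dim)

# Later, the class plus the hparams stored in the checkpoint are enough to
# reconstruct the object without repeating the constructor arguments:
# model = ToyModel.load_from_checkpoint("path/to/checkpoint.ckpt")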

is there a way to prepend something to the variable name when calling save_hyperparameters()?

No, this would defeat the purpose of save_hyperparameters. This function does not do the logging of parameters. Its only responsibility is to capture the arguments passed to __init__ and save them into a dict (self.hparams). The logging happens in the trainer, which takes the saved hparams and sends them to the logger(s).

There is something about frame for save_hyperparameters() but there is no explanation of what it is or how to use it.

Yes, this needs to be removed from the user-exposed API, but we can leave that for a separate discussion :)

Worse, they might collide when the LightningDataModule and LightningModule happen to have similarly named parameters.

Yes. We have a warning about that when it happens. One could structure the params by object and then let the logger add a prefix based on the object. But that wouldn't be done by the save_hyperparameters() method. Feel free to open an issue specifically about this so it can be brainstormed.
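
A hypothetical manual way to get that structure today (not a built-in feature, and assuming both the model and the datamodule called save_hyperparameters() and that model, datamodule, and trainer are in scope) is to prefix the keys yourself before handing them to the loggers:

# Prefix model and data hyperparameters so that identically named
# arguments cannot collide in the logger's view.
hparams = {f"model/{k}": v for k, v in model.hparams.items()}
hparams.update({f"data/{k}": v for k, v in datamodule.hparams.items()})
for logger in trainer.loggers:
    logger.log_hyperparams(hparams)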

awaelchli avatar Aug 11 '22 10:08 awaelchli

This is now supported via https://github.com/Lightning-AI/lightning/pull/17475

carmocca avatar May 04 '23 17:05 carmocca