
Add new optimizers

Erlemar opened this issue 4 years ago · 0 comments

It is always good to have more options to choose from, so it would be a good idea to add more optimizers. The steps are the following:

  • in conf/optimizer, add a config for the new optimizer
  • if the optimizer requires another library, update the requirements
  • run the tests with pytest to check that everything works

Example: https://github.com/Erlemar/pytorch_tempest/blob/master/conf/optimizer/adamw.yaml

# @package _group_
class_name: torch.optim.AdamW
params:
  lr: ${training.lr}
  weight_decay: 0.001
  • # @package _group_ - a default line required by Hydra for configs in this group
  • class_name - the full name/path of the class to instantiate
  • params - the parameters that are overridden; if the optimizer has more parameters than those defined in the config, their default values are used (see the sketch below)
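
To make the last point concrete, below is a minimal sketch of how such a config can be turned into an optimizer instance. It assumes a small load_obj-style helper that resolves a dotted path to a class; the helper actually used in the repository may be named or located differently.

import importlib

import torch


def load_obj(obj_path: str):
    # Resolve a dotted path such as "torch.optim.AdamW" to the class itself.
    module_path, _, obj_name = obj_path.rpartition(".")
    return getattr(importlib.import_module(module_path), obj_name)


# Values that would normally come from conf/optimizer/adamw.yaml via Hydra.
class_name = "torch.optim.AdamW"
params = {"lr": 1e-3, "weight_decay": 0.001}

model = torch.nn.Linear(10, 2)
optimizer = load_obj(class_name)(model.parameters(), **params)
# Parameters not listed in params (betas, eps, ...) keep their library defaults.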

There are three possible cases when adding an optimizer:

  • a default PyTorch optimizer: simply add a config for it.
  • an optimizer from another library: add the library to the requirements and define a config whose class_name is based on that library, for example adamp.AdamP (see the example config after this list).
  • an optimizer from a custom class: add the class to src/optimizers and add a config with the full path to the class (see the sketch after this list).
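
For the second case, the config looks just like the AdamW example above, only with class_name pointing into the external library. A hypothetical conf/optimizer/adamp.yaml (only lr and weight_decay are set; the other AdamP arguments keep their defaults):

# @package _group_
class_name: adamp.AdamP
params:
  lr: ${training.lr}
  weight_decay: 0.001

The adamp package would also be added to the requirements.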
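
For the third case, here is a minimal sketch of a custom optimizer placed in src/optimizers; the file and class names (my_sgd.py, MySGD) are hypothetical, and the class simply reimplements plain SGD to show the required structure:

# src/optimizers/my_sgd.py (hypothetical file and class names)
import torch
from torch.optim import Optimizer


class MySGD(Optimizer):
    # Plain SGD, reimplemented only to illustrate the structure of a custom optimizer.
    def __init__(self, params, lr=1e-3):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    p.add_(p.grad, alpha=-group["lr"])
        return loss

The corresponding config would then use the full path to the class, e.g. class_name: src.optimizers.my_sgd.MySGD.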

Erlemar · Oct 05 '20 11:10