torchopt
[Question] About `torch.optim.lr_scheduler` support
Questions
It seems that optimizers like `torchopt.adam` do not support `torch.optim.lr_scheduler`.
Checklist
- [x] I have checked that there is no similar issue in the repo (required)
- [x] I have read the documentation (required)
@hccz95 Thanks for reporting this!
> It seems that optimizers like `torchopt.adam` do not support `torch.optim.lr_scheduler`.
Also, in TorchOpt, we use transformation-based mapping for all functionality. That means everything should be a stateless function, which isn't true for `torch.optim.lr_scheduler`. We provide our own implementation of an LR scheduler, which can be used as follows:
```python
import torchopt

# Functional API
functional_adam = torchopt.adam(
    lr=torchopt.schedule.linear_schedule(
        init_value=1e-3, end_value=1e-4, transition_steps=10000, transition_begin=2000
    )
)

# PyTorch-like object-oriented API (module parameters omitted for brevity)
adam = torchopt.Adam(
    lr=torchopt.schedule.linear_schedule(
        init_value=1e-3, end_value=1e-4, transition_steps=10000, transition_begin=2000
    )
)

# Differentiable meta-optimizer API (module argument omitted for brevity)
meta_adam = torchopt.MetaAdam(
    lr=torchopt.schedule.linear_schedule(
        init_value=1e-3, end_value=1e-4, transition_steps=10000, transition_begin=2000
    )
)
```
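For context, here is a minimal sketch of how a scheduled learning rate can drive the functional API in a training loop. It assumes the `init`/`update` methods of the gradient transformation returned by `torchopt.adam` and `torchopt.apply_updates`, as described in the TorchOpt documentation; the linear model and squared loss are placeholders.

```python
# Minimal sketch: functional optimizer with a linear LR schedule.
# Assumes the init/update API and torchopt.apply_updates; model/loss are placeholders.
import torch
import torchopt

optimizer = torchopt.adam(
    lr=torchopt.schedule.linear_schedule(
        init_value=1e-3, end_value=1e-4, transition_steps=10000, transition_begin=2000
    )
)

model = torch.nn.Linear(4, 1)
params = tuple(model.parameters())
opt_state = optimizer.init(params)  # optimizer state carries the step count used by the schedule

for _ in range(3):
    x = torch.randn(8, 4)
    loss = model(x).pow(2).mean()
    grads = torch.autograd.grad(loss, params)
    updates, opt_state = optimizer.update(grads, opt_state)  # lr follows the schedule each step
    torchopt.apply_updates(params, updates)  # applies the updates to the parameters in place
```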
BTW, there is a bug in the LR scheduler and we will fix it in PR #76. ~~New PyPI wheels will be published soon.~~ New PyPI wheels have been released as 0.5.0.post2.