TypeError: LARC is not an Optimizer
import torch
from apex.parallel.LARC import LARC
from pytorch_transformers import WarmupLinearSchedule

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
optimizer = LARC(optimizer=optimizer, trust_coefficient=0.001, clip=False)
# Passing the LARC wrapper to the scheduler raises the TypeError below
scheduler = WarmupLinearSchedule(
    optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
Traceback (most recent call last):
File "train.py", line 774, in <module>
main()
File "train.py", line 482, in main
t_total=num_train_optimization_steps,
File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/pytorch_transformers/optimization.py", line 56, in __init__
super(WarmupLinearSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)
File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 189, in __init__
super(LambdaLR, self).__init__(optimizer, last_epoch)
File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 31, in __init__
type(optimizer).__name__))
TypeError: LARC is not an Optimizer
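As far as I can tell, the check that raises this is the isinstance test in torch/optim/lr_scheduler.py visible in the traceback: the scheduler base class only accepts torch.optim.Optimizer instances, and LARC is a thin wrapper class that does not subclass it. A quick way to see the type mismatch (a minimal sketch, assuming apex is installed):

    import torch
    from apex.parallel.LARC import LARC

    sgd = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=0.1)
    wrapped = LARC(optimizer=sgd, trust_coefficient=0.001, clip=False)

    print(isinstance(sgd, torch.optim.Optimizer))      # True
    print(isinstance(wrapped, torch.optim.Optimizer))  # False, so the scheduler rejects it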
Hi, did you manage to solve this issue?
Hi, did you solve it?
I suppose passing the wrapped optimizer (the plain SGD instance) to the LR scheduler instead of the LARC wrapper works fine:
_optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
# LARC wraps the SGD instance; the scheduler is bound to the inner optimizer
optimizer = LARC(optimizer=_optimizer, trust_coefficient=0.001, clip=False)
scheduler = WarmupLinearSchedule(
    _optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
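In the training loop you then still call step() and zero_grad() on the LARC wrapper, while scheduler.step() adjusts the learning rate of the param groups the two objects share. A minimal sketch of how I'd wire it up (train_loader and the loss computation are placeholders, not from the original code):

    for batch in train_loader:        # placeholder dataloader
        loss = model(batch)           # assumes the model returns a scalar loss
        optimizer.zero_grad()         # LARC forwards this to the wrapped SGD
        loss.backward()
        optimizer.step()              # LARC applies the layer-wise trust ratio, then SGD steps
        scheduler.step()              # warmup/decay updates _optimizer's param groups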