
TypeError: LARC is not an Optimizer

ZhiyuanChen opened this issue 4 years ago

optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
optimizer = LARC(optimizer=optimizer, trust_coefficient=0.001, clip=False)
scheduler = WarmupLinearSchedule(
    optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
Traceback (most recent call last):
  File "train.py", line 774, in <module>
    main()
  File "train.py", line 482, in main
    t_total=num_train_optimization_steps,
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/pytorch_transformers/optimization.py", line 56, in __init__
    super(WarmupLinearSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 189, in __init__
    super(LambdaLR, self).__init__(optimizer, last_epoch)
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 31, in __init__
    type(optimizer).__name__))
TypeError: LARC is not an Optimizer
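
For reference, the scheduler's base class in torch.optim.lr_scheduler checks that the object it receives is an instance of torch.optim.Optimizer, and apex's LARC is a plain wrapper class rather than an Optimizer subclass, so that check fails. A minimal sketch of the mismatch (the toy parameter below is only for illustration, not from the training script):

import torch
from apex.parallel.LARC import LARC

params = [torch.nn.Parameter(torch.zeros(1))]
sgd = torch.optim.SGD(params, lr=0.1)
larc = LARC(optimizer=sgd, trust_coefficient=0.001, clip=False)

print(isinstance(sgd, torch.optim.Optimizer))   # True
print(isinstance(larc, torch.optim.Optimizer))  # False, so the scheduler raises TypeError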

ZhiyuanChen (Oct 08 '20, 15:10)

Hi, did you manage to solve this issue?

FraCorti (Mar 10 '22, 15:03)

Hi, did you solve it?

saniazahan (May 18 '22, 13:05)

I suppose passing the wrapped (inner) optimizer to the LR scheduler, rather than the LARC wrapper itself, works fine:

_optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
# Wrap the SGD optimizer with LARC for training...
optimizer = LARC(optimizer=_optimizer, trust_coefficient=0.001, clip=False)
# ...but hand the scheduler the underlying SGD instance, which passes the
# torch.optim.Optimizer isinstance check.
scheduler = WarmupLinearSchedule(
    _optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
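
This works because the scheduler and LARC end up sharing the same param_groups: the scheduler rewrites the learning rates stored on _optimizer, and (as far as I can tell from apex's implementation) the LARC wrapper delegates to that same wrapped optimizer when stepping. A rough sketch of the resulting loop, where train_dataloader and compute_loss are hypothetical stand-ins for the actual training code:

for batch in train_dataloader:
    optimizer.zero_grad()               # forwarded by LARC to the wrapped SGD
    loss = compute_loss(model, batch)   # placeholder for the real forward pass
    loss.backward()
    optimizer.step()                    # LARC applies trust-ratio scaling, then SGD updates
    scheduler.step()                    # warmup schedule adjusts _optimizer's lr in place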

nzw0301 (Aug 30 '22, 06:08)