TypeError: LARC is not an Optimizer #978

Open

ZhiyuanChen opened this issue Oct 8, 2020 · 3 comments

@ZhiyuanChen (Contributor)

# WarmupLinearSchedule comes from pytorch_transformers (see the traceback below);
# LARC is the layer-wise adaptive rate wrapper (e.g. NVIDIA apex's LARC).
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
optimizer = LARC(optimizer=optimizer, trust_coefficient=0.001, clip=False)
scheduler = WarmupLinearSchedule(  # raises: TypeError: LARC is not an Optimizer
    optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
Traceback (most recent call last):
  File "train.py", line 774, in <module>
    main()
  File "train.py", line 482, in main
    t_total=num_train_optimization_steps,
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/pytorch_transformers/optimization.py", line 56, in __init__
    super(WarmupLinearSchedule, self).__init__(optimizer, self.lr_lambda, last_epoch=last_epoch)
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 189, in __init__
    super(LambdaLR, self).__init__(optimizer, last_epoch)
  File "/mnt/lustre/chenzhiyuan/anaconda3/envs/pt1.5/lib/python3.7/site-packages/torch/optim/lr_scheduler.py", line 31, in __init__
    type(optimizer).__name__))
TypeError: LARC is not an Optimizer
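
For reference (my own note, not from the thread): the error comes from the isinstance check in torch/optim/lr_scheduler.py shown in the traceback. LR schedulers only accept torch.optim.Optimizer subclasses, and a LARC wrapper (e.g. the one in NVIDIA apex) is a plain class that holds an optimizer rather than inheriting from Optimizer. A minimal sketch of the failing check, with a stand-in wrapper class in place of the real LARC:

import torch
from torch.optim import Optimizer

# Stand-in for an apex-style LARC wrapper: it holds an optimizer but does not
# subclass torch.optim.Optimizer, which is what the scheduler constructor requires.
class LARC:
    def __init__(self, optimizer, trust_coefficient=0.001, clip=False):
        self.optim = optimizer
        self.trust_coefficient = trust_coefficient
        self.clip = clip

params = [torch.nn.Parameter(torch.zeros(1))]
base_optimizer = torch.optim.SGD(params, lr=0.1)
wrapped = LARC(optimizer=base_optimizer)

print(isinstance(base_optimizer, Optimizer))  # True  -> accepted by LR schedulers
print(isinstance(wrapped, Optimizer))         # False -> rejected

# Roughly the check performed in torch/optim/lr_scheduler.py (see traceback above):
if not isinstance(wrapped, Optimizer):
    raise TypeError('{} is not an Optimizer'.format(type(wrapped).__name__))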
@FraCorti

Hi, did you manage to solve this issue?

@saniazahan

Hi, did you solve it?

@nzw0301 commented Aug 30, 2022

I suppose passing the wrapped SGD optimizer, rather than the LARC wrapper itself, to the LR scheduler works fine:

_optimizer = torch.optim.SGD(
    model.parameters(),
    lr=args.learning_rate,
    momentum=0.9,
    weight_decay=1e-6,
)
# Step through the LARC wrapper during training...
optimizer = LARC(optimizer=_optimizer, trust_coefficient=0.001, clip=False)
# ...but hand the underlying SGD instance to the scheduler, which only accepts
# torch.optim.Optimizer subclasses.
scheduler = WarmupLinearSchedule(
    _optimizer,
    warmup_steps=args.warmup_proportion * num_train_optimization_steps,
    t_total=num_train_optimization_steps,
)
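
A quick note on why this split works (an assumption on my part, based on apex-style LARC): the wrapper delegates zero_grad()/step() to the inner SGD and reads the current lr from its param groups, so scheduling _optimizer directly still affects the LARC updates. A minimal training-loop sketch under that assumption, with model, loader, and criterion as placeholders:

for inputs, targets in loader:           # placeholder data loader
    optimizer.zero_grad()                # LARC wrapper delegates to _optimizer
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()                     # LARC applies its trust ratio, then steps SGD
    scheduler.step()                     # updates _optimizer's lr, which LARC reads next step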
