This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

What is the difference between "LearningRateDecayOptimizerConstructor" and the default one? #116

Open
Shiweiliuiiiiiii opened this issue Jul 16, 2022 · 1 comment

Comments

@Shiweiliuiiiiiii

Dear authors,

Do I have to use `LearningRateDecayOptimizerConstructor` to reproduce the segmentation results of ConvNeXt?

Or can I use the default constructor with the same decay parameters, like this?

```python
optimizer = dict(
    _delete_=True,  # the leading/trailing underscores were lost in rendering
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    paramwise_cfg={
        'decay_rate': 0.9,
        'decay_type': 'stage_wise',
        'num_layers': 12,
    })
```
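For context, the point of the custom constructor is that it turns `decay_rate`/`num_layers` into a distinct learning-rate multiplier per layer group, which the default MMCV constructor does not do with those keys. Below is a minimal sketch of the kind of multipliers such a constructor assigns; the exact formula is an assumption modelled on common layer-wise decay implementations, not the repo's actual code:

```python
# Hypothetical sketch of stage-wise LR decay multipliers.
# Assumption: layer ids run from 0 (patch embedding) to num_layers + 1 (head),
# and each layer's LR is scaled by decay_rate ** (num_layers + 1 - layer_id),
# so deeper layers (closer to the head) get larger multipliers.

def stage_wise_lr_scales(num_layers: int, decay_rate: float) -> list[float]:
    """Return the per-layer LR multiplier for layer ids 0..num_layers + 1."""
    return [decay_rate ** (num_layers + 1 - layer_id)
            for layer_id in range(num_layers + 2)]

scales = stage_wise_lr_scales(num_layers=12, decay_rate=0.9)
# The head (last id) keeps the full base LR; earlier layers are scaled down.
print(scales[0], scales[-1])
```

Under this sketch, a plain `paramwise_cfg` dict passed to the default constructor would not produce these multipliers, which is presumably why the ConvNeXt configs ship the custom constructor.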

Many thanks,
Shiwei

@yxchng

yxchng commented Aug 4, 2022

Any updates?
