Freezing layers for DiNAT model #111

Open
Pari-singh opened this issue Jan 10, 2024 · 0 comments

Hi @praeclarumjj3,
I trained the DiNAT backbone model on my custom images and got decent results. Now I want to fine-tune those trained weights for some internal tasks, where I will receive around 500 new images on a regular basis. Combining the entire dataset and retraining from scratch each time is prohibitively expensive, so I am looking for a way to fine-tune the existing weights on each incoming batch of 500 images. However, I couldn't find a way to freeze layers for DiNAT: the config file (unlike the one for ResNet) has no FREEZE option under MODEL.BACKBONE. Can anyone give more information on how to approach this problem?

Thanks
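
For reference, a minimal sketch of one common workaround: since there is no FREEZE config option for this backbone, the layers can be frozen in code after the model is built by disabling gradients on the backbone's parameters. This assumes a detectron2-style meta-architecture that exposes the backbone as `model.backbone`; that attribute name, and the helper names below, are assumptions for illustration, not a confirmed API of this repo.

```python
import torch
import torch.nn as nn


def freeze_backbone(model: nn.Module) -> None:
    """Disable gradients for the backbone so fine-tuning only updates
    the task-specific heads."""
    # `model.backbone` is an assumption: detectron2-style
    # meta-architectures typically expose the backbone this way.
    for param in model.backbone.parameters():
        param.requires_grad = False
    # Eval mode keeps any normalization-layer running statistics
    # frozen as well, not just the weights.
    model.backbone.eval()


def make_optimizer(model: nn.Module, lr: float = 1e-4) -> torch.optim.Optimizer:
    """Build an optimizer over the trainable parameters only, so the
    frozen backbone weights are never updated."""
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.AdamW(trainable, lr=lr)
```

Unlike ResNet's config-driven freezing, this approach is applied after model construction (e.g. right after the checkpoint is loaded and before the training loop starts).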
