
Why not provide the 22k-supervised fine-tuned model? I am really shocked that every available ConvNeXt-V2 pre-trained weight has been fine-tuned on ImageNet-1k. Please release the 22k-supervised ConvNeXt-V2 weights just like ConvNeXt-V1! 🙏 #72

Open
yan-hao-tian opened this issue Mar 22, 2024 · 2 comments

Comments

@yan-hao-tian

yan-hao-tian commented Mar 22, 2024

Hi, I am looking for the 22k-supervised fine-tuned ConvNeXt-V2-H model without the subsequent 1k-supervised fine-tuning. I want to fine-tune it on ADE20K to reproduce the result in Table 7 of the paper.

@blackpearl1022

@yan-hao-tian
You can find pretrained ConvNeXt-V2 models on Hugging Face that were already fine-tuned on ImageNet-22k.
For example, convnextv2_base.fcmae_ft_in22k_in1k_384 was fine-tuned on ImageNet-22k, and other variants are available on Hugging Face as well.

@yan-hao-tian
Author

Thanks for the reply.

What I need are the ConvNeXt-V2-Huge weights whose name ends with '22k', meaning the model's final pre-training step was fine-tuning on ImageNet-22k, without subsequent fine-tuning on ImageNet-1k. The last row of Table 7 in the ConvNeXt-V2 paper achieves 57.0 mIoU on the ADE20K dataset with that model.

I cannot find this model anywhere.
