Cannot load the fine-tuned ckpt #51
Comments
What is the model you are using? It seems the initialized model does not contain fc_norm and head. The correct model to use for fine-tuned checkpoints is vit_large_patch16 from models_vit_mage.py, not vit_large_patch16 from models_mage.py.
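A quick way to tell which model definition a checkpoint belongs to is to inspect its parameter names: per the comment above, the fine-tuned (classification) checkpoint carries fc_norm and head weights that the pretraining model lacks. A minimal sketch, assuming MAE/MAGE-style key names (the exact names are an assumption, not verified against the repo):

```python
# Sketch: decide which model builder a checkpoint matches by its keys.
# Key prefixes "fc_norm" and "head" are assumptions based on the error
# discussed in this thread, not verified against the MAGE source.

def looks_finetuned(state_dict_keys):
    """A fine-tuned classification checkpoint carries fc_norm/head weights."""
    return any(k.startswith(("fc_norm", "head")) for k in state_dict_keys)

# Hypothetical usage with a real checkpoint (requires torch and the repo):
# import torch
# ckpt = torch.load("mage-vitl-ft.pth", map_location="cpu")
# keys = ckpt.get("model", ckpt).keys()
# if looks_finetuned(keys):
#     model = models_vit_mage.vit_large_patch16()   # classification model
# else:
#     model = models_mage.mage_vit_large_patch16()  # generation model
```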
Hello, I have encountered the same issue. In main_finetune.py, the model is imported from models_vit_mage.py under the names "vit_base_patch16" and "vit_large_patch16", while in gen_img_uncond.py it is imported from models_mage.py under the names "mage_vit_base_patch16" and "mage_vit_large_patch16". Using a model trained through main_finetune.py in gen_img_uncond.py triggers this error. I am not sure whether it is enough to roughly change "model = models_mage.__dict__[args.model]......" to "model = models_vit_mage.__dict__[args.model]......", because their parameters differ. How can I modify the code to use the model I trained myself for generation?
@rememberBr A fine-tuned model is fine-tuned for classification and cannot be used for generation.
OK, thanks
I am loading mage-vitl-ft.pth but it didn't work. Do we need a conversion script?
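No conversion script should be needed as long as the checkpoint is loaded into the matching model definition (see the maintainer's comment above). When loading still fails, diffing the checkpoint's key set against the model's usually pinpoints which modules are mismatched; this is the same information torch's load_state_dict reports via missing_keys/unexpected_keys. A minimal sketch, torch-free and with an illustrative function name not taken from the repo:

```python
# Sketch: diff a model's expected parameter names against a checkpoint's.
# Pure Python so it runs without torch; with torch, load_state_dict(...,
# strict=False) returns the same missing/unexpected information.

def report_key_mismatch(model_keys, ckpt_keys):
    """Return (missing, unexpected) parameter-name lists, both sorted."""
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    missing = sorted(model_keys - ckpt_keys)      # expected by model, absent from ckpt
    unexpected = sorted(ckpt_keys - model_keys)   # present in ckpt, unknown to model
    return missing, unexpected
```

If fc_norm and head show up under "missing", the model was built from models_mage.py but the checkpoint is a fine-tuned one; if decoder modules show up under "unexpected", the reverse mismatch occurred.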