
How to load lora model to sentencetransformer model? #2465

Closed
GeekerSsy opened this issue Feb 2, 2024 · 3 comments · Fixed by #2980

@GeekerSsy

Dear UKPlab team,

My team and I are working on a RAG project, and we are currently fine-tuning a retrieval model with the peft library. The issue is that once the model is fine-tuned, we cannot load the local config and checkpoints using SentenceTransformer.
Here is the directory layout of our local PEFT model:

  • adapter_config.json
  • adapter_model.safetensors
  • ....

When I look into the sentence-transformers package, the issue comes from the `Transformer` class in `Transformer.py`, which does not consider the situation where the model path is a PEFT model path:
`config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)`
So we have to comment out this line, drop the `config` attribute entirely, and in the `_load_model` method keep only this call:
`self.auto_model = AutoModel.from_pretrained(model_name_or_path, cache_dir=cache_dir)`
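
In the meantime, here is a minimal sketch of a workaround that avoids patching the library, assuming `peft` and `transformers` are installed (both paths below are placeholders): merge the LoRA weights into the base model with `peft`, then save an ordinary checkpoint that `AutoConfig`/`AutoModel` can read:

```python
# Sketch: merge the LoRA adapter into its base model so the result is a plain
# transformers checkpoint that SentenceTransformer can load directly.
# Both paths below are placeholders.
from peft import PeftConfig, PeftModel
from transformers import AutoModel, AutoTokenizer

peft_model_path = "path/to/peft_checkpoint"   # contains adapter_config.json, adapter_model.safetensors
merged_path = "path/to/merged_checkpoint"

# The adapter config records which base model the LoRA weights were trained on.
peft_config = PeftConfig.from_pretrained(peft_model_path)
base_model = AutoModel.from_pretrained(peft_config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)

# Attach the LoRA weights and fold them into the base weights.
model = PeftModel.from_pretrained(base_model, peft_model_path)
model = model.merge_and_unload()

# Save as an ordinary checkpoint; the AutoConfig call above works on this path.
model.save_pretrained(merged_path)
tokenizer.save_pretrained(merged_path)
```

The merge step folds the low-rank updates into the base weights, so nothing PEFT-specific remains in the saved checkpoint.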

This is a sincere request: could you please fix this issue, or tell us the correct way to load a PEFT model with the SentenceTransformer class?
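
Until then, the merged checkpoint from the sketch above can be wrapped into a SentenceTransformer in the usual modular way (the mean pooling here is an assumption; use whatever pooling your fine-tuning used):

```python
# Sketch: build a SentenceTransformer from the merged checkpoint.
# Mean pooling is an assumption; match the pooling used during training.
from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer("path/to/merged_checkpoint")
pooling = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode="mean",
)
st_model = SentenceTransformer(modules=[word_embedding_model, pooling])

embeddings = st_model.encode(["example query", "example passage"])
print(embeddings.shape)
```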

@DoctorSlimm

bump

@g-karthik

+1, I have the same use case. I'm fine-tuning my embedding model with the HF Trainer and PEFT, but when I try to save the checkpoints for sentence-transformers usage, I hit this exact same issue.

@tomaarsen can you please help take a look?
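
For reference, the failure reproduces with just this (the path is a placeholder for a folder containing only the adapter files):

```python
from sentence_transformers import SentenceTransformer

# Fails for us: the folder holds adapter_config.json / adapter_model.safetensors
# rather than a full config.json, so the AutoConfig.from_pretrained call inside
# Transformer.py raises when resolving the model configuration.
model = SentenceTransformer("path/to/peft_checkpoint")
```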

@here0009

+1 for the same issue
