Dear UKPlab team,

My team and I are working on a RAG project, and we are currently fine-tuning a retrieval model with the peft library. The issue is that once the model has been fine-tuned, we cannot load the local config and checkpoints with SentenceTransformer.
Here is the directory hierarchy of our local PEFT model:
adapter_config.json
adapter_model.safetensors
....
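For context, a minimal sketch of how an adapter directory like this is produced with peft (the base model name and LoRA hyperparameters below are illustrative, not our actual configuration):

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModel

# Illustrative base model and LoRA settings.
base_model = AutoModel.from_pretrained("BAAI/bge-base-en-v1.5")
lora_config = LoraConfig(r=8, lora_alpha=16, target_modules=["query", "value"])
peft_model = get_peft_model(base_model, lora_config)

# ... fine-tuning loop ...

# save_pretrained writes only the adapter files listed above
# (adapter_config.json, adapter_model.safetensors), not a full
# transformers checkpoint with config.json and model weights.
peft_model.save_pretrained("output/peft-retriever")
```

Because the directory holds only adapter weights, anything that expects a full transformers checkpoint (such as AutoConfig) fails on it.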
When I looked into the sentence-transformers package, the issue comes from models/Transformer.py, which does not consider the situation where the model path is a PEFT model path:

config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)
So we had to comment out this line, remove the config attribute entirely, and keep only the following in the _load_model method:

self.auto_model = AutoModel.from_pretrained(model_name_or_path, cache_dir=cache_dir)
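Concretely, the local patch described above looks roughly like this (a sketch only; the exact signature of _load_model differs between sentence-transformers releases):

```python
from transformers import AutoModel

# Patched sentence_transformers/models/Transformer.py (sketch).
def _load_model(self, model_name_or_path, config, cache_dir, **model_args):
    # Skip the AutoConfig-based architecture dispatch and let transformers
    # resolve the path directly; with peft installed, recent transformers
    # versions can load an adapter directory through AutoModel.
    self.auto_model = AutoModel.from_pretrained(model_name_or_path, cache_dir=cache_dir)
```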
We sincerely request: could you please fix this issue, or tell us the correct way to load a PEFT model with the SentenceTransformer class?
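For reference, one way to avoid patching the library is to merge the adapter back into its base model with peft and load the merged checkpoint instead; a sketch, with hypothetical paths:

```python
from peft import PeftConfig, PeftModel
from transformers import AutoModel, AutoTokenizer
from sentence_transformers import SentenceTransformer

adapter_dir = "output/peft-retriever"   # hypothetical adapter directory
merged_dir = "output/merged-retriever"  # hypothetical output directory

# Load the base model recorded in adapter_config.json, apply the adapter,
# and fold the LoRA weights back into the base weights.
peft_config = PeftConfig.from_pretrained(adapter_dir)
base_model = AutoModel.from_pretrained(peft_config.base_model_name_or_path)
merged = PeftModel.from_pretrained(base_model, adapter_dir).merge_and_unload()

# Save a plain transformers checkpoint that SentenceTransformer can load
# without any PEFT awareness; save the tokenizer alongside it.
merged.save_pretrained(merged_dir)
AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path).save_pretrained(merged_dir)

model = SentenceTransformer(merged_dir)
```

Note that when SentenceTransformer loads a bare transformers checkpoint like this, it applies mean pooling by default, so check that this matches the pooling used during fine-tuning.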
+1, I have the same use case. I'm fine-tuning my embedding model with the HF Trainer and PEFT, but when trying to save the checkpoints for sentence-transformers usage, I run into this exact same issue.