docs: Change README.md
chiayi-hsu committed Nov 14, 2024
1 parent b27c9e2 commit 71e9467
Showing 2 changed files with 10 additions and 8 deletions.
16 changes: 9 additions & 7 deletions examples/safelora/README.md
@@ -12,13 +12,15 @@ Then, fill in the paths for the base, aligned, and PEFT models according to your

 from peft.utils.safelora import SafeLoraConfig, apply_safelora
 
-config = SafeLoraConfig(base_model_path='../LLM_Models/llama-2-7b-hf/',\
-                        aligned_model_path='../LLM_Models/llama-2-7b-chat-fp16/',
-                        peft_model_path = '../finetuneLLM/finetuned_models/samsumBad-7b-fp16-peft-seed-42',
-                        devices='cuda',
-                        select_layers_type='threshold',
-                        save_weights=True)
-
+peft_path = "../finetuneLLM/finetuned_models/samsumBad-7b-fp16-peft-seed-42"
+config = SafeLoraConfig(
+    base_model_path="meta-llama/Llama-2-7b-hf",
+    aligned_model_path="TheBloke/Llama-2-7B-Chat-fp16",
+    peft_model_path=peft_path,
+    device="cuda",
+    select_layers_type="threshold",
+    save_weights=True,
+)
 final_lora_weight = apply_safelora(config)
 
 ```
2 changes: 1 addition & 1 deletion examples/safelora/safelora_inference.py
@@ -12,7 +12,7 @@
     base_model_path="meta-llama/Llama-2-7b-hf",
     aligned_model_path="TheBloke/Llama-2-7B-Chat-fp16",
     peft_model_path=peft_path,
-    devices="cuda",
+    device="cuda",
     select_layers_type="threshold",
     save_weights=True,
 )
