[Test] Save local model path in PEFT adapter config
ebsmothers committed Nov 19, 2024
1 parent bce7091 commit c69a4f8
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions recipes/lora_finetune_single_device.py
@@ -619,6 +619,7 @@ def save_checkpoint(self, epoch: int) -> None:
                 self._apply_lora_to_output,
             ),
             "peft_type": "LORA",
+            "base_model_name_or_path": str(self._checkpointer._checkpoint_dir),
         }
         ckpt_dict.update({training.ADAPTER_CONFIG: adapter_config})
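The added key records where the base model weights live, so tools that read the saved adapter config (e.g. Hugging Face PEFT, which uses a `base_model_name_or_path` field in `adapter_config.json`) can locate the base model when loading the adapter. A minimal sketch of the resulting config, using a hypothetical checkpoint directory in place of the recipe's `self._checkpointer._checkpoint_dir`:

```python
import json
from pathlib import Path

# Hypothetical stand-in for the recipe's checkpointer directory; in the
# actual recipe this value comes from self._checkpointer._checkpoint_dir.
checkpoint_dir = Path("/tmp/llama3/checkpoints")

# Mirrors the adapter_config built in save_checkpoint(). str() is applied
# because pathlib.Path objects are not JSON-serializable, and the adapter
# config is ultimately written out as JSON.
adapter_config = {
    "peft_type": "LORA",
    "base_model_name_or_path": str(checkpoint_dir),
}

print(json.dumps(adapter_config, indent=2))
```

Without the `str()` call, serializing the config to JSON would raise a `TypeError`, since the checkpoint directory is stored as a `Path` rather than a plain string.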
