change nproc to 8
felipemello1 authored Oct 29, 2024
1 parent f18d1a9 commit 612dd63
Showing 1 changed file with 2 additions and 2 deletions.
recipes/configs/llama3_2_vision/90B_full.yaml: 2 additions & 2 deletions
@@ -5,11 +5,11 @@
 # tune download meta-llama/Llama-3.2-90B-Vision-Instruct --output-dir /tmp/Llama-3.2-90B-Vision-Instruct --ignore-patterns "original/consolidated*"
 #
 # To launch on a single device, run the following command from root:
-# tune run --nproc_per_node 4 full_finetune_distributed --config llama3_2_vision/90B_full
+# tune run --nproc_per_node 8 full_finetune_distributed --config llama3_2_vision/90B_full
 #
 # You can add specific overrides through the command line. For example
 # to override the checkpointer directory while launching training:
-# tune run --nproc_per_node 4 full_finetune_distributed --config llama3_2_vision/90B_full checkpointer.checkpoint_dir=<YOUR_CHECKPOINT_DIR>
+# tune run --nproc_per_node 8 full_finetune_distributed --config llama3_2_vision/90B_full checkpointer.checkpoint_dir=<YOUR_CHECKPOINT_DIR>
 #
 # This config works best when the model is being fine-tuned on 2+ GPUs.
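The updated header implies a two-step workflow: download the checkpoint, then launch the distributed recipe with one process per GPU. A minimal shell sketch, assuming torchtune is installed as the `tune` CLI and using the default `/tmp` checkpoint path from the config header (the download step is shown commented out, since the 90B checkpoint is very large):

```shell
# Step 1 (commented out; downloads a very large checkpoint):
#   tune download meta-llama/Llama-3.2-90B-Vision-Instruct \
#     --output-dir /tmp/Llama-3.2-90B-Vision-Instruct \
#     --ignore-patterns "original/consolidated*"

# Step 2: build the launch command. nproc_per_node should match the number
# of GPUs on the node -- 8 after this commit, up from 4.
CKPT_DIR=/tmp/Llama-3.2-90B-Vision-Instruct   # default path from the config header
CMD="tune run --nproc_per_node 8 full_finetune_distributed \
--config llama3_2_vision/90B_full \
checkpointer.checkpoint_dir=${CKPT_DIR}"
echo "$CMD"
```

Overrides such as `checkpointer.checkpoint_dir=...` are appended after the config name, as in the example above.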


