Commit
Removed debug dump of universal checkpoints
TevenLeScao authored Feb 21, 2023
1 parent 09a35f5 commit e52bdab
Showing 1 changed file with 0 additions and 3 deletions.
3 changes: 0 additions & 3 deletions megatron/training.py
@@ -460,9 +460,6 @@ def setup_model_and_optimizer(model_provider_func):
 
     else:
         args.iteration = 0
 
-    from .utils import dump_weights
-    dump_weights(f'{args.universal_checkpoint=}', args.iteration, model, optimizer)
-
     # tp_rank = mpu.get_tensor_model_parallel_rank()
     # pp_rank = mpu.get_pipeline_model_parallel_rank()
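For context, the removed lines wrote a debugging snapshot of the model weights right after checkpoint loading. The first argument, f'{args.universal_checkpoint=}', uses the f-string `=` specifier (Python 3.8+), which expands to a label such as args.universal_checkpoint=True, so each dump recorded whether the run was resumed from a universal checkpoint. The dump_weights implementation lives in megatron/utils.py and is not part of this diff; the sketch below is a hypothetical reconstruction of what such a per-rank weight dumper might look like, with the file naming, the path parameter, and the norm-based output format all assumed.

import torch.distributed as dist


def dump_weights(preamble, iteration, model, optimizer, path="debug_weights"):
    """Hypothetical sketch of a per-rank weight dumper; not the actual
    megatron.utils.dump_weights. Signature matches the removed call site:
    dump_weights(preamble, iteration, model, optimizer)."""
    rank = dist.get_rank() if dist.is_initialized() else 0
    fname = f"{path}-iter{iteration:07d}-rank{rank:03d}.txt"
    with open(fname, "w") as fh:
        fh.write(f"{preamble}\n")
        # Megatron passes `model` as a list of pipeline-parallel model chunks.
        for chunk in model:
            for name, param in chunk.named_parameters():
                # One scalar norm per tensor is cheap to diff across two runs.
                norm = param.data.float().norm().item()
                fh.write(f"{name} {tuple(param.shape)} norm={norm:.8e}\n")
        # `optimizer` is accepted to match the call site; a fuller debug dump
        # could also record optimizer state norms here.

Diffing two such dumps, one from a run resumed from a regular checkpoint and one from its converted universal checkpoint, would show whether the reconstructed weights agree, which is presumably what this temporary instrumentation was for before the commit removed it.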
