Commit
Track grad_norm in megatron/training.py
saforem2 committed Jul 20, 2024
1 parent 019dc3c commit 54bd608
Showing 1 changed file with 1 addition and 1 deletion.
megatron/training.py: 1 addition & 1 deletion
@@ -941,7 +941,7 @@ def train_step(
     # Update learning rate.
     if args.deepspeed:
         skipped_iter = 0
-        grad_norm = None
+        grad_norm = model[0].get_global_grad_norm()
         num_zeros_in_grad = None
         loss_reduced = {}
         for key in losses_reduced[0]:

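For readers skimming the diff: in the DeepSpeed code path of train_step, grad_norm was previously hard-coded to None; after this change it is read from the DeepSpeed engine (model[0]) via get_global_grad_norm(), so the norm computed during the optimizer step can be tracked and logged. Below is a minimal sketch of that pattern, not the Megatron-DeepSpeed code itself; engine is assumed to be the object returned by deepspeed.initialize(), and batch, labels, loss_fn, and log_metrics are hypothetical placeholders.

def train_step_sketch(engine, batch, labels, loss_fn, log_metrics):
    # `engine` is assumed to be a DeepSpeed engine (deepspeed.initialize()).
    outputs = engine(batch)          # forward pass through the wrapped model
    loss = loss_fn(outputs, labels)  # user-supplied criterion
    engine.backward(loss)            # DeepSpeed-managed backward pass
    engine.step()                    # optimizer step; grad norms are computed here

    # As in this commit: read the global gradient norm back from the engine
    # instead of leaving grad_norm = None, so it can be tracked/logged.
    # Depending on configuration it may be None if no norm was computed.
    grad_norm = engine.get_global_grad_norm()
    log_metrics({"loss": loss.item(), "grad_norm": grad_norm})
    return loss, grad_norm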