Updated clipping to torch.nn.utils.clip_grad_norm_ (pytorch#403)
Description: Per the torch 0.4 documentation, torch.nn.utils.clip_grad_norm has been deprecated in favor of torch.nn.utils.clip_grad_norm_; this commit corrects the deprecated usage in word_language_model/main.py.
kartik144 authored and colesbury committed Aug 20, 2018
1 parent 2fc0211 commit 6fd43cd
Showing 1 changed file with 1 addition and 1 deletion.
word_language_model/main.py
@@ -159,7 +159,7 @@ def train():
         loss.backward()
 
         # `clip_grad_norm` helps prevent the exploding gradient problem in RNNs / LSTMs.
-        torch.nn.utils.clip_grad_norm(model.parameters(), args.clip)
+        torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
         for p in model.parameters():
             p.data.add_(-lr, p.grad.data)

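For reference, here is a minimal runnable sketch of the corrected call inside a bare-bones training step. The model, data, and hyperparameter values below are hypothetical stand-ins chosen for illustration, not the actual setup from word_language_model/main.py:

import torch
import torch.nn as nn

# Hypothetical stand-in model and data, just to exercise the clipping call.
model = nn.LSTM(input_size=10, hidden_size=20)
criterion = nn.MSELoss()
lr, clip = 20.0, 0.25  # assumed learning rate and clip threshold

inputs = torch.randn(5, 3, 10)   # (seq_len, batch, input_size)
targets = torch.randn(5, 3, 20)  # matches the LSTM's output shape

output, _ = model(inputs)
loss = criterion(output, targets)

model.zero_grad()
loss.backward()

# In-place variant: rescales all gradients so their global norm is at most `clip`.
torch.nn.utils.clip_grad_norm_(model.parameters(), clip)

# Plain SGD step, mirroring the manual update in the diff above
# (written with the `alpha=` keyword, the non-deprecated form of add_).
for p in model.parameters():
    p.data.add_(p.grad.data, alpha=-lr)

The trailing underscore follows PyTorch's naming convention for in-place operations: clip_grad_norm_ modifies the gradients in place and returns the total norm of the gradients computed before clipping.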
