
Gradient clipping in the wrong place? #24

Open
KinWaiCheuk opened this issue Dec 24, 2020 · 1 comment

@KinWaiCheuk

In your train.py, line 101, the gradient clipping happens after the .backward() and .step() operations.

Shouldn't clip_grad_norm_ go between .backward() and .step(), so that the optimizer updates with the clipped gradients?
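For reference, a minimal sketch of the usual ordering, assuming a standard PyTorch training loop (the model, optimizer, and clip value below are placeholders, not taken from this repo's train.py):

```python
import torch
import torch.nn as nn

# Hypothetical model, optimizer, and clip value for illustration only.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
clip_gradient_norm = 3.0

def training_step(batch, target):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), target)
    loss.backward()  # populate .grad on all parameters
    # Clip *before* the optimizer consumes the gradients; clipping after
    # .step() cannot affect the update that was just applied.
    nn.utils.clip_grad_norm_(model.parameters(), clip_gradient_norm)
    optimizer.step()
    return loss.item()
```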

@jongwook
Owner

jongwook commented Jan 7, 2021

Yeah, it definitely should. I'm not sure why I did it like that. In practice, though, the gradient stayed well below the clip value for most of the training.
