
Some doubts about gradients when training a GAN #2

Open
UmiUmiU2333 opened this issue Oct 24, 2020 · 1 comment

Comments

@UmiUmiU2333

Hi, I have some doubts about step 5.2 (update the D network) at line 924 of runGAN.py.

If we run loss.backward() at lines 942 and 953, and then run optimizer.step() at lines 1156 and 1157, the gradients computed at lines 942 and 953 may be used to update both G and D, whereas in general G should not be updated at this step.
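The concern can be sketched in a minimal example (with hypothetical stand-in linear layers, not the actual networks in runGAN.py): if the generator output is detached before being fed to D, the D loss cannot populate G's gradients, so a later optimizer step on G consumes nothing stale.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the G and D networks in runGAN.py
G = nn.Linear(4, 4)
D = nn.Linear(4, 1)

noise = torch.randn(8, 4)
real = torch.randn(8, 4)

# --- D update step ---
fake = G(noise)

# Without .detach(), d_loss.backward() would also populate G's
# gradients, and a later optimizer_G.step() would consume them.
d_loss = D(fake.detach()).mean() - D(real).mean()

D.zero_grad()
d_loss.backward()

# G never appeared in the graph of d_loss, so its gradient is untouched.
assert G.weight.grad is None
```

Dropping the `.detach()` call makes `G.weight.grad` non-None after the same `backward()`, which is exactly the leakage described above.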

@ginobilinie
Owner

@UmiUmiU2333 Theoretically, you're right about this point. However, it does not matter much in practice.
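One likely reason it matters little in practice (an assumption about the training loop, not verified against runGAN.py): if `optimizer_G.zero_grad()` is called before G's own backward pass, any gradient that leaked into G during the D update is discarded before it can influence G's step.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the G network
G = nn.Linear(4, 4)
opt_G = torch.optim.SGD(G.parameters(), lr=0.1)

# Suppose a stale gradient leaked into G from the D update:
G.weight.grad = torch.ones_like(G.weight)

# zero_grad() before G's own backward pass discards it
# (newer PyTorch sets .grad to None; older versions zero it).
opt_G.zero_grad()
assert G.weight.grad is None or bool(torch.all(G.weight.grad == 0))
```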
