
Setting up γ for ImageNet 256 #5

Open
Peterande opened this issue Jan 13, 2025 · 2 comments

Comments

@Peterande

[image]

Hello, great work! Can you give me some advice on setting γ? I can't extract clear rules from the table; for example, the difference between the γ values for FFHQ at different resolutions is very large.

@lanqingxi

  1. I have noticed this problem too. In my application, using the γ value and schedule configured for FFHQ-256, the discriminator loss becomes abnormally large early in training; even lowering that configuration to 15 and 1.5 still produces large discriminator losses.
  2. My discriminator is based on a pre-trained CAFormer — could this be caused by the pre-trained weights?

@Blade6570

Blade6570 commented Jan 14, 2025

Same problem. I also tried a bigger version of PatchGAN, and the loss still increases abnormally.

Update: I no longer see loss explosions with different γ values; most of them work fine. I removed all the normalisation layers (I had one spectral-norm layer in by mistake) and borrowed the ResBlock and weight-initialisation tricks from R3GAN.
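For anyone landing here: the γ being discussed is presumably the coefficient on the zero-centered gradient penalty on real data (R1, as used in StyleGAN2 and R3GAN-style training), which is the usual per-dataset/per-resolution knob in these tables. A minimal sketch of how γ enters the discriminator loss — the tiny linear discriminator and the γ = 15 value are illustrative assumptions, not the models from this thread:

```python
import torch

def r1_penalty(disc, real_images):
    """Mean squared gradient norm of the discriminator output w.r.t. real
    inputs (zero-centered R1 penalty). Scale by gamma / 2 in the loss."""
    real_images = real_images.detach().requires_grad_(True)
    scores = disc(real_images).sum()
    # create_graph=True so the penalty itself is differentiable for backprop
    (grad,) = torch.autograd.grad(scores, real_images, create_graph=True)
    return grad.pow(2).reshape(grad.shape[0], -1).sum(dim=1).mean()

# Toy stand-in discriminator; a real one (PatchGAN, CAFormer, ...) is much larger.
disc = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 8 * 8, 1))
real = torch.randn(4, 3, 8, 8)

gamma = 15.0  # one of the values tried above; tune per dataset and resolution
d_loss = disc(real).mean() + (gamma / 2) * r1_penalty(disc, real)
```

A too-large γ over-smooths the discriminator, while a too-small one lets its gradients (and loss) blow up, which matches the symptoms reported here; R3GAN additionally applies the same penalty to fake data (R2).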
