Model Hyperparameters #10
Hi, thanks for your attention. As described in our paper, we train the model with a 256 x 256 crop size, which differs from the default 128 x 128 in the configuration. The default is 128 x 128 in order to use the progressive training strategy, which starts at 128 x 128 and then enlarges the training size to 256 x 256; the details of this strategy come from Restormer. I have seen the results from your trained model; you could increase the training size to obtain superior results.
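The progressive schedule mentioned above could be sketched roughly as follows. This is a minimal illustration of the idea from Restormer (crop size grows at fixed iteration milestones); the milestone iterations and sizes here are hypothetical, not taken from this repo's actual config.

```python
# Hedged sketch of a Restormer-style progressive patch-size schedule.
# MILESTONES and PATCH_SIZES are illustrative values, not this repo's settings.
MILESTONES = [0, 92000, 156000]   # iteration thresholds (hypothetical)
PATCH_SIZES = [128, 192, 256]     # training crop grows as training progresses

def patch_size_at(iteration):
    """Return the training crop size active at the given iteration."""
    size = PATCH_SIZES[0]
    for milestone, candidate in zip(MILESTONES, PATCH_SIZES):
        if iteration >= milestone:
            size = candidate
    return size
```

Training without the schedule would simply fix the crop at 256 x 256 for all iterations, which is what the reply above suggests for reproducing the paper's numbers.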
Thank you so much for your quick response. So should I set the training crop size to 256 and keep the test crop size at 128? Or is it better to change both to 256?
In my experience, setting both to 256 (making the training crop size the same as the slice-inference size) should give better results.
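"Slice inference" here means running the model over fixed-size tiles of the full image. A minimal sketch of that loop, assuming a `model` callable that maps a patch to a restored patch of the same shape (the function name and signature are illustrative, not this repo's API):

```python
import numpy as np

def slice_infer(img, model, tile=256):
    """Run `model` over non-overlapping tiles of `img`.

    `tile` should match the training crop size, per the advice above.
    Edge tiles may be smaller than `tile`; the model is assumed to
    accept arbitrary patch sizes.
    """
    h, w = img.shape[:2]
    out = np.zeros_like(img, dtype=np.float32)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patch = img[y:y + tile, x:x + tile]
            out[y:y + patch.shape[0], x:x + patch.shape[1]] = model(patch)
    return out
```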
I see, I will try it out.
Looking forward to your results. We also have a diffusion-based paper on adverse-weather restoration accepted at ECCV'24; the paper and code will be available soon. If you are interested, keep an eye on it, and we look forward to meeting face to face at the venue.
Did you use a 256 x 256 crop size in the configuration to train the agan model, or the default 128 x 128? I have tried to replicate your results with the agan weights but get grids (block edges, shown below) in my final image, and I was wondering whether that may be because my crop size was too small.
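Grid artifacts at tile boundaries are a common symptom of non-overlapping slice inference. One standard remedy, independent of crop size, is to process overlapping tiles and average the overlapping regions so the seams are blended away. A hedged sketch of that idea (again assuming a generic `model` callable; not this repo's actual inference code):

```python
import numpy as np

def tiled_infer_overlap(img, model, tile=256, overlap=32):
    """Tile `img` with `overlap` pixels of overlap and average the results.

    Averaging the overlapping regions suppresses the visible block
    edges that non-overlapping tiling can produce.
    """
    h, w = img.shape[:2]
    stride = tile - overlap
    out = np.zeros((h, w), dtype=np.float32)
    weight = np.zeros((h, w), dtype=np.float32)
    # Tile origins, plus an extra tile flush with each border if needed.
    ys = list(range(0, max(h - tile, 0) + 1, stride))
    xs = list(range(0, max(w - tile, 0) + 1, stride))
    if ys[-1] != max(h - tile, 0):
        ys.append(max(h - tile, 0))
    if xs[-1] != max(w - tile, 0):
        xs.append(max(w - tile, 0))
    for y in ys:
        for x in xs:
            patch = img[y:y + tile, x:x + tile]
            out[y:y + tile, x:x + tile] += model(patch)
            weight[y:y + tile, x:x + tile] += 1.0
    return out / np.maximum(weight, 1.0)
```

That said, per the discussion above, the first thing to check is that the inference tile size matches the 256 x 256 training crop size.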