
Unstable convergence PSNR #46

Closed
stc1995 opened this issue Dec 1, 2020 · 1 comment

Comments

stc1995 commented Dec 1, 2020

Thanks for your reimplementation.
However, in my 5 trials of running the original training code on the Lego scene, the results are unstable. In detail, 3 trials ended with a PSNR of around 9-10, while the other two ended with a PSNR above 30.
Did you encounter similar issues? Thanks for your response!

kwea123 (Owner) commented Dec 1, 2020

It highly depends on the network initialization and the first training samples. Since a large portion of the image is white background, if the network overfits to this background at the beginning, the result ends up very bad. You can see a solution here; alternatively, simply increasing the batch size, or trying other optimizers such as RAdam or Ranger, might help.
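As a concrete illustration of the optimizer suggestion, here is a minimal PyTorch sketch of swapping Adam for RAdam; the tiny `model` and the learning rate are placeholders rather than this repo's actual training setup. `torch.optim.RAdam` ships with PyTorch 1.10+, while Ranger comes from a third-party package.

```python
import torch
from torch import nn

# Hypothetical stand-in for the NeRF MLP; the real model comes from the repo.
model = nn.Sequential(nn.Linear(63, 256), nn.ReLU(), nn.Linear(256, 4))

# Default choice: Adam, whose early adaptive updates can be aggressive
# enough to lock the network onto the white background.
# optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)

# Alternative: RAdam rectifies (warms up) the adaptive learning-rate term,
# making the first updates less aggressive and convergence more stable.
optimizer = torch.optim.RAdam(model.parameters(), lr=5e-4)

# Ranger (RAdam + Lookahead) is a third-party optimizer; one common package:
#   pip install pytorch-ranger
# from pytorch_ranger import Ranger
# optimizer = Ranger(model.parameters(), lr=5e-4)
```

Increasing the batch size works in the same direction: with more rays per step, each batch is more likely to include object pixels, so a single early gradient step is less able to push the network toward an all-background solution.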
