
Improve your state of the art by using the best meta-optimizer and the best activation function #6

Open
LifeIsStrange opened this issue May 30, 2020 · 1 comment

Comments


LifeIsStrange commented May 30, 2020

You could increase SMIM's accuracy by using Ranger, which combines state-of-the-art optimizers with gradient centralization:
https://github.com/lessw2020/Ranger-Deep-Learning-Optimizer
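
For reference, a minimal sketch of what the swap might look like in PyTorch, assuming the `Ranger` class from that repo is importable and that SMIM's training loop uses a standard `torch.optim`-style optimizer (the model, data, and learning rate below are placeholders, not SMIM's actual setup):

```python
import torch
import torch.nn as nn
from ranger import Ranger  # assumes ranger.py from the linked repo is on the path

# Placeholder network standing in for the actual SMIM model.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

# Drop-in replacement for e.g. torch.optim.SGD / Adam:
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
optimizer = Ranger(model.parameters(), lr=1e-3)  # RAdam + LookAhead, with gradient centralization

criterion = nn.CrossEntropyLoss()
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```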

Orthogonally, you would probably also benefit from the Mish activation function in place of the one you currently use, but it should be tested after Ranger, since it could regress accuracy (even if that is unlikely):
https://github.com/digantamisra98/Mish
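
Mish itself is just `x * tanh(softplus(x))`, so even without the linked repo it can be dropped in as a small module. A sketch, assuming a PyTorch model whose activation layers are swappable (the surrounding block is a placeholder; newer PyTorch releases also ship a built-in `nn.Mish`):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    """Mish activation: x * tanh(softplus(x))."""
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))

# Swapping the activation in a placeholder block (SMIM's real layers may differ):
block = nn.Sequential(
    nn.Linear(128, 256),
    Mish(),            # previously e.g. nn.ReLU()
    nn.Linear(256, 10),
)

print(block(torch.randn(4, 128)).shape)  # torch.Size([4, 10])
```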

You can expect significant accuracy gains from these changes.
That would bring your model even closer to perfection :)

michalivne (Collaborator) commented

@LifeIsStrange thanks for your suggestions. We will look into them.

Cheers
