The AdEMAMix optimizer is a simple modification of the Adam optimizer that replaces the single first-moment EMA with a mixture of two EMAs (one fast, one slow), allowing it to better take advantage of very old gradients.
The paper includes optax skeleton code, which I could contribute if the maintainers deem this a good fit for the repo.