
Roll-forward with fixes: Fix interaction between scheduler.step() and gradient accumulation steps, refactor schedulers to use LambdaLR, and add cosine annealing LR scheduler as a decay method. #3555

Merged
4 commits merged on Aug 29, 2023
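The changes named in the title combine three related pieces: stepping the LR scheduler once per effective optimizer update rather than once per micro-batch when gradient accumulation is enabled, expressing schedulers as `torch.optim.lr_scheduler.LambdaLR` instances, and offering cosine annealing as a decay method. A minimal sketch of how those pieces fit together is shown below; the toy model, hyperparameter values, and training loop are illustrative assumptions, not the code from this PR.

```python
import math

import torch
import torch.nn.functional as F
from torch.optim.lr_scheduler import LambdaLR

# Toy model and optimizer purely for illustration; not the trainer code from this PR.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

gradient_accumulation_steps = 4   # assumed config value
total_optimizer_steps = 250       # assumed number of effective (post-accumulation) updates


def cosine_annealing_lambda(step: int) -> float:
    # Multiplier applied to the base LR: decays from 1.0 to 0.0 along a cosine curve.
    progress = min(step / total_optimizer_steps, 1.0)
    return 0.5 * (1.0 + math.cos(math.pi * progress))


scheduler = LambdaLR(optimizer, lr_lambda=cosine_annealing_lambda)

for batch_idx in range(1000):  # 1000 micro-batches -> 250 effective updates
    inputs, targets = torch.randn(8, 10), torch.randn(8, 1)
    loss = F.mse_loss(model(inputs), targets)
    (loss / gradient_accumulation_steps).backward()

    # Step the optimizer -- and therefore the scheduler -- only once per
    # accumulation cycle, so the LR schedule advances per effective update
    # rather than per micro-batch.
    if (batch_idx + 1) % gradient_accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
        scheduler.step()
```

Calling `scheduler.step()` inside the accumulation guard keeps the cosine decay aligned with the number of effective updates, which is the interaction between `scheduler.step()` and gradient accumulation that the first fix addresses.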

Commits on Aug 28, 2023

  1. Revert "DRAFT: Revert "Add Cosine Annealing LR scheduler as a decay method (#3507)" (#3545)"

     This reverts commit feec8a6.

     justinxzhao committed Aug 28, 2023 · 03657a0

Commits on Aug 29, 2023

  1. 0141421
  2. 213b342
  3. Fix tests.

     justinxzhao committed Aug 29, 2023 · e9777cd