Replies: 2 comments
-
Set the learning-rate config so that warmup and the step decay are counted in iterations; with by_epoch=False the schedule is applied per iteration even under an EpochBasedRunner:

    lr_config = dict(
        policy="step",
        warmup="linear",
        warmup_iters=500,
        warmup_ratio=0.001,
        step=[70_000],
        by_epoch=False,
    )
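For intuition, here is a small standalone sketch (plain Python, not the MMCV implementation) of how a step policy with linear warmup resolves the learning rate at a given iteration when by_epoch=False; base_lr and gamma are assumed values for illustration.

```python
# Standalone sketch of an iteration-counted step schedule with linear
# warmup. Not MMCV code; base_lr and gamma are assumed values.
def lr_at_iter(cur_iter, base_lr=0.02, warmup_iters=500,
               warmup_ratio=0.001, steps=(70_000,), gamma=0.1):
    if cur_iter < warmup_iters:
        # Linear warmup: ramp from base_lr * warmup_ratio up to base_lr.
        k = (1 - cur_iter / warmup_iters) * (1 - warmup_ratio)
        return base_lr * (1 - k)
    # Step decay: multiply by gamma once per milestone already passed.
    passed = sum(cur_iter >= s for s in steps)
    return base_lr * gamma ** passed

print(lr_at_iter(0))        # ~2e-05 (base_lr * warmup_ratio)
print(lr_at_iter(1_000))    # 0.02   (warmup finished, before the step)
print(lr_at_iter(80_000))   # 0.002  (after the 70_000-iteration step)
```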
-
In the config you can also specify the runner type, like:

    runner = dict(type="IterBasedRunner", max_iters=200_000)
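Putting the two replies together, switching from EpochBasedRunner to IterBasedRunner is usually a config-only change when training through the standard entry point, so no source edits should be needed. Below is a minimal sketch of the relevant config section, assuming you also want checkpoints saved by iteration; the checkpoint interval is a placeholder, not a value from this discussion.

```python
# Hypothetical config excerpt for iteration-based training.
runner = dict(type="IterBasedRunner", max_iters=200_000)

# Reuse the lr_config from the first reply (by_epoch=False) so the LR
# schedule is also counted in iterations.
lr_config = dict(
    policy="step",
    warmup="linear",
    warmup_iters=500,
    warmup_ratio=0.001,
    step=[70_000],
    by_epoch=False,
)

# Assumed addition: save checkpoints by iteration as well; 10_000 is a
# placeholder interval.
checkpoint_config = dict(by_epoch=False, interval=10_000)
```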
-
Hi, how can I update the learning rate by iteration rather than by epoch when using an EpochBasedRunner?
My other question is: which part of the code should I modify to substitute EpochBasedRunner with IterBasedRunner?
Looking forward to your reply :)