How can we reduce the need to tune hyperparameters and optimizers, and to "borrow" parameters during review?
The proposal is to create a hyperparameter table that gives you the HPs to use based on batch size and precision (and also solves the optimizer choice problem).
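To make the proposal concrete, the table could be a simple lookup keyed by (batch size, precision) that returns the prescribed HPs, including the optimizer. This is a minimal sketch only; the table structure, key choices, and all values below are illustrative assumptions, not actual MLPerf-prescribed settings.

```python
# Hypothetical hyperparameter table: (batch_size, precision) -> prescribed HPs.
# All entries below are made-up placeholders for illustration.
HP_TABLE = {
    (256, "fp32"): {"optimizer": "sgd", "lr": 0.1, "momentum": 0.9},
    (256, "fp16"): {"optimizer": "sgd", "lr": 0.1, "momentum": 0.9},
    (1024, "fp16"): {"optimizer": "lars", "lr": 0.4, "momentum": 0.9},
}

def lookup_hps(batch_size: int, precision: str) -> dict:
    """Return the prescribed hyperparameters for a (batch size, precision) pair.

    Raises ValueError if the combination is not covered by the table, so a
    submitter knows immediately that no prescribed setting exists.
    """
    try:
        return HP_TABLE[(batch_size, precision)]
    except KeyError:
        raise ValueError(
            f"No prescribed HPs for batch_size={batch_size}, precision={precision}"
        )
```

A table like this would remove per-submission tuning: a submitter looks up their configuration and gets both the optimizer and its parameters, with no tuning (or borrowing during review) required.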
SWG Notes:
This is a big topic being covered by Special Topics.
There is a lot of desire for such a thing. We intend to address this through logging in the submissions. Currently, we plan to use the same general approach to hyperparameters as we used for v0.5, pending future conversation on this topic.
This is a design criterion for logging.
We may want to do some sharing of parameters before the v0.6 submission. There will be further discussion for v0.6.
bitfort added the Backlog label (an issue to be discussed in a future Working Group, but not the immediate next one) and removed the Next Meeting label (an item to be discussed in the next Working Group) on Feb 13, 2020.