Added
A new class to wrap the optimization framework Optuna. CustomOptunaOptimize can be used to create custom wrapper classes for various Optuna optimizations that play
nicely with tpcp and can be nested within tpcp operations. (#27)
A new example for the CustomOptunaOptimize wrapper that explains how to create complex custom optimizers using Optuna and the new Scorer callbacks (see below) (#27)
Scorer now supports an optional callback function, which will be called after each datapoint is scored.
(#29)
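The general per-datapoint callback pattern can be sketched in plain Python. This is an illustrative stand-in, not tpcp's actual Scorer API: the function `score_all` and its callback keyword arguments `step`/`scores` are hypothetical names chosen for this sketch.

```python
from typing import Any, Callable, List, Optional

def score_all(
    scoring_fn: Callable[[Any], float],
    datapoints: List[Any],
    callback: Optional[Callable[..., None]] = None,
) -> List[float]:
    """Score each datapoint and invoke the callback after each one.

    Mimics the idea of a per-datapoint callback (e.g. for progress bars
    or early logging); the real Scorer likely differs in details.
    """
    scores: List[float] = []
    for i, dp in enumerate(datapoints):
        scores.append(scoring_fn(dp))
        if callback is not None:
            # Called after each datapoint is scored.
            callback(step=i, scores=scores)
    return scores

# Example: use the callback to track progress.
progress: List[int] = []
result = score_all(
    lambda x: float(x * x), [1, 2, 3],
    callback=lambda step, scores: progress.append(step),
)
```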
Pipelines, Optimize objects, and Scorer are now Generic. This improves typing (in particular with VSCode), but
means a little bit more typing (pun intended) when creating new Pipelines and Optimizers
(#29)
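The Generic pattern this refers to can be sketched with the standard typing module. The `Dataset`/`GaitDataset` classes and this minimal `Pipeline` base are illustrative stand-ins for the sketch, not tpcp's real class hierarchy.

```python
from typing import Generic, List, TypeVar

class Dataset:
    """Stand-in for a tpcp-style dataset base class."""

class GaitDataset(Dataset):
    """Hypothetical concrete dataset used for illustration."""
    def __init__(self, values: List[int]) -> None:
        self.values = values

DatasetT = TypeVar("DatasetT", bound=Dataset)

class Pipeline(Generic[DatasetT]):
    """Generic base: subclasses pin down the dataset type they accept."""
    def run(self, datapoint: DatasetT) -> "Pipeline[DatasetT]":
        raise NotImplementedError

class MyPipeline(Pipeline[GaitDataset]):
    result_: float

    def run(self, datapoint: GaitDataset) -> "MyPipeline":
        # A type checker now knows `datapoint` is a GaitDataset,
        # so attribute access like `.values` is verified statically.
        self.result_ = float(sum(datapoint.values))
        return self
```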
Added option for scoring function to return arbitrary additional information using the NoAgg wrapper
(#31)
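The idea behind such a wrapper can be sketched in plain Python. The `NoAgg` class, `score` function, and `aggregate` helper below are simplified stand-ins written for this sketch, not tpcp's actual implementation.

```python
from typing import Any, Dict, List, Tuple

class NoAgg:
    """Stand-in: marks a score value that should be passed through
    per datapoint instead of being aggregated (e.g. averaged)."""
    def __init__(self, value: Any) -> None:
        self.value = value

def score(pipeline: Any, datapoint: Any) -> Dict[str, Any]:
    """Hypothetical scoring function returning a mix of value types."""
    return {
        "accuracy": 0.9,                      # aggregated as usual
        "raw_predictions": NoAgg([0, 1, 1]),  # returned as-is, per datapoint
    }

def aggregate(results: List[Dict[str, Any]]) -> Tuple[Dict[str, float], Dict[str, list]]:
    """Sketch of the aggregation step: average plain values,
    collect NoAgg-wrapped values without aggregation."""
    agg: Dict[str, float] = {}
    single: Dict[str, list] = {}
    for key in results[0]:
        values = [r[key] for r in results]
        if isinstance(values[0], NoAgg):
            single[key] = [v.value for v in values]
        else:
            agg[key] = sum(values) / len(values)
    return agg, single
```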
(experimental) Torch compatibility for hash-based comparisons (e.g. in the safe_run wrapper). Before, the wrapper
would fail with torch module subclasses, as their pickle-based hashes were not consistent.
We implemented a custom hash function that should solve this.
For now, we will consider this feature experimental, as we are not sure if it breaks in certain use-cases.
(#33)
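The general idea of pickle-based hashing can be illustrated with a small stdlib sketch. This is only the underlying pattern, not tpcp's actual hash function; the point is that the hash is only as stable as the pickle byte stream, which is not guaranteed to be reproducible for objects like torch modules.

```python
import hashlib
import pickle

def pickle_hash(obj: object) -> str:
    """Hash an object via its pickle representation.

    Equal objects with deterministic pickle output get equal hashes;
    objects whose pickle bytes vary between otherwise-equal instances
    (as reported for torch module subclasses) break this scheme and
    need a custom hash function instead.
    """
    return hashlib.md5(pickle.dumps(obj)).hexdigest()
```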
tpcp.types now exposes a bunch of internal types that might be helpful to type custom Pipelines and Optimizers.
(#34)
Changed
The return type for the individual values in the Scorer class is now List[float] instead of np.ndarray.
This also affects the output of cross_validate, GridSearch.gs_results_ and GridSearchCV.cv_results_
(#29)
cf now has a "faked" return type, so that type checkers in user code do not complain anymore.
(#29)
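The "faked" return type trick can be sketched with typing.cast. The `CloneFactory` class below is a simplified stand-in for illustration, not tpcp's actual implementation; only the typing pattern is the point.

```python
from typing import Any, TypeVar, cast

T = TypeVar("T")

class CloneFactory:
    """Stand-in: holds a mutable default value that would be
    cloned for each new instance instead of being shared."""
    def __init__(self, default_value: Any) -> None:
        self.default_value = default_value

def cf(default_value: T) -> T:
    # At runtime this returns a CloneFactory, but the annotation
    # "pretends" the return type is T, so a parameter default like
    # `param: SomeModel = cf(SomeModel())` type-checks cleanly.
    return cast(T, CloneFactory(default_value))
```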
All TypeVar variables are now called SomethingT instead of Something_ (#34)