Replies: 5 comments 3 replies
-
Having an issue with this as well. I see others are commenting on it in issue 880.
-
Same error with skopt; I am using the latest versions of the libraries.

```python
# Imports
import yfinance as yf
import ta
import pandas as pd
from backtesting import Backtest, Strategy
from backtesting.lib import crossover

# Optimize with skopt (assumes `bt` is a Backtest instance defined earlier)
optim = bt.optimize(n1=range(1, 100, 1),
                    n2=range(1, 100, 1),
                    constraint=lambda x: x.n2 - x.n1 > 20,
                    maximize='Equity Final [$]',
                    method='skopt',
                    max_tries=200,
                    random_state=0)
bt.plot()
optim
```

I run my code on Google Colab. The error always appears at the same iteration. Has anyone found a solution, or an idea of what is wrong?
-
I think I found the solution! Or at least I managed to run the optimization without error. The problem comes from several files. Correction proposal:
```python
# Context: scikit-learn's parameter-validation loop, which raises the
# InvalidParameterError when no constraint accepts the given value.
for constraint in constraints:
    if constraint.is_satisfied_by(param_val):
        # this constraint is satisfied, no need to check further.
        break
else:
    # No constraint is satisfied, raise with an informative message.
    # Ignore constraints that we don't want to expose in the error message,
    # i.e. options that are for internal purpose or not officially supported.
    constraints = [
        constraint for constraint in constraints if not constraint.hidden
    ]
    if len(constraints) == 1:
        constraints_str = f"{constraints[0]}"
    else:
        constraints_str = (
            f"{', '.join([str(c) for c in constraints[:-1]])} or"
            f" {constraints[-1]}"
        )
    raise InvalidParameterError(
        f"The {param_name!r} parameter of {caller_name} must be"
        f" {constraints_str}. Got {param_val!r} instead."
    )
```
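As a standalone illustration, the for/else pattern used in that validation loop can be sketched like this (the constraint functions here are made up for the example; `else` on a `for` loop runs only when the loop finished without `break`):

```python
# Minimal sketch of the for/else validation pattern: the else branch
# fires only if no constraint accepted the value.
def validate(value, constraints):
    for constraint in constraints:
        if constraint(value):
            break  # satisfied; stop checking
    else:
        allowed = " or ".join(c.__name__ for c in constraints)
        raise ValueError(f"value must satisfy {allowed}; got {value!r}")

def is_str(v):
    return isinstance(v, str)

def is_int(v):
    return isinstance(v, int)

validate(3, [is_str, is_int])  # accepted by is_int, no error
```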
```python
# Corrected call: criterion="squared_error" is passed explicitly, since the
# old default name 'mse' is no longer accepted by scikit-learn.
res = forest_minimize(
    func=objective_function,
    dimensions=dimensions,
    n_calls=max_tries,
    base_estimator=ExtraTreesRegressor(n_estimators=20, min_samples_leaf=2,
                                       criterion="squared_error"),
    acq_func='LCB',
    kappa=3,
    n_initial_points=min(max_tries, 20 + 3 * len(kwargs)),
    initial_point_generator='lhs',  # 'sobel' requires n_initial_points ~ 2**N
    callback=DeltaXStopper(9e-7),
    random_state=random_state)
```
```python
# In the ExtraTreesRegressor.predict override, update the criterion check
# from the removed name 'mse' to 'squared_error':
mean = super(ExtraTreesRegressor, self).predict(X)
if return_std:
    if self.criterion != "squared_error":
        raise ValueError(
            "Expected impurity to be 'squared_error', got %s instead"
            % self.criterion)
    std = _return_std(X, self.estimators_, mean, self.min_variance)
    return mean, std
return mean
```

I don't know whether this is a correct way to solve the problem, or whether the optimization still works normally. I would need your help to confirm that what I describe is valid.
-
After a couple of minutes: the solution is to uninstall scikit-learn and install an older version.
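A sketch of that downgrade, assuming the criterion rename landed in scikit-learn 1.2 (the exact version bound is an assumption; adjust as needed):

```shell
# Assumption: releases before scikit-learn 1.2 still accept criterion='mse',
# which skopt's forest wrappers rely on.
pip uninstall -y scikit-learn
pip install "scikit-learn<1.2"
```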
-
I also had to downgrade numpy from 1.25 to 1.22.0.
-
The actual error is:

```
sklearn.utils._param_validation.InvalidParameterError: The 'criterion' parameter of ExtraTreesRegressor must be a str among {'absolute_error', 'squared_error', 'friedman_mse', 'poisson'}. Got 'mse' instead
```

Tried on several strategies; I get the same error at 14% completion. A very simple strategy (below) throws the error. Any insight on how to fix this would be very welcome.
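For context, scikit-learn renamed these regression criteria ('mse' became 'squared_error', 'mae' became 'absolute_error'), which is why the old name is rejected. A purely illustrative helper can translate legacy names before they reach an estimator:

```python
# Illustrative mapping of legacy scikit-learn criterion names to their
# current names; anything not in the table is returned unchanged.
LEGACY_CRITERIA = {"mse": "squared_error", "mae": "absolute_error"}

def modern_criterion(name: str) -> str:
    """Map a legacy criterion name to its current scikit-learn name."""
    return LEGACY_CRITERIA.get(name, name)
```

The real fix is still to align the skopt and scikit-learn versions, as discussed in the comments above.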