Hi all,

I was using Bayesian optimization to generate suggestions for the next batch of points, given labels for a few data points, with the following snippet:
def DoE(x):
    # Make a prediction at the new point sampled by GPyOpt using the SVR model trained earlier
    ynew = svr.predict(x)  # scikit-learn SVR model trained on the existing labeled data
    return -ynew  # negated because we want to maximize the target variable
opt = GPyOpt.methods.BayesianOptimization(f = DoE,                        # function to optimize
                                          domain = domain,                # box constraints of the problem
                                          acquisition_type = 'LCB',       # LCB acquisition
                                          acquisition_weight = 0.1,       # exploration-exploitation trade-off
                                          model_type = 'GP',
                                          num_cores = cores,
                                          normalize_Y = True,
                                          evaluator_type = 'local_penalization',
                                          report_file = 'DoE_log.dat',
                                          batch_size = bsize,
                                          X = X_train,
                                          Y = np.atleast_2d(Y_train),
                                          initial_design_numdata = len(X_train))
I had two queries:
In the context of design of experiments, where we have labels for a set of experimentally probed designs and a machine-learning model (like the SVR above), isn't it sufficient to just supply the SVR predictor as the function to optimize with model_type = 'None'? Currently there doesn't seem to be a 'None' option for model_type, and I suspect using 'GP' fits another surrogate model, which may not be necessary.
[One of the earlier posts mentions iteratively asking for the next suggested sample; however, I believe it still uses the default 'GP' surrogate model: "How to initiate a GPyOpt if I do not know the functional form" (#81)]
What should the dimensions of X and Y be when supplying initial data for Bayesian optimization? I am currently forcing both X and Y to be 2D numpy arrays, but I ran into a "flapack error". Here's the full trace:
Traceback (most recent call last):
File "svm_bayes_opt_v1.py", line 586, in
main()
File "svm_bayes_opt_v1.py", line 454, in main
initial_design_numdata = len(X_train))
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/methods/bayesian_optimization.py", line 244, in init
self.run_optimization(max_iter=0,verbosity=self.verbosity)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/methods/bayesian_optimization.py", line 458, in run_optimization
super(BayesianOptimization, self).run_optimization(max_iter = max_iter, max_time = max_time, eps = eps, verbosity=verbosity, save_models_parameters = save_models_parameters, report_file = report_file, evaluations_file= evaluations_file, models_file=models_file)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/core/bo.py", line 103, in run_optimization
self._update_model()
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/core/bo.py", line 196, in _update_model
self.model.updateModel(self.X, self.Y,self.suggested_sample,self.Y_new)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/models/gpmodel.py", line 81, in updateModel
if self.model is None: self._create_model(X_all, Y_all)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPyOpt/models/gpmodel.py", line 64, in _create_model
self.model = GPy.models.GPRegression(X, Y, kernel=kern, noise_var=noise_var)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/parameterized.py", line 54, in call
self.initialize_parameter()
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/parameter_core.py", line 331, in initialize_parameter
self.trigger_update()
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/updateable.py", line 79, in trigger_update
self._trigger_params_changed(trigger_parent)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/parameter_core.py", line 128, in _trigger_params_changed
self.notify_observers(None, None if trigger_parent else -np.inf)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/observable.py", line 91, in notify_observers
[callble(self, which=which) for _, _, callble in self.observers]
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/observable.py", line 91, in
[callble(self, which=which) for _, _, callble in self.observers]
File "/home/sgupta78/sk17/lib/python3.5/site-packages/paramz/core/parameter_core.py", line 498, in _parameters_changed_notification
self.parameters_changed()
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPy/core/gp.py", line 193, in parameters_changed
self.posterior, self._log_marginal_likelihood, self.grad_dict = self.inference_method.inference(self.kern, self.X, self.likelihood, self.Y_normalized, self.mean_function, self.Y_metadata)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPy/inference/latent_function_inference/exact_gaussian_inference.py", line 47, in inference
alpha, _ = dpotrs(LW, YYT_factor, lower=1)
File "/home/sgupta78/sk17/lib/python3.5/site-packages/GPy/util/linalg.py", line 126, in dpotrs
return lapack.dpotrs(A, B, lower=lower)
_flapack.error: failed in converting 2nd argument `b' of _flapack.dpotrs to C/Fortran array
Thanks,
Sanjan
A GP model is fitted to learn the mapping between the hyper-parameters of your SVR and its performance. Note that this is different from the actual model you are tuning, the SVR.
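If the goal is only to get the next batch of suggestions from already-labeled data (keeping the SVR as your own predictor), one option is GPyOpt's external-objective pattern: pass no objective function and ask for suggestions explicitly. A minimal sketch, assuming domain, X_train, Y_train and bsize are defined as in the snippet above; note that GPyOpt still fits its own GP surrogate internally, which is what the acquisition is computed on:

import numpy as np
import GPyOpt

# Sketch: GPyOpt does not evaluate any objective itself; suggestions come from the supplied data.
opt = GPyOpt.methods.BayesianOptimization(f = None,
                                          domain = domain,
                                          X = X_train,
                                          Y = np.asarray(Y_train).reshape(-1, 1),  # column vector
                                          acquisition_type = 'LCB',
                                          evaluator_type = 'local_penalization',
                                          batch_size = bsize,
                                          normalize_Y = True)
x_next = opt.suggest_next_locations()  # next batch of candidate designs to label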
You are correct, both inputs and outputs should be 2D arrays.
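Concretely, X should have shape (n_points, n_dims) and Y should be a column vector of shape (n_points, 1). A likely cause of the flapack error above is that np.atleast_2d on a 1D Y_train produces shape (1, n_points) instead of a column vector; a small sketch of the conversion, using the same variable names as in the question:

import numpy as np

X = np.asarray(X_train)                 # shape (n_points, n_dims)
Y = np.asarray(Y_train).reshape(-1, 1)  # shape (n_points, 1); np.atleast_2d would give (1, n_points)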
We recently made a new release. Can you check whether your issue still appears?
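A quick way to confirm which version is installed (a small sketch; GPyOpt exposes a version string at import time):

import GPyOpt
print(GPyOpt.__version__)  # compare against the latest GPyOpt release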
import GPy
from emukit.model_wrappers.gpy_model_wrappers import GPyModelWrapper
from emukit.bayesian_optimization.acquisitions import ExpectedImprovement
from emukit.core.optimization import AcquisitionOptimizer

# X_init, Y_init: initial design points and their observed values (assumed to be 2D numpy arrays)
kernel = GPy.kern.Matern52(X_init.shape[1], ARD=True)
gp = GPyModelWrapper(GPy.models.GPRegression(X_init, Y_init, kernel, noise_var=1e-10))
This gives the error:
~\Anaconda3\lib\site-packages\gpy-1.9.6-py3.6-win-amd64.egg\GPy\util\linalg.py in dpotrs(A, B, lower):
error: failed in converting 2nd argument `b' of _flapack.dpotrs to C/Fortran array
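The same shape mismatch is a plausible cause here too: GPy.models.GPRegression expects X_init and Y_init as 2D arrays with Y_init a column vector. A one-line check before constructing the model (assuming Y_init may arrive as a 1D array):

import numpy as np

Y_init = np.asarray(Y_init).reshape(-1, 1)  # ensure shape (n_points, 1) before GPRegression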