
batch>1 does not work for local_penalization #183

Open
elzurdo opened this issue May 18, 2018 · 2 comments

@elzurdo

elzurdo commented May 18, 2018

GPyOpt version: 1.2.1
Python version: 3.6.5

I would like to run suggest_next_locations with batches of batch_size > 1 when setting evaluator_type='local_penalization', but there seems to be a problem: pred (see below) is a list where it should probably be a numpy array.

Doing:

import numpy as np
from GPyOpt.methods import BayesianOptimization

# Toy 1-D quadratic data to seed the model
x = np.linspace(-1, 1)
x = x.reshape([len(x), 1])
y = x ** 2

domain = [{'name': 'x', 'type': 'continuous', 'domain': (-1., 1.)}]

bo = BayesianOptimization(f=None, domain=domain, evaluator_type='local_penalization', batch_size=2,
                          X=x, Y=y, model_type='GP_MCMC', acquisition_type='EI_MCMC')
bo.suggest_next_locations()

I get the error message:
'<' not supported between instances of 'list' and 'float'

In more detail:

~/Work/Envs/develop/lib/python3.6/site-packages/GPyOpt/acquisitions/LP.py in _hammer_function_precompute(self, x0, L, Min, model)
     54         m = model.predict(x0)[0]
     55         pred = model.predict(x0)[1].copy()
---> 56         pred[pred<1e-16] = 1e-16
     57         s = np.sqrt(pred)
     58         r_x0 = (m-Min)/L

Locally I tried pred = np.array(pred), which fixed it for batch_size=2 but produced a different error message for batch_size=3. I will create a subsequent issue for that.
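For completeness, here is a minimal standalone sketch of why the comparison fails and what the local workaround does. The list-of-arrays shape for pred is my assumption about what the GP_MCMC model returns (one prediction per HMC sample); the conversion mirrors the pred = np.array(pred) change described above and is not an upstream fix.

import numpy as np

# pred as a plain Python list of per-sample predictions (assumed shape,
# mimicking what the GP_MCMC model appears to return)
pred = [np.array([[0.5], [1e-20]]), np.array([[0.3], [2e-17]])]

try:
    pred[pred < 1e-16] = 1e-16  # same operation as LP.py line 56 in the traceback
except TypeError as err:
    print(err)  # '<' not supported between instances of 'list' and 'float'

# Local workaround: convert to an ndarray first, then the elementwise
# comparison, clipping, and square root behave as LP.py expects.
pred = np.array(pred)
pred[pred < 1e-16] = 1e-16
s = np.sqrt(pred)
print(s)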

When using evaluator_type='random', the original code works fine.

The evaluator_type='thompson_sampling' option yields a different error that I will address in a subsequent issue.

@elzurdo
Author

elzurdo commented May 21, 2018

I now notice that on my system the GPyOpt_parallel_optimization.ipynb tutorial does work when I use BO_demo_parallel.suggest_next_locations().

I find that the problem (or my conceptual mistake?) is that I set model_type='GP_MCMC' and acquisition_type='EI_MCMC'. When using GP and EI (or LCB), respectively, it seems to work fine (see the sketch below).
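A minimal sketch of the working configuration on my system, reusing the same toy data as in the first snippet; only model_type and acquisition_type differ from the failing setup:

import numpy as np
from GPyOpt.methods import BayesianOptimization

# Same toy 1-D quadratic data as in the original snippet
x = np.linspace(-1, 1).reshape(-1, 1)
y = x ** 2
domain = [{'name': 'x', 'type': 'continuous', 'domain': (-1., 1.)}]

# Non-MCMC model and acquisition: local penalization with batch_size=2 works here
bo = BayesianOptimization(f=None, domain=domain,
                          evaluator_type='local_penalization', batch_size=2,
                          X=x, Y=y, model_type='GP', acquisition_type='EI')
print(bo.suggest_next_locations())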

Since I am not sure whether this is a bug in the code or, more likely, in my understanding, I leave it to the powers above to decide whether to close or further discuss this issue. @javiergonzalezh

@LarsHH

LarsHH commented Feb 5, 2019

+1
I get the same error when using GP_MCMC/EI_MCMC with batch size > 1 and local penalization. GP/EI works fine.
