This repository has been archived by the owner on Feb 23, 2023. It is now read-only.

Handling invalid values #136

Open
mtanti opened this issue Dec 12, 2017 · 5 comments

Comments


mtanti commented Dec 12, 2017

Is there a way to make the objective function signal that the result of evaluating a suggested X value was NaN or an error? At the moment I'm returning a very large cost to mark it as a very bad X value, but I think that will not work well with normalized Y values.


apaleyes (Collaborator) commented Dec 13, 2017

Nope, there is nothing built into the library for that. Although it is not very clear what you mean by "signal"; could you elaborate, please?

There are a few options I can think of anyway:

  1. Subclass "Objective" or even "SingleObjective" with your own code that handles execution issues (a rough sketch of a lightweight variant of this is below)
  2. If you know the problematic points beforehand, the API now allows you to specify those, and they will be ignored
  3. You can control the loop and objective evaluation yourself and use GPyOpt only to suggest candidate x values
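
To illustrate the first option in its simplest form, here is a rough, untested sketch that wraps your own evaluation function instead of subclassing; `train_and_evaluate`, the toy domain, and the penalty value are placeholders, and note that this is still the large-cost workaround you described, with the same caveat about normalized Y values:

```python
import numpy as np
import GPyOpt

def train_and_evaluate(x):
    # Toy stand-in for a real training run: pretend evaluations blow up (NaN) for x < 0.
    x = float(np.ravel(x)[0])
    return float('nan') if x < 0 else (x - 2.0) ** 2

def safe_objective(X, penalty=1e6):
    # GPyOpt passes candidate points as a 2-D array and expects a 2-D array of
    # objective values back; exceptions and NaN/inf are replaced by a large finite cost.
    results = []
    for x in np.atleast_2d(X):
        try:
            y = train_and_evaluate(x)
        except Exception:
            y = penalty
        if not np.isfinite(y):
            y = penalty
        results.append([y])
    return np.array(results)

domain = [{'name': 'x', 'type': 'continuous', 'domain': (-5, 5)}]
opt = GPyOpt.methods.BayesianOptimization(f=safe_objective, domain=domain)
opt.run_optimization(max_iter=20)
print(opt.x_opt, opt.fx_opt)
```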


mtanti (Author) commented Dec 17, 2017

Thank you for your reply.

By "signal" I mean something like what is done in the hyperopt library, which requires the objective function to return a dictionary of values such as { 'status': 'ok', 'loss': 2.3 }. That way, if there is an invalid value, you can return { 'status': 'invalid' }, which will make the algorithm ignore that data point. Is there a way to do the same in GPyOpt?
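
For reference, the hyperopt pattern looks roughly like this (a sketch from memory using its STATUS_OK / STATUS_FAIL constants; `evaluate_model` is just a toy stand-in for the real training run):

```python
import math
from hyperopt import fmin, tpe, hp, STATUS_OK, STATUS_FAIL

def evaluate_model(x):
    # Toy stand-in for a real training run: the loss is NaN for negative x.
    return float('nan') if x < 0 else (x - 2.0) ** 2

def objective(x):
    loss = evaluate_model(x)
    if math.isnan(loss):
        # Marks the trial as failed so hyperopt ignores it instead of fitting to a fake loss.
        return {'status': STATUS_FAIL}
    return {'status': STATUS_OK, 'loss': loss}

best = fmin(objective, space=hp.uniform('x', -5, 5), algo=tpe.suggest, max_evals=50)
print(best)
```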

Do you know of any issues with returning very large losses for invalid values?

Regarding your solutions:

  1. The problem isn't handling execution issues but ignoring points that result in invalid values. If a set of hyperparameters results in an invalid value, I don't want the objective to return anything; I want the optimizer to try a different set of values instead.

  2. If you're using GPyOpt to tune hyperparameters for a neural network, you would not know beforehand which values will make your network end up giving you NaN values.

  3. This seems to be a solution, but it's also sort of creating my own derivative library. It would be nice to add this simple feature to GPyOpt. It doesn't need to break any existing code: just make the library allow the objective function to return NaN, inf, or None and ignore that value if it does. (A rough sketch of what I mean by controlling the loop myself is below.)
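
To make that concrete, here is an untested sketch of the kind of loop I mean. It assumes that BayesianOptimization accepts pre-evaluated X/Y with f=None and that suggest_next_locations takes an ignored_X argument for blacklisting points (otherwise the same invalid point would just be suggested again), so please treat it as illustrative only; `train_and_evaluate` is a toy stand-in for the real model:

```python
import numpy as np
import GPyOpt

def train_and_evaluate(x):
    # Toy stand-in for a real training run: the loss comes back NaN for negative x.
    x = float(np.ravel(x)[0])
    return float('nan') if x < 0 else (x - 2.0) ** 2

domain = [{'name': 'x', 'type': 'continuous', 'domain': (-5, 5)}]

# Seed with a couple of valid evaluations so the surrogate model can be fitted.
X = np.array([[1.0], [3.0]])
Y = np.array([[train_and_evaluate(x)] for x in X])
bad_X = np.empty((0, 1))  # points whose evaluation came back invalid

for _ in range(20):
    bo = GPyOpt.methods.BayesianOptimization(f=None, domain=domain, X=X, Y=Y)
    x_next = bo.suggest_next_locations(ignored_X=bad_X if len(bad_X) else None)
    y_next = train_and_evaluate(x_next)
    if not np.isfinite(y_next):
        # Remember the invalid point so it is not suggested again,
        # but keep it out of the model's training data.
        bad_X = np.vstack([bad_X, x_next])
        continue
    X = np.vstack([X, x_next])
    Y = np.vstack([Y, np.atleast_2d(y_next)])

print('best x:', X[np.argmin(Y)], 'best loss:', Y.min())
```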

javiergonzalezh (Member) commented:

That's a good point. If an evaluation of the objective gives a NaN, it should automatically be saved as an infeasible point. That would be an interesting feature to have. Is that something that you have implemented already? If so, it would be great if you could make a PR with that. We are happy to help with the integration.


cal58 commented Jun 19, 2018

Hi,

Has this been implemented? I've been struggling to get GPyOpt to find good parameters and I'm wondering if the fact that my loss is often NaN is the issue.


beldaz commented Dec 23, 2019

I got a suggestion on a related issue, #234, that may help. I've yet to make use of suggest_next_locations myself, so I can't currently provide an example of how to use it, but perhaps it's enough to get you started.
