ExactGP.predict and viGP.predict produce inconsistent shapes #49
Happy to attempt a fix if you'd like.
In the beginning, both `ExactGP` and `viGP` used to output identical shapes. That said, there is certainly an "asymmetry" between the fully Bayesian and variational inference tools currently available. I need to think a bit more about how to address it from a design point of view.
@ziatdinovmax I see what you're saying, but ultimately doesn't it make more sense from a design standpoint to have a common interface? It's going to make building tools on top of what you already have really difficult if each of the GPs behaves differently. IMO, making this really clear in the docs and implementing the consistent method is the way to go. That way, a GP that inherits from some base ABC has totally known behavior when calling `predict`.
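A minimal sketch of the common-interface idea, just to make the proposal concrete. All names here (`BaseGP`, the toy subclasses, and the fixed `(n_chains, n_samples, n_points)` output convention) are illustrative assumptions, not the actual gpax API:

```python
# Hypothetical sketch: an ABC that pins down the predict() contract,
# so every GP that inherits from it has a known output shape.
from abc import ABC, abstractmethod

import numpy as np


class BaseGP(ABC):
    """Abstract base class fixing the predict() contract for all GPs."""

    @abstractmethod
    def predict(self, X_new: np.ndarray) -> np.ndarray:
        """Return predictive samples of shape (n_chains, n_samples, n_points)."""


class ToyExactGP(BaseGP):
    def predict(self, X_new):
        # Stand-in for MCMC output: e.g. 2000 chains/post-warmup draws
        # x 200 posterior samples x n_points.
        return np.zeros((2000, 200, len(X_new)))


class ToyViGP(BaseGP):
    def predict(self, X_new):
        # Variational result promoted to the same 3-tensor convention,
        # with a leading singleton axis since there is only one "chain".
        return np.zeros((1, 200, len(X_new)))
```

With this, downstream tools can rely on `result.ndim == 3` regardless of which GP produced it.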
@matthewcarbone - adding it to the v0.3 'milestone' per our discussion
@ziatdinovmax yup, sounds good. To be clear, this would actually be a backwards-incompatible change (technically), since the shape of the returned object will change.
I guess the …
In fact, we can have …
Perfect, I really like the idea of …
The `predict` method on `ExactGP` and `viGP` produces results of different shapes. `ExactGP.predict` produces a 3-tensor, e.g. `(2000, 200, 100)`. `viGP.predict` produces a 1-tensor, e.g. `(100,)`.

Is there any way to standardize the output of these methods? Also, it appears 2000 is the number of samples after warmup, and 200 is the number of samples from the posterior. Maybe the output of `viGP.predict` should be `(1, 200, 100)` to make it consistent (since there's only a single value for e.g. `samples["k_length"]`). This should be easy enough to do by just using `mean, cov = self.get_mvn_posterior(X_new, samples, noiseless, **kwargs)` to draw 200 samples, I think. Let me know if I have this totally wrong.

In addition, it begs the question of whether an ABC for GPs should really be used. It would probably be best for the user if, in all cases possible, every core method of each GP produces the same type of object. I see that `viGP` inherits from `ExactGP`, but it might be best to have `ExactGP` (or whatever the most base class is) inherit from some ABC.