How can I use the HMC method to approximate a non-Gaussian likelihood? Thank you #554
Comments
I think there is an example notebook about HMC in GPy. Zhenwen should know more about the notebook.
@mu2013: Do you mean this notebook: http://nbviewer.jupyter.org/github/SheffieldML/notebook/blob/master/GPy/sampling_hmc.ipynb ? I know that tutorial, but I find it impossible to set the likelihood arbitrarily there.
How would you like to handle the intractable integral of the non-Gaussian likelihood? You can run HMC for a GP with a non-Gaussian likelihood using a Laplace approximation, but the samples are biased because of the approximation. (This can be done with GPy.) Alternatively, you can run HMC without marginalizing out the output variable of the GP prior, f, but this results in a very high-dimensional HMC sampling problem. (This is not implemented.)
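Whichever of the two routes you take, HMC itself only needs the log target density and its gradient. A minimal, generic HMC sampler in numpy (an illustrative sketch of the algorithm, not GPy's implementation; the function names are made up for this example):

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, x0, n_samples=500,
               step_size=0.1, n_leapfrog=20, seed=0):
    """Plain HMC: needs log p(x) and grad log p(x) -- this gradient
    requirement is exactly why an arbitrary likelihood is awkward."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)          # resample momentum
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis accept/reject corrects the integration error
        log_accept = (log_prob(x_new) - 0.5 * p_new @ p_new
                      - log_prob(x) + 0.5 * p @ p)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard 2-D Gaussian, so grad log p(x) = -x
samples = hmc_sample(lambda x: -0.5 * x @ x, lambda x: -x, np.zeros(2))
```

In the GP setting, `log_prob` would be the (approximate) log marginal posterior over hyperparameters for the first route, or the joint log posterior over f and hyperparameters for the second.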
@zhenwendai Thank you for your reply. I am new to GPs and can't follow everything you said above, for example "run HMC for a non-Gaussian likelihood GP with Laplace approximation". The reason I ask this question is that I find it possible in GPflow to use GPMC for MCMC inference (http://gpflow.readthedocs.io/en/latest/notebooks/mcmc.html), where I can set the likelihood arbitrarily. I just wonder why I can't do this with GPy.
According to the link that you provided, GPflow does the second approach that I mentioned previously. Its HMC sampler draws samples for f and the model parameters jointly, which is typically very high dimensional. Unfortunately, GPy does not support this at the moment, because by default we focus on models with the latent function f marginalized out (approximately).
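Concretely, the unmarginalized target that this second approach samples from is log p(y | f) + log N(f | 0, K): for N data points the state is already N-dimensional before any hyperparameters are added. A minimal numpy sketch of that joint log density for a Poisson likelihood with a log link (illustrative only, not GPflow's or GPy's code; kernel settings are arbitrary):

```python
import numpy as np
from scipy.special import gammaln

def rbf_kernel(X, variance=1.0, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix."""
    d2 = (X[:, None, :] - X[None, :, :]) ** 2
    return variance * np.exp(-0.5 * d2.sum(-1) / lengthscale**2)

def joint_log_posterior(f, y, K, jitter=1e-6):
    """log p(y | f) + log N(f | 0, K) with y_i ~ Poisson(exp(f_i)).
    An HMC sampler on this target moves all of f jointly, so the
    problem dimension grows with the number of data points."""
    rate = np.exp(f)
    log_lik = np.sum(y * f - rate - gammaln(y + 1))
    L = np.linalg.cholesky(K + jitter * np.eye(len(f)))
    alpha = np.linalg.solve(L, f)                 # whitened f
    log_prior = (-0.5 * alpha @ alpha
                 - np.sum(np.log(np.diag(L)))
                 - 0.5 * len(f) * np.log(2 * np.pi))
    return log_lik + log_prior

X = np.linspace(0, 1, 5)[:, None]
y = np.array([0, 1, 2, 1, 3])
K = rbf_kernel(X, lengthscale=0.3)
lp = joint_log_posterior(np.zeros(5), y, K)       # 5-D target already
```

Because both terms are differentiable in f, this target can be plugged straight into a gradient-based sampler like HMC; the cost is the dimensionality.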
HMC requires the gradient of the likelihood, but if you use MCMC with Metropolis-Hastings you do not need it. For the non-Gaussian likelihood case, what we can do is similar to Williams, C. K. I. and Barber, D. (1998), "Bayesian classification with Gaussian processes". You can check out that paper for some details.
Cheers
I set the Poisson distribution pdf as my GP likelihood and want to use the HMC method to infer its parameters. Here is my code:
The error shows:
So I want to know: is there any way that I can use HMC to infer the parameters of an arbitrary likelihood? Thank you for your help!