Save and load kernel in GPy in a sparse gaussian process regression #535
Comments
I believe this is being addressed by the new serialization framework mentioned in #547 - still in progress.
Can this be done for other models, like GPRegression, in a similar manner?
@nbeuchat Were you able to eventually save your model/kernel using any of the new methods?

@mzwiessele I would appreciate it if you could take a look at my issue and give me a clue about what I might be doing wrong. Thanks!
@Amir-Arsalan I haven't used the new method at all, as I haven't used the framework for a while now. However, back in August 2017 I could easily save the parameters as I've shown, and I ended up creating a small module for that specific model.
Hi!
I hope this is the right place for this question. I have built and optimized a Sparse Gaussian Process Regression model using the GPy library. The documentation recommends saving the model as follows:
I am able to save the parameters of the model and recreate the model from them. However, I need to know in advance the kernel architecture that was used to build the model (defined in the function create_kernel below). To create and save the model, I do the following:
To load the model, I am doing the following at the moment. The problem is that I might not have access to the create_kernel function.
What is the best way to store the kernel for later use? The parameters of the kernel and the inducing inputs are stored in the gp_params.npy file, but not the structure of the kernel. At the moment, I have to know which function was used to create the model, which will not always be the case.
Thanks a lot for your help!
Nicolas