Does it support private deployment #8
Hey, I created a version of this app last week that addresses this: https://github.com/WissamAntoun/GalacTex
Thank you very much for your reply. I have read through the app carefully. Does it currently support only Galactica models? Can it work with URLs from other locally deployed models, for example chatglm + vLLM?
It was intended for Galactica, but it should support any model served by vLLM.
Interesting. I was not aware of vLLM/Galactica. Yesterday, I released version 1.4.0, which should allow connecting any vLLM model. With this editor, you can change the URL and configure the hyperparameters for the model. You can do that for each command and for all of them. You will still need to set the API key to enable the plugin, but it doesn't have to be a real API key.
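To illustrate the setup described above: connecting a self-hosted vLLM model usually means pointing the client at the server's OpenAI-compatible endpoint and supplying a placeholder API key, since vLLM does not validate keys by default. A minimal sketch of how such a request is assembled — the base URL, model name, and hyperparameters below are assumptions for illustration, not values taken from this thread:

```python
# Sketch: build an OpenAI-compatible chat completion request for a
# self-hosted vLLM server. vLLM exposes /v1/chat/completions when started
# with its OpenAI-compatible server, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model <model>
import json


def build_request(base_url, model, prompt, temperature=0.2, max_tokens=256):
    """Return the endpoint URL, headers, and JSON body for a chat request."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Any non-empty string works as the key if the client requires one;
        # it is a placeholder, not a real credential.
        "Authorization": "Bearer not-a-real-key",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    return url, headers, json.dumps(body)


# Hypothetical local deployment; adjust host, port, and model to your setup.
url, headers, payload = build_request(
    "http://localhost:8000", "THUDM/chatglm3-6b", "Summarize this section."
)
```

In the plugin, the same idea applies: the base URL points at the local server, and the API key field is filled with any placeholder value to satisfy the client.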
What a great job! This extension will definitely become popular.
Thanks! Let me know if it works. Help us spread the word by starring the project and adding a review on the extension homepage.
May I ask whether this plugin supports local models served in OpenAI format using vLLM?