Is it possible to use other AI tools like Gemini, Claude or DeepSeek? #157
Comments
+1 to this request. I use LocalAI to run my LLMs, and it exposes the ChatGPT (OpenAI) API. I would like to point Aria to my LocalAI instance (on my LAN), but the preferences pane does not allow me to point the plugin to the LAN address. LocalAI's API documentation may be found here: LocalAI. It should be a drop-in replacement for ChatGPT.
Hi @nurlan114 and @apstrom, you can connect to third-party LLMs as long as they provide compatible APIs (a "drop-in replacement"). Be aware that their behavior may still differ from OpenAI's. Please see here for how to change the API endpoint: https://github.com/lifan0127/ai-research-assistant?tab=readme-ov-file#preferences
Thanks @lifan0127. I appreciate your work on this and the timely response. The only issue with the preferences pane is that it does not query the models available in LocalAI (which are not necessarily GPT-4). A model-selection function would be needed to fetch the model list from LocalAI into the plugin. Here is an example of the function written for AnythingLLM, which integrates LocalAI and ChatGPT:
If a function similar to the above could be called from the preferences pane, users could select their own custom models in LocalAI. This function also includes the ability to select embedding models, which is a separate feature in AnythingLLM. I've proposed some modifications to the plugin in an effort to implement the above. I have no expertise in this field, so my proposed changes may not fully address how this would be implemented in your code. #158
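For illustration, a model-discovery call of the kind described above could look roughly like the sketch below. It targets the `GET /v1/models` endpoint that OpenAI-compatible servers such as LocalAI expose; the function and parameter names here are hypothetical, not taken from the plugin or AnythingLLM:

```python
import json
import urllib.request


def extract_model_ids(payload: dict) -> list:
    """Pull model IDs out of an OpenAI-style /v1/models response body."""
    return [m["id"] for m in payload.get("data", [])]


def list_models(base_url: str, api_key: str = "") -> list:
    """Query an OpenAI-compatible server (e.g. a LocalAI instance on the
    LAN) for the models it serves, so a preferences pane could populate
    a model-selection dropdown. base_url might be http://192.168.1.10:8080.
    """
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models", headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        return extract_model_ids(json.load(resp))
```

A preferences pane could call `list_models()` once at load time and offer the returned IDs in a dropdown instead of a free-text model field.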
Hi @apstrom, thanks for the detailed explanation. If you only need to specify a different model name, it is perhaps easier to update the model name in the Zotero config editor. If you need other customization, you may need to modify the code. I should also mention that Aria has switched to the OpenAI Assistants API in the latest development, to take advantage of the built-in vector store, message history (memory), and other features. If you plan to use third-party LLMs, please ensure they provide compatible capabilities.
It would be great if the ability to use Gemini, DeepSeek, Mistral, or local models could become part of the standard UI, so that non-technical users can make this choice. Individual or institutional preferences (e.g., for Gemini) or data-confidentiality rules may require EU-located Mistral or local, self-hosted models. I understand that the OpenAI API is a de facto standard, but it would be important to keep supporting that standard API so that other endpoints conforming to OpenAI formatting remain easy to plug in.
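As a sketch of why keeping the OpenAI wire format matters: any conforming endpoint can be targeted just by swapping the base URL, with the request body unchanged. The helper below builds an OpenAI-style chat-completion request; the function name and the example URL and model are hypothetical:

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str,
                       api_key: str = "") -> urllib.request.Request:
    """Build a POST request in the OpenAI chat-completions format.
    Because the format is a de facto standard, the same payload works
    against OpenAI, LocalAI, Mistral, or any other conforming server;
    only base_url changes.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=body, headers=headers, method="POST",
    )
```

Pointing the same call at `http://localhost:8080` (a LAN LocalAI instance) versus `https://api.openai.com` is then purely a configuration choice, which is what a standard-UI endpoint setting would expose.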
Thank you for developing this amazing plugin! It has been incredibly helpful in integrating AI capabilities with Zotero.
I noticed that the plugin currently supports ChatGPT, which is great. However, I was wondering if it would be possible to expand the plugin to support other AI tools, such as Gemini, Claude, or DeepSeek.