Hi team!

Do you plan to add Ollama support in AgentKit?
It would enable me to use your product at my work. As a Swiss state administration, we can only use LLMs hosted locally on our own infrastructure.
Cheers!
JC
Hey @JCPOCSIN - we can add Ollama support given the right config options. The models are currently defined in our main JS repo and imported through AgentKit. We plan to extract those into a neutral package at some point.
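In the meantime, here's a rough sketch of what an Ollama model helper could look like. To be clear, this is hypothetical: the `ollama()` factory, its option names, and the model shape are not part of AgentKit today. Only the `/api/generate` endpoint and its `{ model, prompt, stream }` request / `{ response }` reply come from Ollama's documented REST API:

```ts
// Hypothetical sketch only - not an AgentKit API.
interface OllamaModelOptions {
  model: string;   // e.g. "llama3"
  baseUrl?: string; // defaults to Ollama's standard port
}

export function ollama({ model, baseUrl = "http://127.0.0.1:11434" }: OllamaModelOptions) {
  return {
    provider: "ollama",
    model,
    async complete(prompt: string): Promise<string> {
      // /api/generate is Ollama's native REST endpoint;
      // stream: false returns a single JSON object.
      const res = await fetch(`${baseUrl}/api/generate`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model, prompt, stream: false }),
      });
      if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
      const data = (await res.json()) as { response: string };
      return data.response;
    },
  };
}
```

The key design point is that `baseUrl` stays configurable, so the same helper would cover both a shared server on a closed network and a local instance.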
Hi @djfarrelly,
For now it's on a lab server in a closed network, directly accessible over HTTP on the standard port (11434). I also use a local instance at http://127.0.0.1:11434.
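Worth noting for a setup like this: Ollama also exposes an OpenAI-compatible API under `/v1`, so any OpenAI-style client can already target either endpoint by overriding the base URL. A minimal sketch (the `OLLAMA_URL` env var name and the model tag are placeholders for your own values):

```ts
import OpenAI from "openai";

// Point the official OpenAI client at Ollama's /v1 compatibility layer.
// Works the same for a closed-network lab server or a local instance.
const client = new OpenAI({
  baseURL: process.env.OLLAMA_URL ?? "http://127.0.0.1:11434/v1",
  apiKey: "ollama", // required by the client but ignored by Ollama
});

const completion = await client.chat.completions.create({
  model: "llama3",
  messages: [{ role: "user", content: "Hello from a closed network" }],
});

console.log(completion.choices[0].message.content);
```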