
Ollama support #38

Open · JCPOCSIN opened this issue on Jan 8, 2025 · 2 comments
Labels: enhancement (New feature or request)

Comments


JCPOCSIN commented Jan 8, 2025

Hi team!

Do you plan to add Ollama support to AgentKit?
It would let me use your product at work: as a Swiss state entity, we can only use LLMs hosted locally in our own infrastructure.

Cheers!
JC

@djfarrelly (Member)

Hey @JCPOCSIN - we can add Ollama support given the right config options. The models are currently defined in our main JS repo and imported through AgentKit; we plan to extract them into a neutral package at some point.

https://github.com/inngest/inngest-js/tree/main/packages/inngest/src/components/ai/models

Is your Ollama model open to the internet w/ a base URL and auth, or is it on the closed network?
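In the meantime, one possible workaround (a sketch, not AgentKit's own API): Ollama exposes an OpenAI-compatible endpoint under /v1, so any OpenAI-style client can target it with a custom base URL. The base URL and model name below are assumptions for a default local install.

```ts
import OpenAI from "openai";

// Ollama serves an OpenAI-compatible API under /v1 on its default port.
// The apiKey is required by the client library but ignored by Ollama.
const ollama = new OpenAI({
  baseURL: "http://127.0.0.1:11434/v1",
  apiKey: "ollama",
});

const completion = await ollama.chat.completions.create({
  model: "llama3.1", // any model previously fetched with `ollama pull`
  messages: [{ role: "user", content: "Say hello from a local model." }],
});

console.log(completion.choices[0].message.content);
```

If AgentKit's model config eventually accepts a custom base URL, the same endpoint should slot in without further changes on the Ollama side.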

djfarrelly added the enhancement (New feature or request) label on Jan 10, 2025
@JCPOCSIN (Author)

Hi @djfarrelly,
For now it's on a lab server in a closed network, directly accessible over plain HTTP on the standard port 11434. I also run a local instance at http://127.0.0.1:11434.
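For a closed-network setup like this, a quick reachability check can be done against Ollama's native API (a sketch; the host below assumes the local instance mentioned above):

```ts
// GET /api/tags lists the models installed on an Ollama server.
const res = await fetch("http://127.0.0.1:11434/api/tags");
if (!res.ok) throw new Error(`Ollama unreachable: ${res.status}`);

const { models } = (await res.json()) as { models: { name: string }[] };
console.log("Available models:", models.map((m) => m.name).join(", "));
```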
