
Ollama is not local #111

Answered by scosman
Mehdi-Bl asked this question in Q&A
Oh interesting!

If you turn off the local Ollama instance for a minute, the UI will detect the error and offer an option to specify a custom URL. I hadn't considered that someone might be running Ollama locally but prefer to use a remote instance.

The simplest option is the temporary-disable route mentioned above. Alternatively, you can set OLLAMA_BASE_URL in ~/.kiln_ai/settings.yaml.
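As a minimal sketch of the second option, the entry below shows what the override might look like in ~/.kiln_ai/settings.yaml. The hostname is a placeholder, and 11434 is Ollama's default port; adjust both to match your remote instance.

```yaml
# ~/.kiln_ai/settings.yaml
# Point Kiln at a remote Ollama instance instead of localhost.
# "remote-host.example.com" is a placeholder hostname.
OLLAMA_BASE_URL: http://remote-host.example.com:11434
```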

Answer selected by scosman

This discussion was converted from issue #107 on January 15, 2025 05:10.