-
Hi, the app looks great. My issue: during setup it auto-detected my local Ollama, but I have another remote Ollama that I want to use instead, and I can't edit the config anymore. How do I override it? Thanks
Replies: 2 comments
-
Oh interesting! If you turn off the local Ollama instance for a minute, the UI will detect the error and offer an option to specify a custom URL. I didn't consider that someone might be running locally but prefer remote. The simplest option is the temporary-disable route mentioned above. Alternatively, you can set OLLAMA_BASE_URL in ~/.kiln_ai/settings.yaml.
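For example, a minimal sketch of that override (the exact key casing is an assumption here; the thread only names OLLAMA_BASE_URL and the file path, and 11434 is Ollama's default port):

```yaml
# ~/.kiln_ai/settings.yaml
# Hypothetical example: key name mirrors the OLLAMA_BASE_URL variable above.
# Point it at your remote Ollama host instead of the auto-detected local one.
OLLAMA_BASE_URL: http://my-remote-host:11434
```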
-
Yeah, and I forgot I had one running locally for very small models on my laptop. I will try that. I like your setup/install approach. I have Python apps, and it's great how you did it. I'll give it a try.