Docker compose example using OLLAMA #949
Comments
The information is in the docs: https://docs.hoarder.app/Installation/docker#4-setup-openai under the blue dropdown that says: "
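For reference, the linked docs describe pointing Hoarder at a local Ollama server via environment variables. A minimal `.env` sketch is below; the variable names are taken from the Hoarder installation docs, but verify them against the docs for your Hoarder version, and the models shown are only the ones mentioned later in this thread:

```shell
# .env — hypothetical values, adjust host/port and models to your setup.
# When Ollama runs as a compose service on the same network, use the
# service name; when it runs on the host, host.docker.internal may be needed.
OLLAMA_BASE_URL=http://ollama:11434
INFERENCE_TEXT_MODEL=phi3:3.8b
INFERENCE_IMAGE_MODEL=moondream
```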
I tried that before posting a feature request, but I couldn't get it to work. I also did some googling and looked into issues raised by other Ollama users; that did not work either. Here is the compose I am using: services:
You'll have to post some logs from the hoarder container; otherwise we don't know what doesn't work.
I got this one, and it is working perfectly fine (on Linux), but I had to pull the models manually. The general problem is that it is not trivial to reach the host's services:

```yaml
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    container_name: mind
    restart: unless-stopped
    ports:
      - "3000:3000" # use this for running on localhost, can be removed when using traefik
    volumes:
      - ./data:/data
    env_file:
      - .env
    environment:
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
    networks:
      - hoarder-net
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
    networks:
      - hoarder-net
  meilisearch:
    image: getmeili/meilisearch:v1.11.1
    restart: unless-stopped
    env_file:
      - .env
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - ./meilisearch:/meili_data
    networks:
      - hoarder-net
  ollama:
    volumes:
      - ./ollama/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    # entrypoint: /bin/bash -c "ollama pull moondream && ollama pull phi3:3.8b && ollama pull snowflake-arctic-embed2 && tail -f /dev/null" # this sadly did not work
    restart: unless-stopped
    image: ollama/ollama:latest
    networks:
      - hoarder-net
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

networks:
  hoarder-net:
    name: hoarder-net
    driver: bridge
```
Describe the feature you'd like
An alternative docker compose example using Ollama instead of an OpenAI API key would be much appreciated.
Describe the benefits this would bring to existing Hoarder users
Ease of setting up a local LLM for tagging purposes.
Privacy.
Can the goal of this request already be achieved via other means?
Yes, but I have not been able to do it.
Have you searched for an existing open/closed issue?
Additional context
No response