[Docs] Add Docker protip and Docker Compose example #1858

Merged (2 commits) on Oct 18, 2024
6 changes: 3 additions & 3 deletions docs/assets/stylesheets/extra.css
@@ -1040,9 +1040,9 @@ html .md-footer-meta.md-typeset a:is(:focus,:hover) {

.md-typeset .tabbed-labels--linked>label>a code {
/*MKDocs Insiders fix*/
/*background: initial;*/
font-weight: 600;
color: var(--md-primary-fg-color);
background: initial;
font-weight: 700;
color: var(--md-typeset-color);
}

.md-typeset .highlight :is(.nd,.ni,.nl,.nt),
70 changes: 70 additions & 0 deletions docs/docs/guides/protips.md
@@ -97,6 +97,76 @@ This allows you to access the remote `8501` port on `localhost:8501` while the C
production-grade service deployment not offered by tasks, such as HTTPS domains and auto-scaling.
If you run a web app as a task and it works, go ahead and run it as a service.

## Docker and Docker Compose

All backends except `runpod`, `vastai`, and `kubernetes` allow using Docker and Docker Compose
inside `dstack` runs. To do that, a few additional configuration steps are required:

1. Set the `privileged` property to `true`.
2. Set the `image` property to `dstackai/dind` (or another DinD image).
3. For tasks and services, add `start-dockerd` as the first command. For dev environments, add `start-dockerd` as the first command
   in the `init` property.

Note that `start-dockerd` is part of the `dstackai/dind` image. If you use a different DinD image,
replace it with the corresponding command to start the Docker daemon.

=== "Task"
<div editor-title="examples/misc/dind/task.dstack.yml">

```yaml
type: task
name: task-dind

privileged: true
image: dstackai/dind

commands:
- start-dockerd
- docker compose up
```

</div>

=== "Dev environment"
<div editor-title="examples/misc/dind/.dstack.yml">

```yaml
type: dev-environment
name: vscode-dind

privileged: true
image: dstackai/dind

ide: vscode

init:
- start-dockerd
```

</div>
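
=== "Service"
<div editor-title="service.dstack.yml">

A service follows the same pattern; below is a minimal sketch (the name and port are illustrative, not taken from the repository):

```yaml
type: service
name: service-dind

privileged: true
image: dstackai/dind

# Port your Compose application listens on (illustrative)
port: 8080

commands:
  - start-dockerd
  - docker compose up
```

</div>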

??? info "Volumes"

To persist Docker data between runs (e.g., images, containers, and volumes), create a `dstack` [volume](../concepts/volumes.md)
and attach it in your run configuration:

```yaml
type: dev-environment
name: vscode-dind

privileged: true
image: dstackai/dind
ide: vscode

init:
- start-dockerd

volumes:
- name: docker-volume
path: /var/lib/docker
```
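
The volume itself is defined in a separate configuration and created with `dstack apply`. A minimal sketch (the backend, region, and size below are placeholders; adjust them to your setup):

```yaml
type: volume
name: docker-volume

# Placeholders: use the backend, region, and size that match your setup
backend: aws
region: eu-west-1
size: 100GB
```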

See more Docker examples [here](https://github.com/dstackai/dstack/tree/master/examples/misc/docker-compose).
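
To try one of the configurations above, apply it with the CLI (the path below matches the tab titles above):

```shell
$ dstack apply -f examples/misc/dind/task.dstack.yml
```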

## Environment variables

If a configuration requires an environment variable that you don't want to hardcode in the YAML, you can define it
8 changes: 7 additions & 1 deletion docs/docs/reference/dstack.yml/dev-environment.md
@@ -46,7 +46,9 @@ ide: vscode
ide: vscode
```

### Docker image
### Docker

If you want, you can specify your own Docker image via `image`.

<div editor-title=".dstack.yml">

@@ -82,6 +84,10 @@ ide: vscode
ide: vscode
```

!!! info "Docker and Docker Compose"
All backends except `runpod`, `vastai`, and `kubernetes` also allow using [Docker and Docker Compose](../../guides/protips.md#docker-and-docker-compose)
inside `dstack` runs.

### Resources { #_resources }

If you specify memory size, you can either specify an explicit size (e.g. `24GB`) or a
8 changes: 7 additions & 1 deletion docs/docs/reference/dstack.yml/service.md
@@ -58,7 +58,9 @@ port: 8000

</div>

### Docker image
### Docker

If you want, you can specify your own Docker image via `image`.

<div editor-title="service.dstack.yml">

@@ -102,6 +104,10 @@ port: 8000
port: 8000
```

!!! info "Docker and Docker Compose"
All backends except `runpod`, `vastai`, and `kubernetes` also allow using [Docker and Docker Compose](../../guides/protips.md#docker-and-docker-compose)
inside `dstack` runs.

### Model gateway { #model-mapping }

By default, if you run a service, its endpoint is accessible at `https://<run name>.<gateway domain>`.
8 changes: 7 additions & 1 deletion docs/docs/reference/dstack.yml/task.md
@@ -82,7 +82,9 @@ When running it, `dstack run` forwards `6000` port to `localhost:6000`, enabling

[//]: # (See [tasks]&#40;../../tasks.md#configure-ports&#41; for more detail.)

### Docker image
### Docker

If you want, you can specify your own Docker image via `image`.

<div editor-title=".dstack.yml">

@@ -123,6 +125,10 @@ commands:
- python fine-tuning/qlora/train.py
```

!!! info "Docker and Docker Compose"
All backends except `runpod`, `vastai`, and `kubernetes` also allow using [Docker and Docker Compose](../../guides/protips.md#docker-and-docker-compose)
inside `dstack` runs.

### Resources { #_resources }

If you specify memory size, you can either specify an explicit size (e.g. `24GB`) or a
13 changes: 13 additions & 0 deletions examples/misc/docker-compose/.dstack.yml
@@ -0,0 +1,13 @@
type: dev-environment
name: vscode-dind

privileged: true
image: dstackai/dind
ide: vscode
init:
- start-dockerd

spot_policy: auto

resources:
gpu: 1
185 changes: 185 additions & 0 deletions examples/misc/docker-compose/README.md
@@ -0,0 +1,185 @@
# Docker Compose

All backends except `runpod`, `vastai`, and `kubernetes` allow using Docker and Docker Compose
inside `dstack` runs.

This example shows how to deploy Hugging Face [Chat UI :material-arrow-top-right-thin:{ .external }](https://huggingface.co/docs/chat-ui/index){:target="_blank"}
with [TGI :material-arrow-top-right-thin:{ .external }](https://huggingface.co/docs/text-generation-inference/en/index){:target="_blank"}
serving [Llama-3.2-3B-Instruct :material-arrow-top-right-thin:{ .external }](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct){:target="_blank"}
using [Docker Compose :material-arrow-top-right-thin:{ .external }](https://docs.docker.com/compose/){:target="_blank"}.

??? info "Prerequisites"
Once `dstack` is [installed](https://dstack.ai/docs/installation), go ahead and clone the repo, then run `dstack init`.

<div class="termy">

```shell
$ git clone https://github.com/dstackai/dstack
$ cd dstack
$ dstack init
```

</div>

## Deployment

### Running as a task

=== "`task.dstack.yml`"

<div editor-title="examples/misc/docker-compose/task.dstack.yml">

```yaml
type: task
name: chat-ui-task

privileged: true
image: dstackai/dind
env:
- MODEL_ID=meta-llama/Llama-3.2-3B-Instruct
- HF_TOKEN
commands:
- start-dockerd
- docker compose up
ports:
- 9000

# Use either spot or on-demand instances
spot_policy: auto

resources:
# Required resources
gpu: "NVIDIA:16GB.."
```

</div>

=== "`compose.yaml`"

<div editor-title="examples/misc/docker-compose/compose.yaml">

```yaml
services:
app:
image: ghcr.io/huggingface/chat-ui:sha-bf0bc92
command:
- bash
- -c
- |
echo MONGODB_URL=mongodb://db:27017 > .env.local
echo MODELS='`[{
"name": "${MODEL_ID?}",
"endpoints": [{"type": "tgi", "url": "http://tgi:8000"}]
}]`' >> .env.local
exec ./entrypoint.sh
ports:
- 127.0.0.1:9000:3000
depends_on:
- tgi
- db

tgi:
image: ghcr.io/huggingface/text-generation-inference:sha-704a58c
volumes:
- tgi_data:/data
environment:
HF_TOKEN: ${HF_TOKEN?}
MODEL_ID: ${MODEL_ID?}
PORT: 8000
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: all
capabilities: [gpu]

db:
image: mongo:latest
volumes:
- db_data:/data/db

volumes:
tgi_data:
db_data:
```

</div>

### Deploying as a service

If you'd like to deploy Chat UI as an auto-scalable and secure endpoint,
use the service configuration. You can find it at [`examples/misc/docker-compose/service.dstack.yml` :material-arrow-top-right-thin:{ .external }](https://github.com/dstackai/dstack/blob/master/examples/misc/docker-compose/service.dstack.yml).
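
For reference, a minimal sketch of roughly what such a service configuration contains (the file in the repository is authoritative; details such as the name or port may differ):

<div editor-title="service.dstack.yml">

```yaml
type: service
name: chat-ui-service

privileged: true
image: dstackai/dind
env:
  - MODEL_ID=meta-llama/Llama-3.2-3B-Instruct
  - HF_TOKEN
commands:
  - start-dockerd
  - docker compose up
# Port the Chat UI container is published on inside the run
port: 9000

# Use either spot or on-demand instances
spot_policy: auto

resources:
  # Required resources
  gpu: "NVIDIA:16GB.."
```

</div>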

### Running a configuration

To run a configuration, use the [`dstack apply`](https://dstack.ai/docs/reference/cli/index.md#dstack-apply) command.

<div class="termy">

```shell
$ HF_TOKEN=...

$ dstack apply -f examples/misc/docker-compose/task.dstack.yml

# BACKEND REGION RESOURCES SPOT PRICE
1 runpod CA-MTL-1 18xCPU, 100GB, A5000:24GB yes $0.12
2 runpod EU-SE-1 18xCPU, 100GB, A5000:24GB yes $0.12
3 gcp us-west4 27xCPU, 150GB, A5000:24GB:2 yes $0.23

Submit the run chat-ui-task? [y/n]: y

Provisioning...
---> 100%
```

</div>

## Persisting data

To persist data between runs, create a [volume](https://dstack.ai/docs/concepts/volumes/) and attach it to the run
configuration.

<div editor-title="examples/misc/docker-compose/task.dstack.yml">

```yaml
type: task
name: chat-ui-task

privileged: true
image: dstackai/dind
env:
- MODEL_ID=meta-llama/Llama-3.2-3B-Instruct
- HF_TOKEN
commands:
- start-dockerd
- docker compose up
ports:
- 9000

# Use either spot or on-demand instances
spot_policy: auto

resources:
# Required resources
gpu: "NVIDIA:16GB.."

volumes:
- name: my-dind-volume
path: /var/lib/docker
```

</div>

With this change, all Docker data—pulled images, containers, and crucially, volumes for database and model storage—will
be persisted.
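
The `my-dind-volume` volume referenced above must exist before the run starts; like other `dstack` configurations, it can be created with `dstack apply`. A minimal sketch (the file name, backend, region, and size are placeholders):

<div editor-title="vol.dstack.yml">

```yaml
type: volume
name: my-dind-volume

# Placeholders: use the backend, region, and size that match your setup
backend: aws
region: eu-west-1
size: 100GB
```

</div>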

## Source code

The source code of this example can be found in
[`examples/misc/docker-compose` :material-arrow-top-right-thin:{ .external }](https://github.com/dstackai/dstack/blob/master/examples/misc/docker-compose).

## What's next?

1. Check [dev environments](https://dstack.ai/docs/dev-environments), [tasks](https://dstack.ai/docs/tasks),
[services](https://dstack.ai/docs/services), and [protips](https://dstack.ai/docs/protips).
43 changes: 43 additions & 0 deletions examples/misc/docker-compose/compose.yaml
@@ -0,0 +1,43 @@
services:
app:
image: ghcr.io/huggingface/chat-ui:sha-bf0bc92
command:
- bash
- -c
- |
echo MONGODB_URL=mongodb://db:27017 > .env.local
echo MODELS='`[{
"name": "${MODEL_ID?}",
"endpoints": [{"type": "tgi", "url": "http://tgi:8000"}]
}]`' >> .env.local
exec ./entrypoint.sh
ports:
- 127.0.0.1:9000:3000
depends_on:
- tgi
- db

tgi:
image: ghcr.io/huggingface/text-generation-inference:sha-704a58c
volumes:
- tgi_data:/data
environment:
HF_TOKEN: ${HF_TOKEN?}
MODEL_ID: ${MODEL_ID?}
PORT: 8000
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: all
capabilities: [gpu]

db:
image: mongo:latest
volumes:
- db_data:/data/db

volumes:
tgi_data:
db_data: