@maurerle
Created April 14, 2025 08:09
Open WebUI Docker Compose
services:
  # Ollama serves local LLMs; Open WebUI reaches it over the compose network.
  ollama:
    image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest}
    container_name: ollama
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - "11434:11434/tcp"
    restart: unless-stopped
    tty: true
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1 # alternatively, use `count: all` for all GPUs
              capabilities: [gpu]

  # Open WebUI frontend; talks to Ollama via the service name.
  open-webui:
    image: ghcr.io/open-webui/open-webui:cuda
    container_name: open-webui
    volumes:
      - ./open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - ${OPEN_WEBUI_PORT-3000}:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
    restart: unless-stopped

  # AUTOMATIC1111 Stable Diffusion WebUI with its HTTP API enabled (--api).
  stable-diffusion-webui:
    image: universonic/stable-diffusion-webui:minimal
    command: --api --no-half --no-half-vae --precision full
    runtime: nvidia
    container_name: stable-diffusion
    restart: unless-stopped
    ports:
      - "8080:8080/tcp"
    volumes:
      - ./stablediffusion/inputs:/app/stable-diffusion-webui/inputs
      - ./stablediffusion/textual_inversion_templates:/app/stable-diffusion-webui/textual_inversion_templates
      - ./stablediffusion/embeddings:/app/stable-diffusion-webui/embeddings
      - ./stablediffusion/extensions:/app/stable-diffusion-webui/extensions
      - ./stablediffusion/models:/app/stable-diffusion-webui/models
      - ./stablediffusion/localizations:/app/stable-diffusion-webui/localizations
      - ./stablediffusion/outputs:/app/stable-diffusion-webui/outputs
    cap_drop:
      - ALL
    cap_add:
      - NET_BIND_SERVICE
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1 # alternatively, use `count: all` for all GPUs
              capabilities: [gpu]
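The file reads two optional variables, OLLAMA_DOCKER_TAG (default latest) and OPEN_WEBUI_PORT (default 3000), from the environment or a .env file next to it; docker compose up -d then starts all three services on one network.

Since the stable-diffusion-webui container starts with --api, Open WebUI can also use it for image generation. Below is a minimal sketch of the extra environment entries to merge into the open-webui service, assuming Open WebUI's AUTOMATIC1111 image-generation settings (ENABLE_IMAGE_GENERATION, IMAGE_GENERATION_ENGINE, AUTOMATIC1111_BASE_URL; check the exact variable names against the Open WebUI release you run) and that the API is reachable on the service's internal port 8080 as implied by the port mapping above:

  open-webui:
    environment:
      # Assumed Open WebUI settings; verify against your Open WebUI version.
      - 'ENABLE_IMAGE_GENERATION=True'
      - 'IMAGE_GENERATION_ENGINE=automatic1111'
      # The compose service name resolves on the default network.
      - 'AUTOMATIC1111_BASE_URL=http://stable-diffusion-webui:8080'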