BT-0022: AI Stack — Ollama + Open WebUI + Stable Diffusion#263

Open
sungdark wants to merge 3 commits into illbnm:master from sungdark:feature/BT-0022-ai-stack
Conversation

@sungdark
Summary

Implements the AI Stack for the homelab-stack project — a complete local AI inference platform.

Changes

docker-compose.yml

  • Added 4 services:
    • Ollama — LLM inference engine
    • Open WebUI — Chat UI, pre-connected to Ollama
    • Stable Diffusion — Image generation (optional, --profile sd)
    • Perplexica (`ghcr.io/itzcrazykns1337/perplexica:latest`) — AI-powered search engine
  • GPU-adaptive via deploy.resources.reservations.devices for Ollama + Stable Diffusion
  • Healthchecks on all 4 containers; open-webui waits for ollama with condition: service_healthy
  • Traefik labels with letsencrypt TLS on all services
  • Security: no-new-privileges:true on all containers
  • Watchtower auto-update enabled via com.centurylinklabs.watchtower.enable=true
  • Networks: internal ai bridge + external proxy (Traefik)
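The service layout described above can be sketched in compose syntax. This is a minimal illustration, not the PR's actual file: the image tags, volume name, healthcheck command, and the `DOMAIN` variable are assumptions, and the NVIDIA device reservation is hardcoded here where the PR switches it via `ENABLE_GPU`.

```yaml
services:
  ollama:
    image: ollama/ollama:latest          # tag is an assumption
    security_opt:
      - no-new-privileges:true
    volumes:
      - ollama-data:/root/.ollama
    healthcheck:
      test: ["CMD", "ollama", "list"]    # assumes the ollama CLI is in the image
      interval: 30s
      timeout: 5s
      retries: 5
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia             # PR makes this GPU-adaptive via ENABLE_GPU
              count: all
              capabilities: [gpu]
    labels:
      - traefik.enable=true
      - traefik.http.routers.ollama.rule=Host(`ollama.${DOMAIN}`)  # DOMAIN is assumed
      - traefik.http.routers.ollama.tls.certresolver=letsencrypt
      - com.centurylinklabs.watchtower.enable=true
    networks:
      - ai
      - proxy

  open-webui:
    image: ghcr.io/open-webui/open-webui:latest
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      ollama:
        condition: service_healthy       # waits for Ollama's healthcheck to pass
    networks:
      - ai
      - proxy

networks:
  ai:
    internal: true                       # inter-service traffic only
  proxy:
    external: true                       # shared with Traefik

volumes:
  ollama-data:
```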

.env.example

  • Added ENABLE_GPU (cpu/nvidia/amd), OLLAMA_PORT, SD_ARGS, SEARCH_MODELS, SIMILARITY_MODEL
  • Full documentation of all variables
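For illustration, the variables listed above might look like this in `.env.example`. Variable names come from the PR; all values shown are assumed defaults, not the file's actual contents.

```yaml
# GPU mode: cpu (default), nvidia, or amd
ENABLE_GPU=cpu
# Host port for the Ollama API (11434 is Ollama's default)
OLLAMA_PORT=11434
# Extra CLI flags passed to Stable Diffusion (value is an assumption)
SD_ARGS=--api
# Perplexica model configuration (values are assumptions)
SEARCH_MODELS=llama3
SIMILARITY_MODEL=nomic-embed-text
```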

README.md

  • Architecture diagram
  • GPU mode selection table (CPU / NVIDIA CUDA / AMD ROCm)
  • Deployment instructions for all GPU modes
  • Routing table for all 4 services
  • Troubleshooting section

GPU Configuration

| Mode | `ENABLE_GPU` | Command |
| --- | --- | --- |
| CPU only | `cpu` (default) | `docker compose up -d` |
| NVIDIA GPU | `nvidia` | `ENABLE_GPU=nvidia docker compose up -d` |
| AMD GPU | `amd` | `ENABLE_GPU=amd docker compose up -d` |
| + Stable Diffusion | (any) | `docker compose --profile sd up -d` |
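The modes and the profile compose, so a full NVIDIA deployment with Stable Diffusion could look like the sketch below. The `curl` smoke test assumes Ollama is published on its default port 11434 (`OLLAMA_PORT`).

```shell
# NVIDIA GPU deployment including the optional Stable Diffusion profile
ENABLE_GPU=nvidia docker compose --profile sd up -d

# Confirm all containers report healthy
docker compose ps

# Smoke-test the Ollama API (lists pulled models)
curl -s http://localhost:11434/api/tags
```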

Related Issues

Checklist

  • YAML validates (docker compose config)
  • All 4 services have healthchecks
  • README is complete and accurate
  • .env.example documents all variables

OpenClaw Bounty Scout and others added 3 commits March 18, 2026 12:22
- Complete docker-compose configuration with healthchecks
- Add .env.example for environment configuration
- Add comprehensive README documentation following the project pattern
- Includes all required security hardening: no-new-privileges, watchtower auto-update enabled
- Ready for deployment
- 4 services: Ollama, Open WebUI, Stable Diffusion, Perplexica
- GPU-adaptive: NVIDIA CUDA, AMD ROCm, CPU fallback via ENABLE_GPU
- Healthchecks on all containers with service_healthy dependencies
- Traefik labels with letsencrypt TLS on all services
- Security: no-new-privileges, Watchtower auto-update
- Internal 'ai' network + external 'proxy' network
- Comprehensive README with architecture, routing, and troubleshooting
- .env.example with all GPU mode options documented

Closes illbnm#6

Successfully merging this pull request may close these issues.

[BOUNTY $220] AI Stack — Ollama + Open WebUI + Stable Diffusion