docker-home-server/docker-compose/ollama/docker-compose.yml
sickprodigy 23e5987799 feat: update ollama service configuration for NVIDIA support and environment variables
Only way I could mount my P100 GPU to Ollama in a Docker container. Still had to install extra stuff on Debian that was meant for Ubuntu. Didn't feel right, but did it anyway. Worked.
2025-12-12 17:07:09 -05:00
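
Note on the host-side prerequisite: runtime: nvidia only works once the NVIDIA Container Toolkit is installed and registered with Docker on the host, which is presumably the "extra stuff" from NVIDIA's Ubuntu apt instructions mentioned above. Before pointing Ollama at the card, a throwaway compose service that just runs nvidia-smi is an easy way to confirm the passthrough. This is a minimal sketch; the CUDA image tag is an assumption, not something taken from this repo.

services:
  gpu-check:
    image: nvidia/cuda:12.4.1-base-ubuntu22.04  # assumed tag; any CUDA base image works
    command: nvidia-smi                         # should list the P100 if passthrough works
    runtime: nvidia
    environment:
      - NVIDIA_VISIBLE_DEVICES=all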


services:
  ollama:
    image: docker.io/ollama/ollama:latest
    ports:
      - 7869:11434
    volumes:
      - /docker-containers/ollama/code:/code
      - /docker-containers/ollama/data:/root/.ollama
      # - /usr/local/cuda:/usr/local/cuda:ro # <-- mount CUDA runtime from host maybe
    container_name: ollama
    tty: true
    restart: always
    environment:
      - OLLAMA_KEEP_ALIVE=24h
      - OLLAMA_HOST=0.0.0.0
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=compute,utility
    # devices:
    #   - /dev/nvidia0:/dev/nvidia0
    #   - /dev/nvidiactl:/dev/nvidiactl
    #   - /dev/nvidia-uvm:/dev/nvidia-uvm
    runtime: nvidia
    networks:
      - homelab

networks:
  homelab:
    name: homelab # Networks can also be given a custom name
    external: true # This option causes compose to join the above network instead of making a _default one
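
As an alternative to the legacy runtime: nvidia key, newer Docker Compose versions can request the GPU through deploy.resources.reservations.devices. The sketch below shows the same service with that syntax; it is not what the commit above uses, and it still requires the NVIDIA Container Toolkit on the host.

services:
  ollama:
    image: docker.io/ollama/ollama:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or count: 1 to hand over a single card
              capabilities: [gpu]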