Ollama + Open WebUI

Docker Hub | YouTube video

---
version: "3.6"
services:
    ollama:
        image: ollama/ollama
        restart: unless-stopped
        container_name: ollama
        ports:
            - '11434:11434'
        volumes:
            - '/srv/Files/OllamaGUI/ollama:/root/.ollama'
    open-webui:
        image: 'ghcr.io/open-webui/open-webui:main'
        restart: unless-stopped
        container_name: open-webui
        volumes:
            - '/srv/Files/OllamaGUI/open-webui:/app/backend/data'
        environment:
            - 'OLLAMA_BASE_URL=http://192.168.86.2:11434/' # host machine IP
            - 'RAG_EMBEDDING_MODEL_AUTO_UPDATE=True' # fixes the embedding model failing to load
        ports:
            - '2039:8080'
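Since both containers are defined in the same Compose file, they share the default Compose network, so Open WebUI can usually reach Ollama by its service name instead of a hardcoded LAN IP. A minimal sketch of the alternative environment entry (assuming the default network, no custom DNS):

```yaml
    open-webui:
        environment:
            # 'ollama' resolves to the ollama service on the shared Compose network
            - 'OLLAMA_BASE_URL=http://ollama:11434'
```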
Pull and run a model from inside the container:

docker exec -it ollama ollama run <model>

Or via Portainer: open the Ollama container > exec console > ollama run <model>

A good one to start with is mistral:7b-instruct-q4_0. Some models to try:

mistral:7b-instruct-q4_0
mixtral:8x7b
llama2
codegemma:7b-code
openchat
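Pulled models can also be exercised over Ollama's HTTP API (the same API Open WebUI talks to). A minimal Python sketch that builds a request for the /api/generate endpoint, assuming Ollama is reachable at localhost:11434 as in the compose file above; the actual network call is left commented out:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # adjust to your host/port mapping

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a response stream
    }
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("mistral:7b-instruct-q4_0", "Say hello in one word.")
# Uncomment to actually query a running Ollama instance:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The model name must match one already pulled with `ollama run` or `ollama pull`; with `stream` set to true instead, the endpoint returns newline-delimited JSON chunks.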

Models list

More info