---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---
# Open WebUI

Open WebUI is a ChatGPT-style web UI for various LLM runners. Supported runners include Ollama and OpenAI-compatible APIs.
## Installing with Docker 🐳

- **Important:** When using Docker to install Open WebUI, make sure to include `-v open-webui:/app/backend/data` in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
- If Ollama is on your computer, use this command:

  ```bash
  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
- To build the container yourself, follow these steps:

  ```bash
  docker build -t open-webui .
  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui
  ```
- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000).
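If you find yourself re-running the command above with different settings, it can help to build it from variables first. A minimal sketch, assuming you want to review the command before running it (the `HOST_PORT`, `VOLUME_NAME`, and `IMAGE` variable names are our own, not Open WebUI settings):

```bash
#!/bin/sh
# Sketch: assemble the docker run command from variables so the host port
# and volume name are easy to adjust. Prints the command instead of
# executing it, so it can be reviewed first.
HOST_PORT=3000
VOLUME_NAME=open-webui
IMAGE=ghcr.io/open-webui/open-webui:main

CMD="docker run -d -p ${HOST_PORT}:8080 --add-host=host.docker.internal:host-gateway -v ${VOLUME_NAME}:/app/backend/data --name open-webui --restart always ${IMAGE}"

echo "$CMD"
```

Piping the echoed command through your own review (or into `sh` once you are satisfied) keeps the crucial `-v` volume flag from being dropped by accident.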
## Using Ollama on a Different Server
- To connect to Ollama on another server, change `OLLAMA_API_BASE_URL` to the server's URL:

  ```bash
  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```

  Or for a self-built container:

  ```bash
  docker build -t open-webui .
  docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui
  ```
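A malformed URL in `OLLAMA_API_BASE_URL` is easy to miss until the container fails to reach Ollama. A minimal sketch of a pre-flight check, matching the format of the example URL above (the `check_url` helper is hypothetical, not part of Open WebUI or Ollama):

```bash
#!/bin/sh
# Hypothetical helper: check that the URL has an http(s) scheme and no
# trailing slash before passing it to docker with -e OLLAMA_API_BASE_URL.
check_url() {
  case "$1" in
    */) echo "invalid: trailing slash" ;;
    http://*|https://*) echo "ok" ;;
    *) echo "invalid: missing http(s) scheme" ;;
  esac
}

check_url "https://example.com/api"   # prints "ok"
check_url "example.com/api"           # prints "invalid: missing http(s) scheme"
```

Running the check in the same script that launches the container lets you fail fast with a clear message instead of debugging a silent connection error later.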
Continue with the full getting started guide.