---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---

# Open WebUI

**Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline.** It supports various LLM runners, including Ollama and OpenAI-compatible APIs.











[](https://discord.gg/5rJgQTnV4s)
[](https://github.com/sponsors/tjbck)



## Quick Start with Docker 🐳

:::warning
When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` volume mapping in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
:::
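
Before relying on that volume, it's worth confirming Docker actually created it. A quick check with standard Docker CLI commands, assuming you keep the `open-webui` volume name used in the commands below:

```bash
# List volumes; `open-webui` should appear once the container has been created
docker volume ls

# Show details, including where the volume's data lives on the host
docker volume inspect open-webui
```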

### Installation with Default Configuration

- **If Ollama is on your computer**, use this command:

  ```bash
  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```

- **If Ollama is on a different server**, change `OLLAMA_BASE_URL` to that server's URL (a quick reachability check is sketched just after this list):

  ```bash
  docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```

- **To run Open WebUI with Nvidia GPU support**, use this command:

  ```bash
  docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
  ```
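
If you pointed `OLLAMA_BASE_URL` at a remote server, checking reachability first can save debugging time. A minimal sketch, where `https://example.com` stands in for your server's URL and we assume the server exposes Ollama's standard REST API:

```bash
# Should return a JSON list of the models installed on the remote server
curl https://example.com/api/tags
```

If the request fails, make sure Ollama on that server is listening on an externally reachable address (not only 127.0.0.1) and that the port is open in the firewall.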

### Installation for OpenAI API Usage Only

- **If you're only using the OpenAI API**, use this command:

  ```bash
  docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
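
Passing the key with `-e` works, but it leaves the key in your shell history. As an alternative sketch, Docker's standard `--env-file` flag can read it from a file instead; the `.env` file name here is just an example, and the file should contain a line like `OPENAI_API_KEY=your_secret_key`:

```bash
# Read OPENAI_API_KEY from ./.env instead of typing it inline
docker run -d -p 3000:8080 --env-file .env -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

Keep the `.env` file out of version control.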

### Installing Open WebUI with Bundled Ollama Support

This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:

- **With GPU support**:
  Utilize GPU resources by running the following command:

  ```bash
  docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```

- **For CPU only**:
  If you're not using a GPU, use this command instead:

  ```bash
  docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```

Both commands give you a hassle-free, all-in-one installation of Open WebUI and Ollama, so you can get everything up and running quickly.
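
Because this image runs Ollama inside the same container, you can manage models with `docker exec`. A sketch, assuming the bundled image puts the standard `ollama` CLI on its `PATH`; `llama3` is only an example model name:

```bash
# Pull an example model into the bundled Ollama instance
docker exec -it open-webui ollama pull llama3

# List the models the bundled instance has available
docker exec -it open-webui ollama list
```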

After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
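
If the page doesn't load immediately, the container may still be starting up. Two standard Docker commands for checking on it:

```bash
# Confirm the container is running and port 3000 is mapped
docker ps --filter name=open-webui

# Follow the startup logs (Ctrl+C stops following, not the container)
docker logs -f open-webui
```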

## Troubleshooting

If you're running into issues such as "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](troubleshooting) for guidance, or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s) for help.

Continue with the full [getting started guide](/getting-started).