---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---

import { TopBanners } from "@site/src/components/TopBanners";
import { SponsorList } from "@site/src/components/SponsorList";

# Open WebUI
**Open WebUI is an [extensible](https://docs.openwebui.com/features/plugin/), feature-rich, and user-friendly self-hosted AI interface designed to operate entirely offline.** It supports various LLM runners, including Ollama and OpenAI-compatible APIs.

![GitHub stars](https://img.shields.io/github/stars/open-webui/open-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/open-webui/open-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/open-webui/open-webui?style=social)
![Docker Pulls](https://img.shields.io/docker/pulls/ghcr.io/open-webui/open-webui)
![GitHub language count](https://img.shields.io/github/languages/count/open-webui/open-webui)
![GitHub top language](https://img.shields.io/github/languages/top/open-webui/open-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/open-webui/open-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-wbui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Open_WebUI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)

![Open WebUI Demo](/images/demo.gif)
## Quick Start with Docker 🐳

### Installation with Default Configuration

- **If Ollama is on your computer**, use this command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

- **To run Open WebUI with Nvidia GPU support**, use this command:

```bash
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
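Either way, you can check that the container started correctly before opening the UI. A minimal sketch, assuming the default `open-webui` container name used in the commands above:

```bash
docker ps --filter name=open-webui   # the container should show a status of "Up"
docker logs -f open-webui            # follow the startup logs (Ctrl+C to stop following)
```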
### Installing Open WebUI with Bundled Ollama Support

This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:

- **With GPU Support**:
  Utilize GPU resources by running the following command:

```bash
docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```

- **For CPU Only**:
  If you're not using a GPU, use this command instead:

```bash
docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
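In the commands above, `-p 3000:8080` only sets the host-side port; inside the container Open WebUI always listens on `8080`. If port 3000 is already taken on your machine, here is a minimal sketch of the CPU-only bundled command published on host port 8080 instead (purely illustrative; pick any free port):

```bash
docker run -d -p 8080:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```

With that mapping, the UI is served at http://localhost:8080 rather than port 3000.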
Both commands facilitate a built-in, hassle-free installation of both Open WebUI and Ollama, ensuring that you can get everything up and running swiftly.

After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
### Using the Dev Branch 🌙

:::warning
The `:dev` branch contains the latest unstable features and changes. Use it at your own risk as it may have bugs or incomplete features.
:::

If you want to try out the latest bleeding-edge features and are okay with occasional instability, you can use the `:dev` tag like this:

```bash
docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:dev
```
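Because the `:dev` tag is a moving target, it can be worth pulling explicitly before (re)creating the container so you are not reusing a stale cached image; a minimal sketch:

```bash
docker pull ghcr.io/open-webui/open-webui:dev
```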
## Manual Installation

### Installation with `pip` (Beta)

For users who prefer to use Python's package manager `pip`, Open WebUI offers an installation method. Python 3.11 is required for this method.

1. **Install Open WebUI**:
   Open your terminal and run the following command:

```bash
pip install open-webui
```

2. **Start Open WebUI**:
   Once installed, start the server using:

```bash
open-webui serve
```
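Since Python 3.11 is specifically required, it can help to isolate the install in a dedicated virtual environment rather than your system Python. A minimal sketch, assuming a `python3.11` interpreter is available on your PATH:

```bash
python3.11 -m venv open-webui-env      # create an isolated environment
source open-webui-env/bin/activate     # activate it (Linux/macOS)
pip install open-webui                 # install inside the environment
open-webui serve                       # start the server
```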
This method installs all necessary dependencies and starts Open WebUI, allowing for a simple and efficient setup. After installation, you can access Open WebUI at [http://localhost:8080](http://localhost:8080). Enjoy! 😄

## Other Installation Methods

We offer various installation alternatives, including non-Docker native installation methods, Docker Compose, Kustomize, and Helm. Visit our [Open WebUI Documentation](https://docs.openwebui.com/getting-started/) or join our [Discord community](https://discord.gg/5rJgQTnV4s) for comprehensive guidance.

## Updating

Check out how to update Docker in the [Quick Start guide](./getting-started/quick-start).

If you want to update your local Docker installation to the latest version, you can do it with [Watchtower](https://containrrr.dev/watchtower/):

```bash
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
```
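If you'd rather not run Watchtower, here is a minimal sketch of the equivalent manual update, assuming the default container and volume names from the quick-start commands above (your chats and settings persist in the `open-webui` volume):

```bash
docker pull ghcr.io/open-webui/open-webui:main   # fetch the latest image
docker stop open-webui && docker rm open-webui   # remove the old container; data stays in the volume
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```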
In the last part of the Watchtower command, replace `open-webui` with your container name if it is different.

Continue with the full [getting started guide](/getting-started).

## Sponsors 🙌

<TopBanners />

<SponsorList />

We are incredibly grateful for the generous support of our sponsors. Their contributions help us to maintain and improve our project, ensuring we can continue to deliver quality work to our community. Thank you!