---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---

# Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.


*(Open WebUI Demo)*

## Quick Start with Docker 🐳

:::warning
When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` flag in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
:::
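To double-check that the named volume was created after the first run, you can use the standard Docker volume commands (nothing Open WebUI specific):

```bash
# List the named volume and show where Docker stores it on disk.
docker volume ls --filter "name=open-webui"
docker volume inspect open-webui
```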

### Installation with Default Configuration

- **If Ollama is on your computer**, use this command:

  ```bash
  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
- **If Ollama is on a different server**, use this command:

  To connect to Ollama on another server, change `OLLAMA_BASE_URL` to the server's URL (a quick connectivity check is sketched after this list):

  ```bash
  docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
- **To run Open WebUI with Nvidia GPU support**, use this command:

  ```bash
  docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
  ```
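If the UI starts but no models show up, one quick sanity check is to query Ollama's version endpoint from inside the container. This is a sketch, not part of the official docs: it assumes `curl` is available in the image and that Ollama listens on its default port 11434; adjust the URL if you pointed `OLLAMA_BASE_URL` at another server.

```bash
# Verify the Open WebUI container can reach Ollama on the Docker host.
# host.docker.internal resolves thanks to the --add-host flag above.
docker exec open-webui curl -sf http://host.docker.internal:11434/api/version
```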

### Installation for OpenAI API Usage Only

- **If you're only using the OpenAI API**, use this command:

  ```bash
  docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
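Typing the key literally into the command leaves it in your shell history. A common shell pattern (an alternative sketch, not from the original docs) is to set it once in the environment and pass it through:

```bash
# Pass the key from the host environment instead of writing it inline.
export OPENAI_API_KEY=your_secret_key   # or load it from a secrets manager
docker run -d -p 3000:8080 \
  -e OPENAI_API_KEY="$OPENAI_API_KEY" \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```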

### Installing Open WebUI with Bundled Ollama Support

This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:

- **With GPU Support**: Utilize GPU resources by running the following command:

  ```bash
  docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```
- **For CPU Only**: If you're not using a GPU, use this command instead:

  ```bash
  docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```
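Note that the bundled image starts with no models downloaded. As a rough sketch (assuming the bundled `ollama` binary is on the container's PATH; `llama3` below is just a placeholder model name), you can pull one through the running container:

```bash
# Pull an example model into the bundled Ollama instance.
# Replace "llama3" with any model from the Ollama library.
docker exec -it open-webui ollama pull llama3
```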

Both commands provide a built-in, hassle-free installation of both Open WebUI and Ollama, ensuring that you can get everything up and running swiftly.

After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
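If the page doesn't load, you can first confirm the container is actually running and that the port mapping took effect (standard Docker commands):

```bash
# The container should be listed with status "Up" and ports 3000->8080.
docker ps --filter "name=open-webui"
```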

## Troubleshooting

If you encounter issues such as "Open WebUI: Server Connection Error", see TROUBLESHOOTING for guidance, or join our Open WebUI Discord community.
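For the "Server Connection Error" case specifically, the container logs usually show whether Open WebUI failed to reach Ollama. A minimal sketch using standard Docker commands:

```bash
# Inspect recent logs for startup errors or failed connections to Ollama.
docker logs --tail 50 open-webui

# After fixing the configuration (e.g. OLLAMA_BASE_URL), restart the container.
docker restart open-webui
```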

Continue with the full getting started guide.