---
sidebar_position: 3
title: "🛠️ Troubleshooting"
---
# Troubleshooting

## Understanding the Open WebUI Architecture
The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.
- **How it Works**: The Open WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not sent directly to the Ollama API. Initially, the request is sent to the Open WebUI backend via the `/ollama` route. From there, the backend forwards the request to the Ollama API, using the URL specified in the `OLLAMA_BASE_URL` environment variable. Therefore, a request made to `/ollama` in the WebUI is effectively the same as making a request to `OLLAMA_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_BASE_URL/api/tags` in the backend.

- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
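For illustration, the request below is a sketch of this forwarding (the published port 3000 and the use of an API key for authentication are assumptions based on a default Docker install; substitute your own values):

```bash
# Request sent to the Open WebUI backend's /ollama route
# (assumes the WebUI is published on port 3000 and you authenticate with a WebUI API key)
curl -H "Authorization: Bearer YOUR_API_KEY" http://localhost:3000/ollama/api/tags

# The backend forwards this to the Ollama server configured in OLLAMA_BASE_URL, e.g.:
#   http://127.0.0.1:11434/api/tags
```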
## Open WebUI: Server Connection Error
If you're experiencing connection issues, it’s often because the WebUI docker container cannot reach the Ollama server at `127.0.0.1:11434` (`host.docker.internal:11434`) inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`.
**Example Docker Command:**

```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
If you're experiencing SSL connection errors with huggingface.co, check whether the Hugging Face server is down. If it is, you can set `HF_ENDPOINT` to `https://hf-mirror.com/` in the `docker run` command:

```bash
docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
If you're using Podman on macOS, to reach Ollama running on your computer you must enable the host loopback with `--network slirp4netns:allow_host_loopback=true` and override `OLLAMA_BASE_URL` to `http://host.containers.internal:11434`. The Open WebUI link remains the default: `http://localhost:3000`.
**Example Podman Command:**

```bash
podman run -d --network slirp4netns:allow_host_loopback=true -p 3000:8080 -e OLLAMA_BASE_URL=http://host.containers.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
## Difficulty Accessing Ollama from Open WebUI
If you're encountering difficulties accessing Ollama from the Open WebUI interface, it could be due to Ollama being configured to listen on a restricted network interface by default. To enable access from the Open WebUI, you need to configure Ollama to listen on a broader range of network interfaces.
Follow these steps to adjust the Ollama configuration:
- **Configure Ollama Host**: Set the `OLLAMA_HOST` environment variable to `0.0.0.0`. This tells Ollama to listen on all available network interfaces, enabling connections from external sources, including the Open WebUI.

- **Modify Ollama Environment Variables**: Depending on how you're running Ollama, you may need to adjust the environment variables accordingly. If you're running Ollama in a Docker container, ensure that the `OLLAMA_HOST` variable is correctly set within the container environment. For other deployment methods, refer to the respective documentation for instructions on setting environment variables.

- **Restart Ollama**: After modifying the environment variables, restart the Ollama service to apply the changes. This ensures that Ollama begins listening on the specified network interfaces.
Once Ollama is configured to listen on `0.0.0.0`, you should be able to access it from the Open WebUI without any issues.
For detailed instructions on setting environment variables for Ollama, refer to the official Ollama documentation.
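For example, on a Linux host where Ollama runs as a systemd service, one common way to set the variable is a systemd drop-in override (a sketch; the unit name `ollama.service` is an assumption, so adapt it to your installation):

```bash
# Open a drop-in override for the Ollama service
sudo systemctl edit ollama.service

# In the editor, add the following lines, then save and exit:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload systemd and restart Ollama so it rebinds to all interfaces
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

If Ollama runs elsewhere (Docker, the macOS app, Windows), set `OLLAMA_HOST=0.0.0.0` through that platform's environment-variable mechanism instead.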
## General Connection Errors
**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit Ollama's official site for the latest updates.

Troubleshooting Steps:
- **Verify Ollama URL Format**:
  - When running the Web UI container, ensure the `OLLAMA_BASE_URL` is correctly set (e.g., `http://192.168.1.1:11434` for different host setups). A quick way to confirm the URL responds is shown below.
  - In the Open WebUI, navigate to "Settings" > "General".
  - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]` (e.g., `http://localhost:11434`).
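To confirm that the Ollama URL is actually reachable from the machine running Open WebUI, you can query Ollama's version endpoint (a sketch; substitute the host and port you use for `OLLAMA_BASE_URL`):

```bash
# Replace http://localhost:11434 with your OLLAMA_BASE_URL value
curl http://localhost:11434/api/version
# A healthy Ollama server answers with JSON like {"version":"..."}
```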
By following these enhanced troubleshooting steps, connection issues should be effectively resolved. For further assistance or queries, feel free to reach out to us on our community Discord.
## Network Diagrams of different deployments
### Mac OS/Windows - Ollama on Host, Open WebUI in container

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Mac OS/Windows") {
    Person(user, "User")
    Boundary(b1, "Docker Desktop's Linux VM") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
    Component(ollama, "Ollama", "Listening on port 11434")
  }

  Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
### Mac OS/Windows - Ollama and Open WebUI in the same Compose stack

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Mac OS/Windows") {
    Person(user, "User")
    Boundary(b1, "Docker Desktop's Linux VM") {
      Boundary(b2, "Compose Stack") {
        Component(openwebui, "Open WebUI", "Listening on port 8080")
        Component(ollama, "Ollama", "Listening on port 11434")
      }
    }
  }

  Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
  UpdateRelStyle(openwebui, ollama, $offsetX="-100", $offsetY="-50")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
### Mac OS/Windows - Ollama and Open WebUI in containers, in different networks

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Mac OS/Windows") {
    Person(user, "User")
    Boundary(b1, "Docker Desktop's Linux VM") {
      Boundary(b2, "Network A") {
        Component(openwebui, "Open WebUI", "Listening on port 8080")
      }
      Boundary(b3, "Network B") {
        Component(ollama, "Ollama", "Listening on port 11434")
      }
    }
  }

  Rel(openwebui, ollama, "Unable to connect")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
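If your containers end up in this situation, one way out (a sketch; the container names `open-webui` and `ollama` and the network name `ollama-net` are assumptions, so substitute your own) is to attach both containers to a shared user-defined Docker network so they can reach each other by name:

```bash
# Create a shared network and attach both running containers to it
docker network create ollama-net
docker network connect ollama-net ollama
docker network connect ollama-net open-webui
# Then point Open WebUI at Ollama by container name, e.g. OLLAMA_BASE_URL=http://ollama:11434
```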
### Mac OS/Windows - Open WebUI in host network

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Mac OS/Windows") {
    Person(user, "User")
    Boundary(b1, "Docker Desktop's Linux VM") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
  }

  Rel(user, openwebui, "Unable to connect, host network is the VM's network")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
### Linux - Ollama on Host, Open WebUI in container

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Linux") {
    Person(user, "User")
    Boundary(b1, "Container Network") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
    Component(ollama, "Ollama", "Listening on port 11434")
  }

  Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
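A typical run command for this layout (a sketch assembled from the Docker commands shown earlier; adjust the published port and `OLLAMA_BASE_URL` to your setup) adds the host-gateway mapping so the container can reach Ollama on the host:

```bash
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```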
### Linux - Ollama and Open WebUI in the same Compose stack

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Linux") {
    Person(user, "User")
    Boundary(b1, "Container Network") {
      Boundary(b2, "Compose Stack") {
        Component(openwebui, "Open WebUI", "Listening on port 8080")
        Component(ollama, "Ollama", "Listening on port 11434")
      }
    }
  }

  Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
  UpdateRelStyle(openwebui, ollama, $offsetX="-100", $offsetY="-50")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
### Linux - Ollama and Open WebUI in containers, in different networks

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Linux") {
    Person(user, "User")
    Boundary(b2, "Container Network A") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
    Boundary(b3, "Container Network B") {
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }

  Rel(openwebui, ollama, "Unable to connect")
  Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
### Linux - Open WebUI in host network, Ollama on host

```mermaid
C4Context
  Boundary(b0, "Hosting Machine - Linux") {
    Person(user, "User")
    Component(openwebui, "Open WebUI", "Listening on port 8080")
    Component(ollama, "Ollama", "Listening on port 11434")
  }

  Rel(openwebui, ollama, "Makes API calls via localhost", "http://localhost:11434")
  Rel(user, openwebui, "Makes requests via listening port", "http://localhost:8080")
  UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
## Reset Admin Password
If you've forgotten your admin password, you can reset it by following these steps:
### Reset Admin Password in Docker

To reset the admin password for Open WebUI in a Docker deployment, generate a bcrypt hash of your new password and run a Docker command to update the database. Replace `your-new-password` with the desired password and execute:
- **Generate bcrypt hash** (local machine):

  ```bash
  htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
  ```

- **Update password in Docker** (replace `HASH` and `admin@example.com`). This starts a temporary container with the `open-webui` volume mounted, installs `sqlite`, and updates the `auth` table in `webui.db`:

  ```bash
  docker run --rm -v open-webui:/data alpine/socat EXEC:"bash -c 'apk add sqlite && echo UPDATE auth SET password='\''HASH'\'' WHERE email='\''admin@example.com'\''; | sqlite3 /data/webui.db'", STDIO
  ```
### Reset Admin Password Locally

For local installations of Open WebUI, navigate to the `open-webui` directory and update the password in the `backend/data/webui.db` database.
- **Generate bcrypt hash** (local machine):

  ```bash
  htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
  ```

- **Update password locally** (replace `HASH` and `admin@example.com`):

  ```bash
  sqlite3 backend/data/webui.db "UPDATE auth SET password='HASH' WHERE email='admin@example.com';"
  ```