mirror of
https://github.com/open-webui/docs.git
synced 2026-03-27 13:28:37 +07:00
0.8.4
@@ -12,6 +12,7 @@ The Prompts interface provides several key features for managing your custom pro

* **Create**: Design new prompts with customizable names, tags, access levels, and content.
* **Share**: Share prompts with other users based on configured access permissions.
* **Access Control**: Set visibility and usage permissions for each prompt (refer to [Permissions](/features/access-security/rbac/permissions) for more details).
* **Enable/Disable**: Toggle prompts on or off without deleting them. Disabled prompts won't appear in slash command suggestions but remain in your workspace for future use.
* **Slash Commands**: Quickly access prompts using custom slash commands during chat sessions.
* **Versioning & History**: Track every change with a full version history, allowing you to compare and revert to previous versions.
* **Tags & Filtering**: Organize your prompts using tags and easily filter through your collection in the workspace.
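Prompt content can include `{{variable}}` placeholders that get filled in when the slash command is used. As a rough illustration of how placeholder substitution behaves (a hypothetical sketch, not Open WebUI's actual implementation), including the case where an optional field is left blank:

```python
import re

def render_prompt(template: str, values: dict) -> str:
    """Replace {{name}} placeholders with supplied values; unknown names
    are rendered as empty strings, mimicking a skipped optional field."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(values.get(m.group(1), "")),
        template,
    )

template = "Summarize the following {{language}} code:\n{{code}}"
print(render_prompt(template, {"language": "Python", "code": "print('hi')"}))
```

This is also a handy way to reason about the "test with blank optional fields" best practice below: substitution with a missing key should still produce a usable prompt.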
@@ -293,6 +294,7 @@ Prompt management is controlled by the following permission settings:

* Document any specific requirements or expected inputs in the prompt description
* Test prompts with different variable combinations, including scenarios where optional fields are left blank
* Consider access levels carefully when sharing prompts with other users: a publicly shared prompt appears automatically for every user who types `/` in a chat, so avoid creating too many
* **Use the enable/disable toggle** to temporarily deactivate prompts you're not currently using instead of deleting them; this preserves the prompt's configuration and version history
* **Consider user workflows**: Think about which information users will always have versus what they might want to customize occasionally

### Migration Notes
@@ -0,0 +1,77 @@
---
sidebar_position: 3
title: "Anthropic (Claude)"
---

## Overview

Open WebUI supports **Anthropic's Claude models** natively through their OpenAI-compatible endpoint. Just plug in your API key and start chatting — no middleware, no pipes, no extra setup required.

Open WebUI includes a built-in compatibility layer that automatically detects Anthropic URLs, handles model discovery, and translates requests. Chat completions, streaming, and **tool calling** all work correctly out of the box.
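Conceptually, the detection and translation step is small: recognize an Anthropic base URL and swap the OpenAI-style Bearer token for Anthropic's native auth headers. The sketch below is illustrative only (it is not Open WebUI's actual code, and the version string is the one Anthropic's native API documents):

```python
def is_anthropic_url(base_url: str) -> bool:
    # Heuristic: treat any api.anthropic.com base URL as Anthropic.
    return "api.anthropic.com" in base_url

def build_headers(base_url: str, api_key: str) -> dict:
    """OpenAI-style APIs authenticate with a Bearer token; Anthropic's
    native API uses `x-api-key` plus an `anthropic-version` header."""
    if is_anthropic_url(base_url):
        return {"x-api-key": api_key, "anthropic-version": "2023-06-01"}
    return {"Authorization": f"Bearer {api_key}"}
```

The point is that connection handling can stay uniform: every provider gets one base URL and one key, and any provider-specific quirks are absorbed before the request leaves the backend.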
:::warning Anthropic's Compatibility Disclaimer
Anthropic states that their OpenAI-compatible API is intended **for evaluation and testing**, not production workloads. In practice, everything — including tool calling — works as intended, but be aware of this official stance. For full Claude-native features (PDF processing, citations, extended thinking, prompt caching), Anthropic recommends their native `/v1/messages` API, which can be accessed via a [pipe function](/features/extensibility/plugin/functions/pipe) or a proxy like [LiteLLM](https://docs.litellm.ai/).
:::
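If you do route to the native API (for example, from a pipe function), the request shape differs from OpenAI's. A minimal sketch of the headers and payload, per Anthropic's documented `/v1/messages` contract (the actual HTTP call is left to whatever client you use):

```python
import json

def native_messages_request(api_key: str, model: str, prompt: str):
    """Build headers and JSON body for Anthropic's native /v1/messages API.
    Note: `max_tokens` is a required field on the native API."""
    headers = {
        "x-api-key": api_key,               # native auth header, not a Bearer token
        "anthropic-version": "2023-06-01",  # required API version header
        "content-type": "application/json",
    }
    body = {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, json.dumps(body)
```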
---

## Step 1: Get Your API Key

1. Go to [console.anthropic.com](https://console.anthropic.com/).
2. Create an account or sign in.
3. Navigate to **API Keys** and generate a new key.

---
## Step 2: Add the Connection in Open WebUI

1. Open Open WebUI in your browser.
2. Go to ⚙️ **Admin Settings** → **Connections** → **OpenAI**.
3. Click ➕ **Add Connection**.
4. Enter the following:

| Setting | Value |
|---|---|
| **URL** | `https://api.anthropic.com/v1` |
| **API Key** | Your Anthropic API key |

5. Click **Save**.

:::tip
The URL field will suggest common provider endpoints as you type — just select the Anthropic URL from the dropdown.
:::
---

## Step 3: Start Chatting!

That's it! Open WebUI automatically detects the Anthropic endpoint and fetches available models. Select a Claude model from the model dropdown and you're good to go.

Streaming, multi-turn conversations, and tool calling all work seamlessly.
:::info Optional: Filter Models
If you want only specific models to appear, you can add model IDs to the **Model IDs (Filter)** allowlist in the connection settings. Common model IDs:

| Model | ID |
|---|---|
| Claude Sonnet 4.5 | `claude-sonnet-4-5-20250929` |
| Claude Opus 4 | `claude-opus-4-20250514` |
| Claude Sonnet 4 | `claude-sonnet-4-20250514` |
| Claude Haiku 3.5 | `claude-3-5-haiku-20241022` |

Check [Anthropic's model documentation](https://docs.anthropic.com/en/docs/about-claude/models) for the latest available model IDs.
:::
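The allowlist behaves like a simple set filter over whatever the connection's model discovery returns: an empty list means "show everything". A sketch of the idea (hypothetical, not Open WebUI's internals):

```python
def filter_models(available: list, allowlist: list) -> list:
    """With an empty allowlist every discovered model is shown;
    otherwise only models whose IDs appear in the allowlist are kept."""
    if not allowlist:
        return available
    allowed = set(allowlist)
    return [m for m in available if m in allowed]
```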
---

## Managing Your Connection

You can **enable or disable** your Anthropic connection at any time using the toggle switch next to the connection entry. This lets you temporarily deactivate a provider without deleting the connection or losing your configuration.

---

## Learn More

- **[OpenAI-Compatible Providers](/getting-started/quick-start/connect-a-provider/starting-with-openai-compatible)** — Protocol design philosophy and setup for other providers.
- **[Pipe Functions](/features/extensibility/plugin/functions/pipe)** — Use Anthropic's native API with full feature support via a pipe function.
@@ -40,7 +40,7 @@ When you add a connection, Open WebUI verifies it by calling the provider's `/mo

| Provider | `/models` works? | Action Needed |
|---|---|---|
| Anthropic | Yes — built-in compatibility layer | Auto-detection works |
| GitHub Models | No — uses non-standard path | Add model IDs manually to the whitelist |
| Perplexity | No — endpoint doesn't exist | Add model IDs manually to the whitelist |
| MiniMax | No — endpoint doesn't exist | Add model IDs manually to the whitelist |
@@ -61,7 +61,7 @@ When you add a connection, Open WebUI verifies it by calling the provider's `/mo

1. Open Open WebUI in your browser.
2. Go to ⚙️ **Admin Settings** → **Connections** → **OpenAI**.
3. Click ➕ **Add Connection**.
4. Fill in the **URL** and **API Key** for your provider (see tabs below). The URL field will **suggest common provider endpoints** as you type.
5. If your provider doesn't support `/models` auto-detection, add your model IDs to the **Model IDs (Filter)** allowlist.
6. Click **Save**.
@@ -69,22 +69,26 @@ When you add a connection, Open WebUI verifies it by calling the provider's `/mo

If running Open WebUI in Docker and your model server is on the host machine, replace `localhost` with `host.docker.internal` in the URL.
:::
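The rewrite is mechanical: only the hostname changes, while the scheme, port, and path stay intact. A tiny illustrative helper (hypothetical, not part of Open WebUI):

```python
from urllib.parse import urlsplit, urlunsplit

def dockerize_url(url: str) -> str:
    """Rewrite a localhost base URL so a containerized Open WebUI
    can reach a model server running on the Docker host."""
    parts = urlsplit(url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"host.docker.internal{port}")
    return urlunsplit(parts)

print(dockerize_url("http://localhost:11434/v1"))
```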

:::tip Enable/Disable Connections
Each connection has a **toggle switch** that lets you enable or disable it without deleting the connection. This is useful for temporarily deactivating a provider while preserving its configuration.
:::

### Cloud Providers

<Tabs>
<TabItem value="anthropic" label="Anthropic" default>

**Anthropic** (Claude) offers an OpenAI-compatible endpoint. Open WebUI includes a built-in compatibility layer that automatically detects Anthropic URLs and handles model discovery — just plug in your API key and models are auto-detected. Note that this is intended for testing and comparison — for production use with full Claude features (PDF processing, citations, extended thinking, prompt caching), Anthropic recommends their native API.

:::tip
See the dedicated **[Anthropic (Claude)](/getting-started/quick-start/connect-a-provider/starting-with-anthropic)** guide for a full step-by-step walkthrough.
:::

| Setting | Value |
|---|---|
| **URL** | `https://api.anthropic.com/v1` |
| **API Key** | Your Anthropic API key from [console.anthropic.com](https://console.anthropic.com/) |
| **Model IDs** | Auto-detected — leave empty or filter to specific models |

</TabItem>
<TabItem value="gemini" label="Google Gemini">
@@ -8,7 +8,7 @@ title: "OpenAI"

Open WebUI makes it easy to connect to **OpenAI** and **Azure OpenAI**. This guide will walk you through adding your API key, setting the correct endpoint, and selecting models — so you can start chatting right away.

For other providers that offer an OpenAI-compatible API (Google Gemini, Mistral, Groq, DeepSeek, and many more), see the **[OpenAI-Compatible Providers](/getting-started/quick-start/connect-a-provider/starting-with-openai-compatible)** guide. For Anthropic's Claude models, see the dedicated **[Anthropic (Claude)](/getting-started/quick-start/connect-a-provider/starting-with-anthropic)** guide.

---

@@ -107,7 +107,7 @@ Simply choose GPT-4, o3-mini, or any compatible model offered by your provider.

That's it! Your OpenAI API connection is ready to use.

If you want to connect other providers, see the **[Anthropic (Claude)](/getting-started/quick-start/connect-a-provider/starting-with-anthropic)** guide or the **[OpenAI-Compatible Providers](/getting-started/quick-start/connect-a-provider/starting-with-openai-compatible)** guide for Google Gemini, Mistral, Groq, DeepSeek, and more.

If you run into issues or need additional support, visit our [help section](/troubleshooting).
@@ -5,7 +5,7 @@ To update your local Docker installation to the latest version, you can either u

### Option 1: Using Watchtower (Recommended Fork)

:::info Deprecation Notice
The original `containrrr/watchtower` is **no longer maintained** and may fail with newer Docker versions. We recommend using the `nicholas-fedor/watchtower` fork.
:::

:::warning Multi-Worker Environments
@@ -17,7 +17,7 @@ If you run Open WebUI with `UVICORN_WORKERS > 1` (e.g., in a production environm
3. Stop and restart the container with your desired number of workers.
:::

With [Watchtower](https://github.com/nicholas-fedor/watchtower), you can automate the update process:

```bash
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock nickfedor/watchtower --run-once open-webui
```
@@ -239,9 +239,9 @@ Automated updates can save time but require careful consideration of the trade-o

The original `containrrr/watchtower` is **no longer maintained** and **does not work with Docker 29+**. The community has created maintained forks that resolve these issues.
:::

The original Watchtower project hasn't received updates in over two years and fails with Docker version 29.0.0 or newer due to API version incompatibility. Two maintained forks are now available: nicholas-fedor/watchtower and Marrrrrrrrry/watchtower, both compatible with Docker 29+.

**Recommended: nicholas-fedor/watchtower fork**

<Tabs>
<TabItem value="one-time" label="One-Time Update" default>
@@ -453,7 +453,7 @@ For complete Diun documentation, visit https://crazymax.dev/diun/

| **Best For** | Set-and-forget homelabs | Visual monitoring + control | Notification-only workflows |

:::tip Recommendation
- **For homelabs/personal use:** nicholas-fedor/watchtower (automated)
- **For managed environments:** WUD (visual + manual control)
- **For production/critical systems:** Diun (notifications only) + manual updates
:::
@@ -151,7 +151,7 @@ Always use a separate data volume (e.g., `-v open-webui-dev:/app/backend/data`)

To update the Open WebUI container easily, follow these steps:

#### Manual Update
Use [Watchtower](https://github.com/nicholas-fedor/watchtower) to update your Docker container manually:

```bash
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock nickfedor/watchtower --run-once open-webui
```