This commit is contained in:
DrMelone
2026-02-19 19:47:04 +01:00
parent 02f827cbae
commit 6b34253f9e
3 changed files with 44 additions and 64 deletions


@@ -1517,9 +1517,29 @@ We strongly recommend using an [OpenAPI tool server](/features/extensibility/plu
:::
:::danger Production / Multi-Worker Deployments
**Do not rely on runtime pip installation in production environments.** When running with `UVICORN_WORKERS > 1` or multiple replicas, each worker/replica attempts to install packages independently on startup. This causes **race conditions** where concurrent pip processes crash with `AssertionError` because pip's internal locking detects the simultaneous installs.
**The recommended approach for production is to pre-install all required packages at image build time** using a custom Dockerfile:
```dockerfile
FROM ghcr.io/open-webui/open-webui:main
RUN pip install --no-cache-dir python-docx requests beautifulsoup4
```
Runtime installation is only suitable for **single-worker development or homelab environments** where you're experimenting with new functions and tools. For any deployment serving multiple users, bake your dependencies into the container image.
:::
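For reference, the multi-worker condition described in this warning usually comes from a deployment configuration along these lines (a hedged compose sketch; the image reference and `UVICORN_WORKERS` variable come from this document, the worker count is illustrative):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # With more than one worker, each worker runs its own
      # runtime pip install on startup, triggering the race described above.
      - UVICORN_WORKERS=4
```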
:::note
Keep in mind that pip runs in the same process as Open WebUI, so the **UI will be completely unresponsive** during installation.
No measures are taken to handle package conflicts with Open WebUI's own dependencies. Specifying requirements can break Open WebUI if you're not careful. You might be able to work around this by specifying `open-webui` itself as a requirement.
:::
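For context, the `requirements` that trigger runtime installation live in a tool's frontmatter docstring as a comma-separated list. The sketch below shows how such a line maps to pip install targets; the parser is a simplified illustration, not Open WebUI's actual implementation, and the tool title and package names are hypothetical:

```python
import re

# A hypothetical tool file's frontmatter docstring.
FRONTMATTER = '''"""
title: Example Tool
requirements: requests, beautifulsoup4
"""'''

def parse_requirements(src: str) -> list[str]:
    # Find the "requirements:" line inside the frontmatter and
    # split its comma-separated package list.
    match = re.search(r"^requirements:\s*(.+)$", src, re.MULTILINE)
    if not match:
        return []
    return [pkg.strip() for pkg in match.group(1).split(",")]

print(parse_requirements(FRONTMATTER))  # → ['requests', 'beautifulsoup4']
```

Each entry in that list is what a worker tries to `pip install` at load time, which is why baking the same packages into the image at build time makes the runtime step a no-op.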
<details>
<summary>Example</summary>


@@ -81,68 +81,6 @@ Access detailed API documentation for different services provided by Open WebUI:
return response.json()
```
### 🔄 OpenResponses
- **Endpoint**: `POST /api/responses`
- **Description**: Proxies requests to the [OpenResponses API](https://openresponses.org). Automatically routes to the correct upstream backend based on the `model` field, including Azure OpenAI deployments. Supports both streaming (SSE) and non-streaming responses.
- **Curl Example**:
```bash
curl -X POST http://localhost:3000/api/responses \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-4.1",
"input": "Explain quantum entanglement in simple terms."
}'
```
- **Python Example**:
```python
import requests

def create_response(token, model, input_text):
    """Send a non-streaming request to the Responses endpoint."""
    url = 'http://localhost:3000/api/responses'
    headers = {
        'Authorization': f'Bearer {token}',
        'Content-Type': 'application/json'
    }
    data = {
        "model": model,
        "input": input_text
    }
    response = requests.post(url, headers=headers, json=data)
    return response.json()
```
- **Streaming Example**:
```python
import requests

def stream_response(token, model, input_text):
    """Stream a response and print each SSE line as it arrives."""
    url = 'http://localhost:3000/api/responses'
    headers = {
        'Authorization': f'Bearer {token}',
        'Content-Type': 'application/json'
    }
    data = {
        "model": model,
        "input": input_text,
        "stream": True
    }
    # stream=True keeps the connection open so lines can be read incrementally
    response = requests.post(url, headers=headers, json=data, stream=True)
    for line in response.iter_lines():
        if line:
            print(line.decode('utf-8'))
```
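The streaming example above prints raw SSE lines. Assuming the stream follows the usual OpenAI-style `data: <json>` framing with a `data: [DONE]` sentinel (a hedged assumption; the exact event shape depends on the upstream backend), a minimal line parser might look like:

```python
import json

def parse_sse_line(line: str):
    # OpenAI-compatible streams frame events as "data: <json>",
    # ending with the sentinel "data: [DONE]".
    if not line.startswith("data: "):
        return None  # comments, keepalives, or other event fields
    payload = line[len("data: "):]
    if payload == "[DONE]":
        return None  # end-of-stream sentinel
    return json.loads(payload)

print(parse_sse_line('data: {"delta": "Hello"}'))  # → {'delta': 'Hello'}
```

Feeding each decoded line from `response.iter_lines()` through a function like this yields the event payloads as Python dicts instead of raw text.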
:::info
The Responses API endpoint supports the same model-based routing as Chat Completions. If you have multiple OpenAI-compatible backends configured, the request is automatically routed based on which backend hosts the specified model. Azure OpenAI deployments are also supported with appropriate API version handling.
:::
### 🔧 Filter and Function Behavior with API Requests
When using the API endpoints directly, filters (Functions) behave differently than when requests come from the web interface.


@@ -216,6 +216,28 @@ For example, with `DATABASE_POOL_SIZE=15`, `DATABASE_POOL_MAX_OVERFLOW=20`, 3 re
See [DATABASE_POOL_SIZE](/reference/env-configuration#database_pool_size) for details.
### 9. Function/Tool Dependency Installation Crashes
**Symptoms:**
- Workers crash with `AssertionError` on startup or when a function/tool is first loaded.
- Logs show pip locking errors or multiple pip processes competing.
**Cause:**
When a function or tool specifies `requirements` in its frontmatter, Open WebUI runs `pip install` at runtime. With multiple workers or replicas, each process attempts the installation independently, causing pip's internal lock to detect the conflict and crash.
**Solution:**
**Do not rely on runtime pip installation in production.** Pre-install all required packages at image build time:
```dockerfile
FROM ghcr.io/open-webui/open-webui:main
RUN pip install --no-cache-dir python-docx requests beautifulsoup4
```
Runtime `requirements` installation is only appropriate for single-worker development or homelab environments.
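Assuming the Dockerfile above is saved in the current directory, a typical build-and-run sequence looks like this (the image tag, host port, and volume name are illustrative):

```shell
# Bake the dependencies into a custom image
docker build -t open-webui-custom .

# Run it the same way as the upstream image
docker run -d -p 3000:8080 -v open-webui:/app/backend/data open-webui-custom
```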
For more details, see the [External Packages](/features/extensibility/plugin/tools/development#external-packages) section of the Tools documentation.
---
## Deployment Best Practices