---
sidebar_position: 2
title: "Tools"
---

# ⚙️ What are Tools?

Tools are the various ways you can extend an LLM's capabilities beyond simple text generation. When enabled, they allow your chatbot to do amazing things — like search the web, scrape data, generate images, talk back using AI voices, and more.

Because there are several ways to integrate "Tools" in Open WebUI, it's important to understand which type you are using.

---
## 🧩 Tooling Taxonomy: Which "Tool" are you using?

Users often encounter the term "Tools" in different contexts. Here is how to distinguish them:

| Type | Location in UI | Best For... | Source |
| :--- | :--- | :--- | :--- |
| **Native Features** | Admin/Settings | Core platform functionality | Built into Open WebUI |
| **Workspace Tools** | `Workspace > Tools` | User-created or community Python scripts | [Community Library](https://openwebui.com/search) |
| **Native MCP (HTTP)** | `Settings > Connections` | Standard MCP servers reachable via HTTP/SSE | External MCP Servers |
| **MCP via Proxy (MCPO)** | `Settings > Connections` | Local stdio-based MCP servers (e.g., Claude Desktop tools) | [MCPO Adapter](https://github.com/open-webui/mcpo) |
| **OpenAPI Servers** | `Settings > Connections` | Standard REST/OpenAPI web services | External Web APIs |

### 1. Native Features (Built-in)

These are deeply integrated into Open WebUI and generally don't require external scripts.

- **Web Search**: Integrated via engines like SearXNG, Google, or Tavily.
- **Image Generation**: Integrated with DALL-E, ComfyUI, or Automatic1111.
- **RAG (Knowledge)**: The ability to query uploaded documents (referenced in chat with `#`).

### 2. Workspace Tools (Custom Plugins)

These are **Python scripts** that run directly within the Open WebUI environment.
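In practice, a Workspace Tool is usually a single Python file that defines a `Tools` class; its type-hinted, docstring-documented methods are what the model can call. Below is a minimal, hypothetical sketch (a word-counting tool) to illustrate the shape; see [Writing your own Custom Toolkits](./development.mdx) for the authoritative format.

```python
"""
title: Word Counter
description: A minimal example Tool that counts the words in a piece of text.
"""


class Tools:
    def __init__(self):
        pass

    def count_words(self, text: str) -> str:
        """
        Count the number of words in the given text.
        :param text: The text to analyze.
        :return: A short sentence stating the word count.
        """
        # The type hints and docstring above are what describe this tool
        # to the LLM, so keep them accurate.
        return f"The text contains {len(text.split())} words."
```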
- **Capability**: Can do anything Python can do (web scraping, complex math, API calls).
- **Access**: Managed via the `Workspace` menu.
- **Safety**: Always review code before importing, as these run on your server.
- **⚠️ Security Warning**: Normal or untrusted users should **not** be given permission to access the Workspace Tools section. This access allows a user to upload and execute arbitrary Python code on your server, which could lead to a full system compromise.
### 3. MCP (Model Context Protocol) 🔌

MCP is an open standard that allows LLMs to interact with external data and tools.

- **Native HTTP MCP**: Open WebUI can connect directly to any MCP server that exposes an HTTP/SSE endpoint.
- **MCPO (Proxy)**: Most community MCP servers communicate over `stdio` (a local command-line interface). To use these in Open WebUI, run the [**MCPO Proxy**](../../plugin/tools/openapi-servers/mcp.mdx) to bridge the connection.
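For example, assuming `uvx` is available on the host, a stdio MCP server can be exposed over HTTP with a single command (the server command below is a placeholder):

```bash
# Wrap a stdio-based MCP server so it is reachable over HTTP on port 8000.
# Replace "your_mcp_server_command" with the command that starts your server.
uvx mcpo --port 8000 -- your_mcp_server_command
```

The proxy then serves an OpenAPI-compatible endpoint that you can register under `Settings > Connections`, just like any other tool server.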
### 4. OpenAPI / Function Calling Servers

Generic web servers that provide an OpenAPI (`.json` or `.yaml`) specification. Open WebUI can ingest these specs and treat every endpoint as a tool.
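As an illustration (not an official template), a small FastAPI app like the hypothetical sketch below already publishes such a spec at `/openapi.json`, which Open WebUI can then ingest:

```python
# Hypothetical OpenAPI tool server; FastAPI auto-generates the spec at /openapi.json.
from fastapi import FastAPI

app = FastAPI(
    title="Weather Tools",
    description="Example server whose endpoints can be exposed as tools.",
)


@app.get("/weather")
def get_weather(city: str) -> dict:
    """Return a made-up current temperature for the given city."""
    return {"city": city, "temperature_c": 21}


# Run with: uvicorn main:app --port 8000
# Then add the server under Settings > Connections so its endpoints become tools.
```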
---

## 📦 How to Install & Manage Workspace Tools

Workspace Tools are the most common way to extend your instance with community features.

1. Go to the [Community Tool Library](https://openwebui.com/search).
2. Choose a Tool, then click the **Get** button.
3. Enter your Open WebUI instance’s URL (e.g. `http://localhost:3000`).
4. Click **Import to WebUI**.

:::warning Safety Tip
Never import a Tool you don’t recognize or trust. These are Python scripts and might run unsafe code on your host system. **Crucially, ensure you only grant "Tool" permissions to trusted users**, as the ability to create or import tools is equivalent to the ability to run arbitrary code on the server.
:::

---
## 🔧 How to Use Tools in Chat

Once installed or connected, here’s how to enable them for your conversations:

### Option 1: Enable on-the-fly (Specific Chat)

While chatting, click the **➕ (plus)** icon in the input area. You’ll see a list of available Tools — you can enable them specifically for that session.

### Option 2: Enable by Default (Global/Model Level)

1. Go to **Workspace ➡️ Models**.
2. Choose the model you’re using and click the ✏️ edit icon.
3. Scroll to the **Tools** section.
4. ✅ Check the Tools you want this model to have access to by default.
5. Click **Save**.

You can also let your LLM auto-select the right Tools using the [**AutoTool Filter**](https://openwebui.com/f/hub/autotool_filter/).

---
## 🧠 Choosing How Tools Are Used: Default vs Native

Once Tools are enabled, Open WebUI gives you two different ways to let your LLM interact with them. You can switch between them via the chat settings:



1. Open a chat with your model.
2. Click ⚙️ **Chat Controls > Advanced Params**.
3. Look for the **Function Calling** setting and switch between:
### 🟡 Default Mode (Prompt-based)

Here, your LLM doesn’t need to natively support function calling. Open WebUI guides the model with a tool-selection prompt template so it can pick and invoke the right Tool.

- ✅ Works with **practically any model** (including smaller local models).
- ❗ Not as reliable as Native Mode when chaining multiple complex tools.
- 💡 **Admin Note**: You can also toggle the default mode for each specific model under **Admin Panel > Settings > Models > Advanced Parameters**.
### 🟢 Native Mode (Built-in Function Calling)

If your model supports native function calling (like GPT-4o, Gemini, Claude, or GPT-5), use this for a faster, more accurate experience where the LLM decides exactly when and how to call tools.

- ✅ Fast, accurate, and can chain multiple tools in one response.
- ❗ Requires a model that explicitly supports tool-calling schemas.
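For reference, a tool definition in the widely used OpenAI-style schema looks roughly like the sketch below (illustrative only; you don't write this by hand, it just shows what "supports tool-calling schemas" means):

```json
{
  "type": "function",
  "function": {
    "name": "count_words",
    "description": "Count the number of words in the given text.",
    "parameters": {
      "type": "object",
      "properties": {
        "text": { "type": "string", "description": "The text to analyze." }
      },
      "required": ["text"]
    }
  }
}
```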
| Mode | Who it’s for | Pros | Cons |
|----------|----------------------------------|-----------------------------------------|--------------------------------------|
| **Default** | Practically any model (basic/local) | Broad compatibility, safer, flexible | May be less accurate or slower |
| **Native** | GPT-4o, Gemini, Claude, GPT-5, etc. | Fast, smart, excellent tool chaining | Needs proper function call support |

---
## 🚀 Summary & Next Steps

Tools bring your AI to life by giving it hands to interact with the world.

- **Browse Tools**: [openwebui.com/search](https://openwebui.com/search)
- **Advanced Setup**: Learn more about [MCP Support](./openapi-servers/mcp.mdx)
- **Development**: [Writing your own Custom Toolkits](./development.mdx)