Docs: start dify docs migration

This commit is contained in:
AllenWriter
2025-03-17 14:42:46 +08:00
parent e8cd98eee8
commit 7ec36737ba
14 changed files with 1985 additions and 2 deletions

View File

@@ -22,9 +22,11 @@
"tab": "Documentation",
"groups": [
{
"group": "Introduction",
"group": "Getting Started",
"pages": [
"introduction"
"en-us/introduction",
"en-us/features-and-specifications",
"en-us/model-providers"
]
},
{

View File

@@ -0,0 +1,17 @@
---
description: >-
For those already familiar with LLM application tech stacks, this document
serves as a shortcut to understand Dify's unique advantages
---
# Features and Specifications
We adopt transparent policies around product specifications to ensure decisions are made based on complete understanding. Such transparency not only benefits your technical selection, but also promotes deeper comprehension within the community for active contributions.
### Project Basics
<table data-header-hidden><thead><tr><th width="341"></th><th></th></tr></thead><tbody><tr><td>Established</td><td>March 2023</td></tr><tr><td>Open Source License</td><td><a href="../../policies/open-source.md">Apache License 2.0 with commercial licensing</a></td></tr><tr><td>Official R&D Team</td><td>Over 15 full-time employees</td></tr><tr><td>Community Contributors</td><td>Over <a href="https://ossinsight.io/analyze/langgenius/dify#overview">290</a> people(As of Q2 2024)</td></tr><tr><td>Backend Technology</td><td>Python/Flask/PostgreSQL</td></tr><tr><td>Frontend Technology</td><td>Next.js</td></tr><tr><td>Codebase Size</td><td>Over 130,000 lines</td></tr><tr><td>Release Frequency</td><td>Average once per week</td></tr></tbody></table>
### Technical Features
<table data-header-hidden><thead><tr><th width="240"></th><th></th></tr></thead><tbody><tr><td>LLM Inference Engines</td><td>Dify Runtime (LangChain removed since v0.4)</td></tr><tr><td>Commercial Models Supported</td><td><strong>10+</strong>, including OpenAI and Anthropic<br>Onboard new mainstream models within 48 hours</td></tr><tr><td>MaaS Vendor Supported</td><td><strong>7</strong>, Hugging Face, Replicate, AWS Bedrock, NVIDIA, GroqCloud, together.ai,, OpenRouter</td></tr><tr><td>Local Model Inference Runtimes Supported</td><td><strong>6</strong>, Xoribits (recommended), OpenLLM, LocalAI, ChatGLM,Ollama, NVIDIA TIS </td></tr><tr><td>OpenAI Interface Standard Model Integration Supported</td><td><strong>∞</strong></td></tr><tr><td>Multimodal Capabilities</td><td><p>ASR Models</p><p>Rich-text models up to GPT-4o specs</p></td></tr><tr><td>Built-in App Types</td><td>Text generation, Chatbot, Agent, Workflow, Chatflow</td></tr><tr><td>Prompt-as-a-Service Orchestration</td><td><p>Visual orchestration interface widely praised, modify Prompts and preview effects in one place.<br></p><p><strong>Orchestration Modes</strong></p><ul><li>Simple orchestration</li><li>Assistant orchestration </li><li>Flow orchestration </li></ul><p><strong>Prompt Variable Types</strong></p><ul><li>String</li><li>Radio enum</li><li>External API</li><li>File (Q3 2024)</li></ul></td></tr><tr><td>Agentic Workflow Features</td><td><p>Industry-leading visual workflow orchestration interface, live-editing node debugging, modular DSL, and native code runtime, designed for building more complex, reliable, and stable LLM applications.</p><p><br>Supported Nodes</p><ul><li>LLM</li><li>Knowledge Retrieval</li><li>Question Classifier</li><li>IF/ELSE</li><li>CODE</li><li>Template</li><li>HTTP Request</li><li>Tool</li></ul></td></tr><tr><td>RAG Features</td><td><p>Industry-first visual knowledge base management interface, supporting snippet previews and recall testing.</p><p><strong>Indexing 
Methods</strong></p><ul><li>Keywords</li><li>Text vectors</li><li>LLM-assisted question-snippet model</li></ul><p><strong>Retrieval Methods</strong></p><ul><li>Keywords</li><li>Text similarity matching</li><li>Hybrid Search</li><li>N choose 1Legacy</li><li>Multi-path retrieval</li></ul><p><strong>Recall Optimization</strong></p><ul><li>Rerank models</li></ul></td></tr><tr><td>ETL Capabilities</td><td><p>Automated cleaning for TXT, Markdown, PDF, HTML, DOC, CSV formats. Unstructured service enables maximum support.</p><p>Sync Notion docs as knowledge bases.<br>Sync Webpages as knowledge bases.</p></td></tr><tr><td>Vector Databases Supported</td><td>Qdrant (recommended), WeaviateZilliz/Milvus, Pgvector, Pgvector-rsChroma, OpenSearch, TiDB, Tencent Vector, Oracle, Relyt, Analyticdb, Couchbase</td></tr><tr><td>Agent Technologies</td><td><p>ReAct, Function Call.<br></p><p><strong>Tooling Support</strong></p><ul><li>Invoke OpenAI Plugin standard tools </li><li>Directly load OpenAPI Specification APIs as tools</li></ul><p><strong>Built-in Tools</strong></p><ul><li>40+ tools(As of Q2 2024)</li></ul></td></tr><tr><td>Logging</td><td>Supported, annotations based on logs</td></tr><tr><td>Annotation Reply</td><td>Based on human-annotated Q&As, used for similarity-based replies. Exportable as data format for model fine-tuning.</td></tr><tr><td>Content Moderation</td><td>OpenAI Moderation or external APIs</td></tr><tr><td>Team Collaboration</td><td>Workspaces, multi-member management</td></tr><tr><td>API Specs</td><td>RESTful, most features covered</td></tr><tr><td>Deployment Methods</td><td>Docker, Helm</td></tr></tbody></table>

View File

@@ -0,0 +1,33 @@
# Dify Cloud
{% hint style="info" %}
Note: Dify is currently in the Beta testing phase. If there are inconsistencies between the documentation and the product, please refer to the actual product experience.
{% endhint %}
Dify can be used [out of the box](https://cloud.dify.ai/apps) as a cloud service by anyone. Explore the flexible [Plans and Pricing](https://dify.ai/pricing) and select the plan that best suits your needs and requirements.
Get started now with the [Sandbox plan](http://cloud.dify.ai), which includes a free trial of 200 OpenAI calls, no credit card required. To use the Sandbox plan of the cloud version, you will need a GitHub or Google account, as well as an OpenAI API key. Here's how you can get started:
1. Sign up for [Dify Cloud](https://cloud.dify.ai) and create a new Workspace or join an existing one.
2. Configure your model provider or use our hosted model provider.
3. You can [create an application](../guides/application-orchestrate/creating-an-application.md) now!
### FAQs
**Q: How is my data handled and stored when using Dify Cloud?**
A: When you use Dify Cloud, your user data is securely stored on AWS servers located in the US-East region. This includes both the data you actively input and any generated data from your applications. We prioritize your data's security and integrity, ensuring that it is managed with the highest standards of cloud storage solutions.
**Q: What measures are in place to protect my API keys and other sensitive information?**
A: At Dify, we understand the importance of protecting your API keys and other secrets. These are encrypted at rest, which means Dify cannot view them and that only you, the rightful owner, have access to your secrets.
**Q: Can you explain how application data is anonymized in Dify Cloud?**
A: In Dify Cloud, we anonymize application data to ensure privacy and reduce encryption and decryption overheads. This means that the data used by applications is not directly associated with identifiable user accounts. By anonymizing the data, we enhance privacy while maintaining the performance of our cloud services.
**Q: What is the process for deleting my account and all associated data from Dify Cloud?**
A: If you decide to delete your account and remove all associated data from Dify Cloud, you can simply send a request to our support team at support@dify.ai. We are committed to respecting your privacy and data rights, and upon request, we will erase all your data from our systems, adhering to data protection regulations.

View File

@@ -0,0 +1,110 @@
# Dify Premium on AWS
Dify Premium is our AWS AMI offering that supports custom branding and can be deployed to your AWS VPC as an EC2 instance with one click. Head to [AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) to subscribe. It's useful in a couple of scenarios:
* You're looking to create one or a few applications as a small/medium business and you care about data residency.
* You are interested in [Dify Cloud](cloud.md), but your use case requires more resources than the [plans](https://dify.ai/pricing) support.
* You'd like to run a POC before adopting Dify Enterprise within your organization.
## Setup
After the AMI is deployed, access Dify via the instance's public IP found in the EC2 console (HTTP port 80 is used by default).
If this is your first time accessing Dify, enter the admin initialization password (set to your EC2 instance ID) to start the setup process.
## Upgrading
In the EC2 instance, run the following commands:
```bash
git clone https://github.com/langgenius/dify.git /tmp/dify
mv -f /tmp/dify/docker/* /dify/
rm -rf /tmp/dify
docker-compose down
docker-compose pull
docker-compose -f docker-compose.yaml -f docker-compose.override.yaml up -d
```
> To upgrade to version v1.0.0, please refer to [Migrating Community Edition to v1.0.0](https://docs.dify.ai/development/migration/migrate-to-v1).
<details>
<summary>Upgrading Community Edition to v1.0.0</summary>
The upgrade process involves the following steps:
1. Backup your data
2. Migrate plugins
3. Upgrade the main dify project
### 1. Backup Data
1.1 Execute the `cd` command to navigate to your Dify project directory and create a backup branch.
1.2 Run the following command to back up your docker-compose YAML file (optional).
```bash
cd docker
cp docker-compose.yaml docker-compose.yaml.$(date +%s).bak
```
1.3 Run the following commands in the `docker` directory to stop the Docker services and back up your data.
```bash
docker compose down
tar -cvf volumes-$(date +%s).tgz volumes
```
### 2. Upgrade the Version
`v1.0.0` supports deployment via Docker Compose. Navigate to your Dify project path and run the following commands to upgrade:
```bash
git checkout 1.0.0 # Switch to the 1.0.0 release
cd docker
docker compose -f docker-compose.yaml up -d
```
### 3. Migrate Tools to Plugins
The purpose of this step is to automatically migrate the tools and model vendors previously used in the Community Edition and install them into the new plugin environment.
1. Run the `docker ps` command to find the docker-api container ID.
Example:
```bash
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
417241cd**** nginx:latest "sh -c 'cp /docker-e…" 3 hours ago Up 3 hours 0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp docker-nginx-1
f84aa773**** langgenius/dify-api:1.0.0 "/bin/bash /entrypoi…" 3 hours ago Up 3 hours 5001/tcp docker-worker-1
a3cb19c2**** langgenius/dify-api:1.0.0 "/bin/bash /entrypoi…" 3 hours ago Up 3 hours 5001/tcp docker-api-1
```
Run the command `docker exec -it a3cb19c2**** bash` to enter the container terminal, and then run:
```bash
poetry run flask extract-plugins --workers=20
```
> If an error occurs, it is recommended to first install the `poetry` environment on the server as per the prerequisites. If the terminal asks for input after running the command, press **“Enter”** to skip the input.
This command extracts all models and tools currently in use in the environment. The `--workers` parameter controls the number of parallel processes used during extraction and can be adjusted as needed. After the command runs, it generates a `plugins.jsonl` file containing plugin information for all workspaces in the current Dify instance.
Ensure your network can reach the public internet, including `https://marketplace.dify.ai`. Then continue running the following command in the `docker-api-1` container:
```bash
poetry run flask install-plugins --workers=2
```
This command will download and install all necessary plugins into the latest Community Edition. When the terminal shows `Install plugins completed.`, the migration is complete.
</details>
## Customizing
Just like a self-hosted deployment, you may modify the environment variables in `.env` on your EC2 instance as you see fit. Then restart Dify with:
```bash
docker-compose down
docker-compose -f docker-compose.yaml -f docker-compose.override.yaml up -d
```

View File

@@ -0,0 +1,12 @@
# Dify Community
Dify, an open-source project on [GitHub](https://github.com/langgenius/dify), can be self-hosted using either of these methods:
1. [Docker Compose Deployment](https://docs.dify.ai/getting-started/install-self-hosted/docker-compose)
2. [Local Source Code Start](https://docs.dify.ai/getting-started/install-self-hosted/local-source-code)
### Contributing
To maintain code quality, we require all code contributions - even from those with direct commit access - to be submitted as pull requests. These must be reviewed and approved by the core development team before merging.
We welcome contributions from everyone! If you're interested in helping, please check our [Contribution Guide](https://github.com/langgenius/dify/blob/main/CONTRIBUTING.md) for more information on getting started.

View File

@@ -0,0 +1,57 @@
# Deploy with aaPanel
## Prerequisites
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
> * CPU >= 2 Core
> * RAM >= 4 GiB
| Operating System | Software | Explanation |
| -------------------------- | ------------------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Linux platforms | <p>aaPanel 7.0.11 or later</p> | Please refer to the [aaPanel installation guide](https://www.aapanel.com/new/download.html#install) for more information on how to install aaPanel. |
## Deployment
1. Log in to aaPanel and click `Docker` in the menu bar.
2. The first time, you will be prompted to install the `Docker` and `Docker Compose` services; click Install Now. If they are already installed, skip this step.
3. After the installation completes, find `Dify` under `One-Click Install` and click `Install`.
4. Configure basic information such as the application name, domain name, and ports to complete the installation:
   - Name: application name, default `Dify-characters`
   - Version: default `latest`
   - Domain name: if you need to access Dify directly through a domain name, configure it here and resolve the domain name to the server
   - Allow external access: check this if you need direct access through `IP:Port`. If you have set up a domain name, do not check it.
   - Port: default `8088`, can be modified as needed
> \[!IMPORTANT]
>
> The domain name is optional. If a domain name is filled in, the site can be managed through [Website] --> [Proxy Project], and you do not need to check [Allow external access]; otherwise, you must check it before you can access Dify through the port.
5. After submission, the panel automatically initializes the application, which takes about `1-3` minutes. Dify can be accessed once initialization completes.
6. Enter the domain name or `IP:Port` set in the previous step in the browser to access Dify.
### Access Dify
Access the administrator initialization page to set up the admin account:
```bash
# If you have set domain
http://yourdomain/install
# If you choose to access through `IP+Port`
http://your_server_ip:8088/install
```
Dify web interface address:
```bash
# If you have set domain
http://yourdomain/
# If you choose to access through `IP+Port`
http://your_server_ip:8088/
```

View File

@@ -0,0 +1,149 @@
# Deploy with Docker Compose
## Prerequisites
> Before installing Dify, make sure your machine meets the following minimum system requirements:
>
> * CPU >= 2 Core
> * RAM >= 4 GiB
| Operating System | Software | Explanation |
| -------------------------- | ------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the [Docker Desktop installation guide for Mac](https://docs.docker.com/desktop/mac/install/). |
| Linux platforms | <p>Docker 19.03 or later<br>Docker Compose 1.28 or later</p> | Please refer to the [Docker installation guide](https://docs.docker.com/engine/install/) and [the Docker Compose installation guide](https://docs.docker.com/compose/install/) for more information on how to install Docker and Docker Compose, respectively. |
| Windows with WSL 2 enabled | Docker Desktop | We recommend storing the source code and other data that is bound to Linux containers in the Linux file system rather than the Windows file system. For more information, please refer to the [Docker Desktop installation guide for using the WSL 2 backend on Windows.](https://docs.docker.com/desktop/windows/install/#wsl-2-backend) |
> \[!IMPORTANT]
>
> Dify 0.6.12 has introduced significant enhancements to Docker Compose deployment, designed to improve your setup and update experience. For more information, read the [README.md](https://github.com/langgenius/dify/blob/main/docker/README.md).
### Clone Dify
Clone the Dify source code to your local machine:
```bash
# Assuming current latest version is 0.15.3
git clone https://github.com/langgenius/dify.git --branch 0.15.3
```
### Starting Dify
1. Navigate to the Docker directory in the Dify source code
```bash
cd dify/docker
```
2. Copy the environment configuration file
```bash
cp .env.example .env
```
3. Start the Docker containers
Choose the appropriate command to start the containers based on the Docker Compose version on your system. You can check the version with `docker compose version`, and refer to the [Docker documentation](https://docs.docker.com/compose/install/) for more information:
* If you have Docker Compose V2, use the following command:
```bash
docker compose up -d
```
* If you have Docker Compose V1, use the following command:
```bash
docker-compose up -d
```
After executing the command, you should see output similar to the following, showing the status and port mappings of all containers:
```bash
[+] Running 11/11
✔ Network docker_ssrf_proxy_network Created 0.1s
✔ Network docker_default Created 0.0s
✔ Container docker-redis-1 Started 2.4s
✔ Container docker-ssrf_proxy-1 Started 2.8s
✔ Container docker-sandbox-1 Started 2.7s
✔ Container docker-web-1 Started 2.7s
✔ Container docker-weaviate-1 Started 2.4s
✔ Container docker-db-1 Started 2.7s
✔ Container docker-api-1 Started 6.5s
✔ Container docker-worker-1 Started 6.4s
✔ Container docker-nginx-1 Started 7.1s
```
Finally, check if all containers are running successfully:
```bash
docker compose ps
```
This includes 3 core services: `api / worker / web`, and 6 dependent components: `weaviate / db / redis / nginx / ssrf_proxy / sandbox`.
```bash
NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
docker-api-1 langgenius/dify-api:0.6.13 "/bin/bash /entrypoi…" api About a minute ago Up About a minute 5001/tcp
docker-db-1 postgres:15-alpine "docker-entrypoint.s…" db About a minute ago Up About a minute (healthy) 5432/tcp
docker-nginx-1 nginx:latest "sh -c 'cp /docker-e…" nginx About a minute ago Up About a minute 0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp
docker-redis-1 redis:6-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp
docker-sandbox-1 langgenius/dify-sandbox:0.2.1 "/main" sandbox About a minute ago Up About a minute
docker-ssrf_proxy-1 ubuntu/squid:latest "sh -c 'cp /docker-e…" ssrf_proxy About a minute ago Up About a minute 3128/tcp
docker-weaviate-1 semitechnologies/weaviate:1.19.0 "/bin/weaviate --hos…" weaviate About a minute ago Up About a minute
docker-web-1 langgenius/dify-web:0.6.13 "/bin/sh ./entrypoin…" web About a minute ago Up About a minute 3000/tcp
docker-worker-1 langgenius/dify-api:0.6.13 "/bin/bash /entrypoi…" worker About a minute ago Up About a minute 5001/tcp
```
With these steps, you should be able to install Dify successfully.
### Upgrade Dify
Enter the `docker` directory of the Dify source code and execute the following commands:
```bash
cd dify/docker
docker compose down
git pull origin main
docker compose pull
docker compose up -d
```
#### Sync Environment Variable Configuration (Important)
* If the `.env.example` file has been updated, be sure to modify your local `.env` file accordingly.
* Check and modify the configuration items in the `.env` file as needed to ensure they match your actual environment. You may need to add any new variables from `.env.example` to your `.env` file, and update any values that have changed.
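To audit configuration drift after an upgrade, the two files can be compared by variable name. A minimal sketch (not part of Dify) that parses simple `KEY=VALUE` lines and ignores comments:

```python
def missing_env_keys(example_text: str, env_text: str) -> list[str]:
    """Return variable names present in .env.example but absent from .env.

    Illustrative sketch: parses simple KEY=VALUE lines, skipping
    comments and blank lines; values are ignored, only keys compared.
    """
    def keys(text: str) -> set[str]:
        result = set()
        for line in text.splitlines():
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                result.add(line.split("=", 1)[0].strip())
        return result
    return sorted(keys(example_text) - keys(env_text))
```

For example, `missing_env_keys(open(".env.example").read(), open(".env").read())` lists any new variables you still need to add to your local `.env`.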
### Access Dify
Access the administrator initialization page to set up the admin account:
```bash
# Local environment
http://localhost/install
# Server environment
http://your_server_ip/install
```
Dify web interface address:
```bash
# Local environment
http://localhost
# Server environment
http://your_server_ip
```
### Customize Dify
Edit the environment variable values in your `.env` file directly. Then, restart Dify with:
```bash
docker compose down
docker compose up -d
```
The full set of annotated environment variables can be found in `docker/.env.example`.
### Read More
If you have any questions, please refer to [FAQs](faqs.md).

View File

@@ -0,0 +1,630 @@
# Environments
### Common Variables
#### CONSOLE_API_URL
The backend URL for the console API. This is used to construct the authorization callback. If left empty, it defaults to the same domain as the application. Example: `https://api.console.dify.ai`
#### CONSOLE_WEB_URL
The front-end URL of the console web interface. This is used to construct front-end addresses and for CORS configuration. If left empty, it defaults to the same domain as the application. Example: `https://console.dify.ai`
#### SERVICE_API_URL
The Service API URL, used to display Service API Base URL in the front-end. If left empty, it defaults to the same domain as the application. Example: `https://api.dify.ai`
#### APP_API_URL
The WebApp API backend URL, used to specify the backend URL for the front-end API. If left empty, it defaults to the same domain as the application. Example: `https://app.dify.ai`
#### APP_WEB_URL
The WebApp URL, used to display file preview or download URLs to the front-end and to provide them as multi-modal model inputs. If left empty, it defaults to the same domain as the application. Example: `https://udify.app/`
#### FILES_URL
The prefix for file preview or download URLs, used to display these URLs in the front-end and provide them as input for multi-modal models. To prevent forgery, image preview URLs are signed and expire after 5 minutes.
***
### Server
#### MODE
Startup mode: This is only available when launched using docker. It is not applicable when running from source code.
- api
Start API Server.
- worker
Start asynchronous queue worker.
#### DEBUG
Debug mode: Disabled by default. It's recommended to enable this setting during local development to prevent issues caused by monkey patching.
#### FLASK_DEBUG
Flask debug mode: When enabled, it outputs trace information in the API responses, facilitating easier debugging.
#### SECRET_KEY
A secret key used for securely signing session cookies and encrypting sensitive information in the database.
This variable must be set before the first launch.
Run `openssl rand -base64 42` to generate a strong key for it.
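Generating the key and writing it into `.env` can be done in one step; a sketch assuming GNU `sed` and that the file lives in the current directory (the `ENV_FILE` variable is just a convenience for pointing at a different path):

```shell
# Sketch: generate a strong random key and write it into .env
# (GNU sed syntax; creates the SECRET_KEY= line if it is missing).
ENV_FILE="${ENV_FILE:-.env}"
touch "$ENV_FILE"
grep -q '^SECRET_KEY=' "$ENV_FILE" || echo 'SECRET_KEY=' >> "$ENV_FILE"
SECRET_KEY="$(openssl rand -base64 42)"
sed -i "s|^SECRET_KEY=.*|SECRET_KEY=${SECRET_KEY}|" "$ENV_FILE"
```

Using `|` as the sed delimiter avoids clashes with the `/` characters that base64 output may contain.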
#### DEPLOY_ENV
Deployment environment:
- PRODUCTION (default)
Production environment.
- TESTING
Testing environment. There will be a distinct color label on the front-end page, indicating that this environment is a testing environment.
#### LOG_LEVEL
The log output level. Default is INFO. For production environments, it's recommended to set this to ERROR.
#### MIGRATION_ENABLED
When set to true, database migrations are automatically executed on container startup. This is only available when launched using docker and does not apply when running from source code.
For source code launches, you need to manually run `flask db upgrade` in the api directory.
#### CHECK_UPDATE_URL
Controls the version checking policy. If set to false, the system will not call `https://updates.dify.ai` to check for updates.
The version check endpoint is based on a Cloudflare Worker and is currently not directly accessible from mainland China. Setting this variable to an empty value disables the API call.
#### TEXT_GENERATION_TIMEOUT_MS
Default value: 60000 (milliseconds). Specifies the timeout for text generation and workflow processes. This setting prevents system-wide service disruptions caused by individual processes exceeding their allocated time.
#### OPENAI_API_BASE
Used to change the OpenAI base address, default is [https://api.openai.com/v1](https://api.openai.com/v1).
Replace it with a mirror address when OpenAI cannot be accessed directly, or with the address of a local model that provides an OpenAI-compatible API.
#### Container Startup Related Configuration
Only effective when starting with docker image or docker-compose.
- DIFY_BIND_ADDRESS
API service binding address, default: 0.0.0.0, i.e., all addresses can be accessed.
- DIFY_PORT
API service binding port number, defaults to 5001.
- SERVER_WORKER_AMOUNT
The number of API server workers, i.e., the number of gevent workers. Formula: `number of cpu cores x 2 + 1`
Reference: [https://docs.gunicorn.org/en/stable/design.html#how-many-workers](https://docs.gunicorn.org/en/stable/design.html#how-many-workers)
- SERVER_WORKER_CLASS
Defaults to gevent. On Windows, it can be switched to sync or solo.
- GUNICORN_TIMEOUT
Request handling timeout. Default is 200. Recommended value is 360 to support longer SSE (Server-Sent Events) connection times.
- CELERY_WORKER_CLASS
Similar to `SERVER_WORKER_CLASS`. Default is gevent. On Windows, it can be switched to sync or solo.
- CELERY_WORKER_AMOUNT
The number of Celery workers. The default is 1, and can be set as needed.
#### Database Configuration
The database uses PostgreSQL. Please use the public schema.
- DB_USERNAME: username
- DB_PASSWORD: password
- DB_HOST: database host
- DB_PORT: database port number, default is 5432
- DB_DATABASE: database name
- SQLALCHEMY_POOL_SIZE: The size of the database connection pool. The default is 30 connections, which can be appropriately increased.
- SQLALCHEMY_POOL_RECYCLE: Database connection pool recycling time, the default is 3600 seconds.
- SQLALCHEMY_ECHO: Whether to print SQL, default is false.
#### Redis Configuration
This Redis configuration is used for caching and for pub/sub during conversation.
- REDIS_HOST: Redis host
- REDIS_PORT: Redis port, default is 6379
- REDIS_DB: Redis Database, default is 0. Please use a different Database from Session Redis and Celery Broker.
- REDIS_USERNAME: Redis username, default is empty
- REDIS_PASSWORD: Redis password, default is empty. It is strongly recommended to set a password.
- REDIS_USE_SSL: Whether to use SSL protocol for connection, default is false
- REDIS_USE_SENTINEL: Use Redis Sentinel to connect to Redis servers
- REDIS_SENTINELS: Sentinel nodes, format: `<sentinel1_ip>:<sentinel1_port>,<sentinel2_ip>:<sentinel2_port>,<sentinel3_ip>:<sentinel3_port>`
- REDIS_SENTINEL_SERVICE_NAME: Sentinel service name, same as Master Name
- REDIS_SENTINEL_USERNAME: Username for Sentinel
- REDIS_SENTINEL_PASSWORD: Password for Sentinel
- REDIS_SENTINEL_SOCKET_TIMEOUT: Sentinel timeout, default value: 0.1, unit: seconds
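The comma-separated `<ip>:<port>` format of `REDIS_SENTINELS` can be parsed into the `(host, port)` pairs Sentinel clients typically expect; an illustrative sketch, not Dify's internal parser:

```python
def parse_sentinels(value: str) -> list[tuple[str, int]]:
    """Split a REDIS_SENTINELS-style string into (host, port) pairs.

    Mirrors the comma-separated `<ip>:<port>` format described above;
    illustrative only. rpartition on ':' keeps any colons in the host
    part intact.
    """
    pairs = []
    for item in value.split(","):
        item = item.strip()
        if not item:
            continue
        host, _, port = item.rpartition(":")
        pairs.append((host, int(port)))
    return pairs
```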
#### Celery Configuration
- CELERY_BROKER_URL
Format as follows (direct connection mode):
```
redis://<redis_username>:<redis_password>@<redis_host>:<redis_port>/<redis_database>
```
Example: `redis://:difyai123456@redis:6379/1`
Sentinel mode:
```
sentinel://<sentinel_username>:<sentinel_password>@<sentinel_host>:<sentinel_port>/<redis_database>
```
Example: `sentinel://localhost:26379/1;sentinel://localhost:26380/1;sentinel://localhost:26381/1`
- BROKER_USE_SSL
If set to true, use SSL protocol for connection, default is false
- CELERY_USE_SENTINEL
If set to true, Sentinel mode will be enabled, default is false
- CELERY_SENTINEL_MASTER_NAME
The service name of Sentinel, i.e., Master Name
- CELERY_SENTINEL_SOCKET_TIMEOUT
Timeout for connecting to Sentinel, default value: 0.1, unit: seconds
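The direct-connection broker URL format above can be assembled from its parts; a hypothetical helper for illustration:

```python
def redis_broker_url(username: str, password: str, host: str,
                     port: int, db: int) -> str:
    """Assemble a direct-connection CELERY_BROKER_URL.

    Hypothetical helper matching the
    redis://<user>:<password>@<host>:<port>/<db> format above.
    """
    return f"redis://{username}:{password}@{host}:{port}/{db}"
```

With an empty username this reproduces the documented example: `redis_broker_url("", "difyai123456", "redis", 6379, 1)` → `redis://:difyai123456@redis:6379/1`.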
#### CORS Configuration
Used to set the front-end cross-domain access policy.
- CONSOLE_CORS_ALLOW_ORIGINS
Console CORS cross-domain policy, default is `*`, that is, all domains can access.
- WEB_API_CORS_ALLOW_ORIGINS
WebAPP CORS cross-domain policy, default is `*`, that is, all domains can access.
#### File Storage Configuration
Used to store uploaded data set files, team/tenant encryption keys, and other files.
- STORAGE_TYPE
Type of storage facility
- local (default)
Local file storage, if this option is selected, the following `STORAGE_LOCAL_PATH` configuration needs to be set.
- s3
S3 object storage, if this option is selected, the following S3\_ prefixed configurations need to be set.
- azure-blob
Azure Blob object storage, if this option is selected, the following AZURE_BLOB\_ prefixed configurations need to be set.
- huawei-obs
Huawei OBS object storage, if this option is selected, the following HUAWEI_OBS\_ prefixed configurations need to be set.
- volcengine-tos
Volcengine TOS object storage, if this option is selected, the following VOLCENGINE_TOS\_ prefixed configurations need to be set.
- STORAGE_LOCAL_PATH
Defaults to `storage`, i.e., files are stored in the `storage` directory under the current directory.
If you are deploying with docker or docker-compose, be sure to mount the `/app/api/storage` directory in both containers to the same local directory, otherwise, you may encounter file not found errors.
- S3_ENDPOINT: S3 endpoint address
- S3_BUCKET_NAME: S3 bucket name
- S3_ACCESS_KEY: S3 Access Key
- S3_SECRET_KEY: S3 Secret Key
- S3_REGION: S3 region information, such as: us-east-1
- AZURE_BLOB_ACCOUNT_NAME: your account name, e.g. 'difyai'
- AZURE_BLOB_ACCOUNT_KEY: your account key, e.g. 'difyai'
- AZURE_BLOB_CONTAINER_NAME: your container name, e.g. 'difyai-container'
- AZURE_BLOB_ACCOUNT_URL: 'https://\<your_account_name>.blob.core.windows.net'
- ALIYUN_OSS_BUCKET_NAME: your bucket name, e.g. 'difyai'
- ALIYUN_OSS_ACCESS_KEY: your access key, e.g. 'difyai'
- ALIYUN_OSS_SECRET_KEY: your secret key, e.g. 'difyai'
- ALIYUN_OSS_ENDPOINT: https://oss-ap-southeast-1-internal.aliyuncs.com # reference: https://www.alibabacloud.com/help/en/oss/user-guide/regions-and-endpoints
- ALIYUN_OSS_REGION: ap-southeast-1 # reference: https://www.alibabacloud.com/help/en/oss/user-guide/regions-and-endpoints
- ALIYUN_OSS_AUTH_VERSION: v4
- ALIYUN_OSS_PATH: your-path # Don't start with '/'; OSS doesn't support a leading slash in object names. reference: https://www.alibabacloud.com/help/en/oss/support/0016-00000005
- HUAWEI_OBS_BUCKET_NAME: your bucket name, e.g. 'difyai'
- HUAWEI_OBS_SECRET_KEY: your secret key, e.g. 'difyai'
- HUAWEI_OBS_ACCESS_KEY: your access key, e.g. 'difyai'
- HUAWEI_OBS_SERVER: your server URL # reference: https://support.huaweicloud.com/sdk-python-devg-obs/obs_22_0500.html
- VOLCENGINE_TOS_BUCKET_NAME: your bucket name, e.g. 'difyai'
- VOLCENGINE_TOS_SECRET_KEY: your secret key, e.g. 'difyai'
- VOLCENGINE_TOS_ACCESS_KEY: your access key, e.g. 'difyai'
- VOLCENGINE_TOS_REGION: your region, e.g. 'cn-guangzhou' # reference: https://www.volcengine.com/docs/6349/107356
- VOLCENGINE_TOS_ENDPOINT: your endpoint, e.g. 'tos-cn-guangzhou.volces.com' # reference: https://www.volcengine.com/docs/6349/107356
#### Vector Database Configuration
- VECTOR_STORE
- **Available enumeration types include**
- `weaviate`
- `qdrant`
- `milvus`
- `zilliz` (share the same configuration as `milvus`)
- `myscale`
  - `pinecone` (not yet available)
- `analyticdb`
- `couchbase`
- WEAVIATE_ENDPOINT
Weaviate endpoint address, such as: `http://weaviate:8080`.
- WEAVIATE_API_KEY
The api-key credential used to connect to Weaviate.
- WEAVIATE_BATCH_SIZE
The number of index Objects created in batches in Weaviate, default is 100.
Refer to this document: [https://weaviate.io/developers/weaviate/manage-data/import#how-to-set-batch-parameters](https://weaviate.io/developers/weaviate/manage-data/import#how-to-set-batch-parameters)
- WEAVIATE_GRPC_ENABLED
Whether to use the gRPC method to interact with Weaviate, performance will greatly increase when enabled, may not be usable locally, default is true.
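Putting the Weaviate variables together, a minimal `.env` sketch might be (endpoint and key are placeholders):

```bash
VECTOR_STORE=weaviate
WEAVIATE_ENDPOINT=http://weaviate:8080
WEAVIATE_API_KEY=your-weaviate-api-key
WEAVIATE_BATCH_SIZE=100
WEAVIATE_GRPC_ENABLED=true
```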
- QDRANT_URL
Qdrant endpoint address, such as: `https://your-qdrant-cluster-url.qdrant.tech/`
- QDRANT_API_KEY
The api-key credential used to connect to Qdrant.
- PINECONE_API_KEY
The api-key credential used to connect to Pinecone.
- PINECONE_ENVIRONMENT
The environment where Pinecone is located, such as: `us-east4-gcp`
- MILVUS_URI
Milvus uri configuration. e.g. `http://host.docker.internal:19530`. For [Zilliz Cloud](https://docs.zilliz.com/docs/free-trials), adjust the uri and token to the Public Endpoint and API Key.
- MILVUS_TOKEN
Milvus token configuration, default is empty.
- MILVUS_USER
Milvus user configuration, default is empty.
- MILVUS_PASSWORD
Milvus password configuration, default is empty.
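As a sketch, connecting to Zilliz Cloud (managed Milvus) would only require the Public Endpoint and API Key, as noted above (values are placeholders):

```bash
VECTOR_STORE=milvus
MILVUS_URI=https://your-cluster-endpoint.zillizcloud.com
MILVUS_TOKEN=your-zilliz-api-key
```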
- MYSCALE_HOST
MyScale host configuration.
- MYSCALE_PORT
MyScale port configuration.
- MYSCALE_USER
MyScale user configuration, default is `default`.
- MYSCALE_PASSWORD
MyScale password configuration, default is empty.
- MYSCALE_DATABASE
MyScale database configuration, default is `default`.
- MYSCALE_FTS_PARAMS
MyScale text-search params, check [MyScale docs](https://myscale.com/docs/en/text-search/#understanding-fts-index-parameters) for multi-language support, default is empty.
- ANALYTICDB_KEY_ID
The access key ID used for Aliyun OpenAPI authentication. Read the [Analyticdb documentation](https://help.aliyun.com/zh/analyticdb/analyticdb-for-postgresql/support/create-an-accesskey-pair) to create your AccessKey.
- ANALYTICDB_KEY_SECRET
The access key secret used for Aliyun OpenAPI authentication.
- ANALYTICDB_INSTANCE_ID
The unique identifier for your AnalyticDB instance, such as : `gp-xxxxxx`. Read the [Analyticdb documentation](https://help.aliyun.com/zh/analyticdb/analyticdb-for-postgresql/getting-started/create-an-instance-1) to create your instance.
- ANALYTICDB_REGION_ID
The region identifier where the AnalyticDB instance is located, such as: `cn-hangzhou`.
- ANALYTICDB_ACCOUNT
The account name used to connect to the AnalyticDB instance. Read the [Analyticdb documentation](https://help.aliyun.com/zh/analyticdb/analyticdb-for-postgresql/getting-started/createa-a-privileged-account) to create an account.
- ANALYTICDB_PASSWORD
The password for the account used to connect to the AnalyticDB instance.
- ANALYTICDB_NAMESPACE
  The namespace (schema) within the AnalyticDB instance that you wish to interact with, such as `dify`. If this namespace does not exist, it will be created automatically.
- ANALYTICDB_NAMESPACE_PASSWORD
  The password for the namespace (schema). If the namespace does not exist, it will be created with this password.
- COUCHBASE_CONNECTION_STRING
The connection string for the Couchbase cluster.
- COUCHBASE_USER
The username for the database user.
- COUCHBASE_PASSWORD
The password for the database user.
- COUCHBASE_BUCKET_NAME
The name of the bucket to use.
- COUCHBASE_SCOPE_NAME
The name of the scope to use.
#### Knowledge Configuration
- UPLOAD_FILE_SIZE_LIMIT:
Upload file size limit, default 15M.
- UPLOAD_FILE_BATCH_LIMIT
The maximum number of files that can be uploaded at a time, default 5.
- ETL_TYPE
**Available enumeration types include:**
- dify
Dify's proprietary file extraction scheme
- Unstructured
Unstructured.io file extraction scheme
- UNSTRUCTURED_API_URL
Unstructured API path, needs to be configured when ETL_TYPE is Unstructured.
For example: `http://unstructured:8000/general/v0/general`
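For example, switching extraction to Unstructured.io would involve the following `.env` lines (the URL is the example value above; adjust it to your own deployment):

```bash
ETL_TYPE=Unstructured
UNSTRUCTURED_API_URL=http://unstructured:8000/general/v0/general
```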
#### Multi-modal Configuration
- MULTIMODAL_SEND_IMAGE_FORMAT
  The format used to send images to multi-modal models; default is `base64`, optionally `url`. Calls in `url` mode have lower latency than in `base64` mode, but the more compatible `base64` mode is generally recommended. If set to `url`, you must configure `FILES_URL` as an externally accessible address so that the multi-modal model can fetch the image.
- UPLOAD_IMAGE_FILE_SIZE_LIMIT
Upload image file size limit, default 10M.
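A hypothetical configuration for the lower-latency `url` mode, which requires an externally reachable `FILES_URL` (the domain is a placeholder):

```bash
MULTIMODAL_SEND_IMAGE_FORMAT=url
# Must be reachable by the model provider, e.g. your public API domain
FILES_URL=https://api.example.com
```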
#### Sentry Configuration
Used for application monitoring and error log tracking.
- SENTRY_DSN
Sentry DSN address, default is empty, when empty, all monitoring information is not reported to Sentry.
- SENTRY_TRACES_SAMPLE_RATE
  The sample rate for reporting Sentry events; for example, 0.01 means 1% of events are reported.
- SENTRY_PROFILES_SAMPLE_RATE
  The sample rate for reporting Sentry profiles; for example, 0.01 means 1% of profiles are reported.
#### Notion Integration Configuration
Notion integration configuration variables can be obtained by applying for Notion integration: [https://www.notion.so/my-integrations](https://www.notion.so/my-integrations)
- NOTION_INTEGRATION_TYPE: Configure as "public" or "internal". Since Notion's OAuth redirect URL only supports HTTPS, if deploying locally, please use Notion's internal integration.
- NOTION_CLIENT_SECRET: Notion OAuth client secret (used for public integration type)
- NOTION_CLIENT_ID: OAuth client ID (used for public integration type)
- NOTION_INTERNAL_SECRET: Notion internal integration secret. If the value of `NOTION_INTEGRATION_TYPE` is "internal", you need to configure this variable.
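For a local deployment, an internal-integration setup might look like this (the secret is a placeholder):

```bash
NOTION_INTEGRATION_TYPE=internal
NOTION_INTERNAL_SECRET=secret_xxxxxxxxxxxxxxxx
```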
#### Mail related configuration
- MAIL_TYPE
- resend
- MAIL_DEFAULT_SEND_FROM\
  The sender's name and email address, such as: no-reply [no-reply@dify.ai](mailto:no-reply@dify.ai); optional.
- RESEND_API_KEY\
  API key for the Resend email provider, which can be obtained from the Resend console.
- smtp
- SMTP_SERVER\
SMTP server address
- SMTP_PORT\
SMTP server port number
- SMTP_USERNAME\
SMTP username
- SMTP_PASSWORD\
SMTP password
- SMTP_USE_TLS\
Whether to use TLS, default is false
- MAIL_DEFAULT_SEND_FROM\
  The sender's name and email address, such as: no-reply [no-reply@dify.ai](mailto:no-reply@dify.ai); optional.
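A sketch of an SMTP configuration in `.env` (host, credentials, and sender address are placeholders):

```bash
MAIL_TYPE=smtp
SMTP_SERVER=smtp.example.com
SMTP_PORT=465
SMTP_USERNAME=no-reply@example.com
SMTP_PASSWORD=your-smtp-password
SMTP_USE_TLS=true
MAIL_DEFAULT_SEND_FROM=no-reply@example.com
```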
#### ModelProvider & Tool Position Configuration
Used to specify the model providers and tools that can be used in the app. These settings allow you to customize which tools and model providers are available, as well as their order and inclusion/exclusion in the app's interface.
For a list of available [tools](https://github.com/langgenius/dify/blob/main/api/core/tools/provider/_position.yaml) and [model providers](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/model_providers/_position.yaml), please refer to the provided links.
- POSITION_TOOL_PINS
Pin specific tools to the top of the list, ensuring they appear first in the interface. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_TOOL_PINS=bing,google`
- POSITION_TOOL_INCLUDES
Specify the tools to be included in the app. Only the tools listed here will be available for use. If not set, all tools will be included unless specified in POSITION_TOOL_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_TOOL_INCLUDES=bing,google`
- POSITION_TOOL_EXCLUDES
Exclude specific tools from being displayed or used in the app. Tools listed here will be omitted from the available options, except for pinned tools. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_TOOL_EXCLUDES=yahoo,wolframalpha`
- POSITION_PROVIDER_PINS
Pin specific model providers to the top of the list, ensuring they appear first in the interface. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_PINS=openai,openllm`
- POSITION_PROVIDER_INCLUDES
Specify the model providers to be included in the app. Only the providers listed here will be available for use. If not set, all providers will be included unless specified in POSITION_PROVIDER_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_INCLUDES=cohere,upstage`
- POSITION_PROVIDER_EXCLUDES
Exclude specific model providers from being displayed or used in the app. Providers listed here will be omitted from the available options, except for pinned providers. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_EXCLUDES=openrouter,ollama`
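Putting these together, a hypothetical setup that pins two providers and hides two tools would be:

```bash
POSITION_PROVIDER_PINS=openai,anthropic
POSITION_TOOL_EXCLUDES=yahoo,wolframalpha
```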
#### Others
- INVITE_EXPIRY_HOURS: Validity period of member invitation links (hours), default: 72.
- HTTP_REQUEST_NODE_MAX_TEXT_SIZE: The maximum text size of the HTTP request node in the workflow, default 1MB.
- HTTP_REQUEST_NODE_MAX_BINARY_SIZE: The maximum binary size of the HTTP request node in the workflow, default 10MB.
---
### Web Frontend
#### SENTRY_DSN
Sentry DSN address, default is empty, when empty, all monitoring information is not reported to Sentry.
## Deprecated
#### CONSOLE_URL
> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by: `CONSOLE_API_URL` and `CONSOLE_WEB_URL`.
Console URL, used for the authorization callback, the console front-end address, and the CORS configuration. If empty, it is the same domain. Example: `https://console.dify.ai`.
#### API_URL
> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `SERVICE_API_URL`.
API URL, used to display Service API Base URL to the front-end. If empty, it is the same domain. Example: `https://api.dify.ai`
#### APP_URL
> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `APP_API_URL` and `APP_WEB_URL`.
WebApp URL, used to display the WebApp API Base URL to the front-end. If empty, it is the same domain. Example: `https://udify.app/`
#### Session Configuration
> ⚠️ This configuration is no longer valid since v0.3.24.
Only used by the API service for interface identity verification.
- SESSION_TYPE
Session component type
- redis (default)
If you choose this, you need to set the environment variables starting with SESSION_REDIS\_ below.
- sqlalchemy
If you choose this, the current database connection will be used and the sessions table will be used to read and write session records.
- SESSION_REDIS_HOST: Redis host
- SESSION_REDIS_PORT: Redis port, default is 6379
- SESSION_REDIS_DB: Redis Database, default is 0. Please use a different Database from Redis and Celery Broker.
- SESSION_REDIS_USERNAME: Redis username, default is empty
- SESSION_REDIS_PASSWORD: Redis password, default is empty. It is strongly recommended to set a password.
- SESSION_REDIS_USE_SSL: Whether to use SSL protocol for connection, default is false
#### Cookie Policy Configuration
> ⚠️ This configuration is no longer valid since v0.3.24.
Used to set the browser policy for session cookies used for identity verification.
- COOKIE_HTTPONLY
Cookie HttpOnly configuration, default is true.
- COOKIE_SAMESITE
Cookie SameSite configuration, default is Lax.
- COOKIE_SECURE
Cookie Secure configuration, default is false.
### Chunk Length Configuration
#### MAXIMUM_CHUNK_TOKEN_LENGTH
Configuration for document chunk length. It is used to control the size of text segments when processing long documents. Default: 500. Maximum: 4000.
**Larger Chunks**
- Retain more context within each chunk, ideal for tasks requiring a broader understanding of the text.
- Reduce the total number of chunks, lowering processing time and storage overhead.
**Smaller Chunks**
- Provide finer granularity, improving accuracy for tasks like extraction or summarization.
- Reduce the risk of exceeding model token limits, making it safer for models with stricter constraints.
**Configuration Recommendations**
- Choose larger chunks for context-heavy tasks like sentiment analysis or document summarization.
- Choose smaller chunks for fine-grained tasks such as keyword extraction or paragraph-level processing.
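For instance, a fine-grained extraction workload could lower the limit in `.env` (the value shown is illustrative):

```bash
# Default is 500; maximum is 4000
MAXIMUM_CHUNK_TOKEN_LENGTH=256
```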
# FAQs
### 1. Not receiving reset password emails
You need to configure the `Mail` parameters in the `.env` file. For detailed instructions, please refer to ["Environment Variables Explanation: Mail-related configuration"](https://docs.dify.ai/getting-started/install-self-hosted/environments#mail-related-configuration).
After modifying the configuration, run the following commands to restart the service:
```bash
docker compose down
docker compose up -d
```
If you still have not received the email, check that the mail service is working properly and look in the spam folder.
### 2. How to handle if the workflow is too complex and exceeds the node limit?
In the Community Edition, you can manually adjust the `MAX_TREE_DEPTH` limit for single-branch depth in `web/app/components/workflow/constants.ts`. The default value is 50; note that excessively deep branches may affect performance in self-hosted scenarios.
### 3. How to specify the runtime for each workflow node?
You can modify the `TEXT_GENERATION_TIMEOUT_MS` variable in the `.env` file to adjust the runtime limit for each node. This helps prevent overall application unavailability caused by processes that time out.
### 4. How to reset the password of the admin account?
If you deployed using Docker Compose, you can reset the password with the following command while your Docker Compose is running:
```
docker exec -it docker-api-1 flask reset-password
```
It will prompt you to enter the email address and the new password. Example:
```
dify@my-pc:~/hello/dify/docker$ docker compose up -d
[+] Running 9/9
✔ Container docker-web-1 Started 0.1s
✔ Container docker-sandbox-1 Started 0.1s
✔ Container docker-db-1 Started 0.1s
✔ Container docker-redis-1 Started 0.1s
✔ Container docker-weaviate-1 Started 0.1s
✔ Container docker-ssrf_proxy-1 Started 0.1s
✔ Container docker-api-1 Started 0.1s
✔ Container docker-worker-1 Started 0.1s
✔ Container docker-nginx-1 Started 0.1s
dify@my-pc:~/hello/dify/docker$ docker exec -it docker-api-1 flask reset-password
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
sagemaker.config INFO - Not applying SDK defaults from location: /etc/xdg/sagemaker/config.yaml
sagemaker.config INFO - Not applying SDK defaults from location: /root/.config/sagemaker/config.yaml
Email: hello@dify.ai
New password: newpassword4567
Password confirm: newpassword4567
Password reset successfully.
```
### 5. How to Change the Port
If you're using Docker Compose, you can customize the access port by modifying the `.env` configuration file.
Adjust the Nginx port variables:
```bash
EXPOSE_NGINX_PORT=80
EXPOSE_NGINX_SSL_PORT=443
```
For other self-hosting issues, please check this document: [Self-Host Related](../../learn-more/faq/install-faq.md).
# FAQ
### 1. How to reset the password if the local deployment initialization fails with an incorrect password?
If deployed using Docker Compose, you can execute the following command to reset the password: `docker exec -it docker-api-1 flask reset-password`. Enter the account email and the new password twice, and it will be reset.
### 2. How to resolve File not found error in the log when deploying locally?
```
ERROR:root:Unknown Error in completion
Traceback (most recent call last):
File "/www/wwwroot/dify/dify/api/libs/rsa.py", line 45, in decrypt
private_key = storage.load(filepath)
File "/www/wwwroot/dify/dify/api/extensions/ext_storage.py", line 65, in load
raise FileNotFoundError("File not found")
FileNotFoundError: File not found
```
This error may be caused by switching deployment methods or by deleting the `api/storage/privkeys` file, which is used to encrypt model provider keys and cannot be recovered if lost. You can reset the encryption key pair with the following command:
* Docker compose deployment
```
docker exec -it docker-api-1 flask reset-encrypt-key-pair
```
* Source code startup
Enter the api directory
```
flask reset-encrypt-key-pair
```
Follow the prompts to reset.
### 3. Unable to log in after installation, or login succeeds but subsequent requests return 401?
This may be due to switching the domain name/website, causing cross-domain between front-end and server-side. Cross-domain and identity involve two configuration items:
**CORS cross-domain configuration**
- `CONSOLE_CORS_ALLOW_ORIGINS`: Console CORS policy; defaults to `*`, which allows access from all domains.
- `WEB_API_CORS_ALLOW_ORIGINS`: WebApp CORS policy; defaults to `*`, which allows access from all domains.
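For example, restricting each policy to a single trusted origin might look like this in `.env` (the domains are placeholders):

```bash
CONSOLE_CORS_ALLOW_ORIGINS=https://console.example.com
WEB_API_CORS_ALLOW_ORIGINS=https://app.example.com
```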
### 4. After starting, the page keeps loading and checking the request prompts CORS error?
This may be because the domain name/URL has been switched, resulting in cross-origin requests between the front end and the back end. Please change all of the following configuration items in `docker-compose.yml` to the new domain name:
- `CONSOLE_API_URL`: The backend URL of the console API.
- `CONSOLE_WEB_URL`: The front-end URL of the console web.
- `SERVICE_API_URL`: Service API URL.
- `APP_API_URL`: WebApp API backend URL.
- `APP_WEB_URL`: WebApp URL.
For more information, please check out: [Environments](environments.md)
### 5. How to upgrade version after deployment?
If you start up through images, please pull the latest images to complete the upgrade. If you start up through source code, please pull the latest code and then start up to complete the upgrade.
When deploying and updating local source code, you need to enter the API directory and execute the following command to migrate the database structure to the latest version:
`flask db upgrade`
### 6. How to configure the environment variables when using Notion import
**Q: What is the Notion's Integration configuration address?**
A: [https://www.notion.so/my-integrations](https://www.notion.so/my-integrations)
**Q: Which environment variables need to be configured**
A: Please set the configuration below when doing a private deployment:
1. **`NOTION_INTEGRATION_TYPE`**: The value should be configured as **public** or **internal**. Since the redirect address of Notion's OAuth only supports HTTPS, please use Notion's internal integration if deploying locally.
2. **`NOTION_CLIENT_SECRET`**: Notion OAuth client secret (used for the public integration type).
3. **`NOTION_CLIENT_ID`**: OAuth client ID (used for the public integration type).
4. **`NOTION_INTERNAL_SECRET`**: Notion internal integration secret. If the value of `NOTION_INTEGRATION_TYPE` is **internal**, you need to configure this variable.
### 7. How to change the name of the space in the local deployment version?
Modify in the `tenants` table in the database.
### 8. Where can I modify the domain name for accessing the application?
Find the configuration domain name `APP_WEB_URL` in `docker-compose.yaml`.
### 9. If database migration is required, what things need to be backed up?
The database, configured storage, and vector database data need to be backed up. If deployed in Docker Compose mode, all data content in the `dify/docker/volumes` directory can be directly backed up.
### 10. Why is Docker deploying Dify and starting OpenLLM locally using 127.0.0.1, but unable to access the local port?
`127.0.0.1` is the internal address of the container, and the server address configured by Dify requires the host LAN IP address.
### 11. How to solve the size and quantity limitations for uploading knowledge documents in the local deployment version
You can refer to the official website environment variable description document to configure:
[Environments](environments.md)
### 12. How does the local deployment edition invite members through email?
In the local deployment edition, members can be invited via email. After you enter the email address, the page displays an invitation link; copy the link and forward it to the user. Team members can open the link, log in via email, and set a password to access your space.
### 13. How to solve listen tcp4 0.0.0.0:80: bind: address already in use?
This is because the port is occupied. Use the `netstat -tunlp | grep 80` command to find the process occupying the port, then stop it. For example, if Apache or Nginx occupies the port, you can stop them with the `service apache2 stop` or `service nginx stop` commands.
### 14. What to do if this error occurs in text-to-speech?
```
[openai] Error: ffmpeg is not installed
```
Since OpenAI TTS has implemented audio stream segmentation, ffmpeg needs to be installed for normal use when deploying the source code. Here are the detailed steps:
**Windows:**
1. Visit the [FFmpeg official website](https://ffmpeg.org/download.html) and download the precompiled Windows shared library.
2. Download and unzip the FFmpeg folder, which will generate a folder similar to "ffmpeg-20200715-51db0a4-win64-static".
3. Move the unzipped folder to a location of your choice, for example, C:\Program Files.
4. Add the absolute path of the FFmpeg bin directory to the system environment variables.
5. Open the command prompt and enter "ffmpeg -version" to see if the FFmpeg version information is displayed, indicating successful installation.
**Ubuntu:**
1. Open the terminal.
2. Enter the following commands to install FFmpeg: `sudo apt-get update`, then enter `sudo apt-get install ffmpeg`.
3. Enter "ffmpeg -version" to check if it has been successfully installed.
**CentOS:**
1. First, you need to enable the EPEL repository. In the terminal, enter: `sudo yum install epel-release`
2. Then, enter: `sudo rpm -Uvh http://li.nux.ro/download/nux/dextop/el7/x86_64/nux-dextop-release-0-5.el7.nux.noarch.rpm`
3. Update the yum package, enter: `sudo yum update`
4. Finally, install FFmpeg, enter: `sudo yum install ffmpeg ffmpeg-devel`
5. Enter "ffmpeg -version" to check if it has been successfully installed.
**Mac OS X:**
1. Open the terminal.
2. If you haven't installed Homebrew yet, you can install it by entering the following command in the terminal: `/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"`
3. Install FFmpeg with Homebrew, enter: `brew install ffmpeg`
4. Enter "ffmpeg -version" to check if it has been successfully installed.
### 15. Migrate Vector Database to Another Vector Database
If you want to migrate the vector database from weaviate to another vector database, you need to migrate the data in the vector database. The following is the migration method:
Step:
1. If you are starting from local source code, modify the environment variable in the `.env` file to the vector database you want to migrate to, e.g. `VECTOR_STORE=qdrant`
2. If you are starting from Docker Compose, modify the environment variable in the `docker-compose.yaml` file for both the `api` and `worker` services to the vector database you want to migrate to, e.g.:
```
# The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`, `analyticdb`.
VECTOR_STORE: weaviate
```
3. run the below command in your terminal or docker container
```
flask vdb-migrate # or docker exec -it docker-api-1 flask vdb-migrate
```
**Tested target database:**
- qdrant
- milvus
- analyticdb
### 16. Why is SSRF_PROXY Needed?
You may have noticed the `SSRF_PROXY` environment variable in the `docker-compose.yaml` file. This is crucial because the local deployment of Dify uses `SSRF_PROXY` to prevent Server-Side Request Forgery (SSRF) attacks. For more details on SSRF attacks, refer to [this resource](https://portswigger.net/web-security/ssrf).
To reduce potential risks, we have set up a proxy for all services that could be vulnerable to SSRF attacks. This proxy ensures that services like Sandbox can only access external networks through it, thereby protecting your data and services. By default, this proxy does not intercept any local requests. However, you can customize the proxy's behavior by modifying the `squid` configuration file.
#### How to Customize the Proxy Behavior?
In the `docker/volumes/ssrf_proxy/squid.conf` file, you will find the configuration settings for the proxy. For example, if you want to allow the `192.168.101.0/24` network to be accessed by the proxy, but restrict access to an IP address `192.168.101.19` that contains sensitive data, you can add the following rules to `squid.conf`:
```plaintext
acl restricted_ip dst 192.168.101.19
acl localnet src 192.168.101.0/24
http_access deny restricted_ip
http_access allow localnet
http_access deny all
```
This is a basic example, and you can customize the rules to fit your specific needs. For more information about configuring `squid`, refer to the [official documentation](http://www.squid-cache.org/Doc/config/).
# Local Source Code Start
## Prerequisites
> Before installing Dify, make sure your machine meets the following minimum system requirements:
> - CPU >= 2 Core
> - RAM >= 4 GiB
| Operating System | Software | Explanation |
| -------------------------- | -------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the [Docker Desktop installation guide for Mac](https://docs.docker.com/desktop/mac/install/). |
| Linux platforms | <p>Docker 19.03 or later<br>Docker Compose 1.25.1 or later</p> | Please refer to the [Docker installation guide](https://docs.docker.com/engine/install/) and [the Docker Compose installation guide](https://docs.docker.com/compose/install/) for more information on how to install Docker and Docker Compose, respectively. |
| Windows with WSL 2 enabled | <p>Docker Desktop<br></p> | We recommend storing the source code and other data that is bound to Linux containers in the Linux file system rather than the Windows file system. For more information, please refer to the [Docker Desktop installation guide for using the WSL 2 backend on Windows.](https://docs.docker.com/desktop/windows/install/#wsl-2-backend) |
> If you need to use OpenAI TTS, `FFmpeg` must be installed on the system for it to function properly. For more details, refer to: [Link](https://docs.dify.ai/getting-started/install-self-hosted/install-faq#id-14.-what-to-do-if-this-error-occurs-in-text-to-speech).
### Clone Dify
```Bash
git clone https://github.com/langgenius/dify.git
```
Before enabling business services, we need to first deploy PostgreSQL / Redis / Weaviate (if not locally available). We can start them with the following commands:
```Bash
cd docker
cp middleware.env.example middleware.env
docker compose -f docker-compose.middleware.yaml up -d
```
---
### Server Deployment
- API Interface Service
- Worker Asynchronous Queue Consumption Service
#### Installation of the basic environment:
Server startup requires Python 3.12. It is recommended to use [pyenv](https://github.com/pyenv/pyenv) for quick installation of the Python environment.
To install additional Python versions, use pyenv install.
```Bash
pyenv install 3.12
```
To switch to the "3.12" Python environment, use the following command:
```Bash
pyenv global 3.12
```
#### Follow these steps:
1. Navigate to the "api" directory:
```
cd api
```
> For macOS: install libmagic with `brew install libmagic`.
2. Copy the environment variable configuration file:
```
cp .env.example .env
```
3. Generate a random secret key and replace the value of SECRET_KEY in the .env file:
```
awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' .env > temp_env && mv temp_env .env
```
4. Install the required dependencies:
Dify API service uses [Poetry](https://python-poetry.org/docs/) to manage dependencies. You can execute `poetry shell` to activate the environment.
```
poetry env use 3.12
poetry install
```
5. Perform the database migration:
Perform database migration to the latest version:
```
poetry shell
flask db upgrade
```
6. Start the API server:
```
flask run --host 0.0.0.0 --port=5001 --debug
```
output
```
 * Debug mode: on
INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5001
INFO:werkzeug:Press CTRL+C to quit
INFO:werkzeug: * Restarting with stat
WARNING:werkzeug: * Debugger is active!
INFO:werkzeug: * Debugger PIN: 695-801-919
```
7. Start the Worker service
To consume asynchronous tasks from the queue, such as dataset file import and dataset document updates, follow these steps to start the Worker service on Linux or macOS:
```
celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
```
If you are using a Windows system to start the Worker service, please use the following command instead:
```
celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
```
output:
```
-------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- macOS-10.16-x86_64-i386-64bit 2023-07-31 12:58:08
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: app:0x7fb568572a10
- ** ---------- .> transport: redis://:**@localhost:6379/1
- ** ---------- .> results: postgresql://postgres:**@localhost:5432/dify
- *** --- * --- .> concurrency: 1 (gevent)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> dataset exchange=dataset(direct) key=dataset
.> generation exchange=generation(direct) key=generation
.> mail exchange=mail(direct) key=mail
[tasks]
. tasks.add_document_to_index_task.add_document_to_index_task
. tasks.clean_dataset_task.clean_dataset_task
. tasks.clean_document_task.clean_document_task
. tasks.clean_notion_document_task.clean_notion_document_task
. tasks.create_segment_to_index_task.create_segment_to_index_task
. tasks.deal_dataset_vector_index_task.deal_dataset_vector_index_task
. tasks.document_indexing_sync_task.document_indexing_sync_task
. tasks.document_indexing_task.document_indexing_task
. tasks.document_indexing_update_task.document_indexing_update_task
. tasks.enable_segment_to_index_task.enable_segment_to_index_task
. tasks.generate_conversation_summary_task.generate_conversation_summary_task
. tasks.mail_invite_member_task.send_invite_member_mail_task
. tasks.remove_document_from_index_task.remove_document_from_index_task
. tasks.remove_segment_from_index_task.remove_segment_from_index_task
. tasks.update_segment_index_task.update_segment_index_task
. tasks.update_segment_keyword_index_task.update_segment_keyword_index_task
[2023-07-31 12:58:08,831: INFO/MainProcess] Connected to redis://:**@localhost:6379/1
[2023-07-31 12:58:08,840: INFO/MainProcess] mingle: searching for neighbors
[2023-07-31 12:58:09,873: INFO/MainProcess] mingle: all alone
[2023-07-31 12:58:09,886: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379/1.
[2023-07-31 12:58:09,890: INFO/MainProcess] celery@TAKATOST.lan ready.
```
---
## Deploy the frontend page
Start the web frontend client page service
#### Installation of the basic environment:
To start the web frontend service, you will need [Node.js v18.x (LTS)](http://nodejs.org/) and [NPM version 8.x.x](https://www.npmjs.com/) or [Yarn](https://yarnpkg.com/).
- Install Node.js and NPM
Visit [https://nodejs.org/en/download](https://nodejs.org/en/download) and choose the installation package for your operating system, v18.x or higher. The stable version is recommended, and it includes NPM by default.
#### Follow these steps:
1. Enter the web directory
```
cd web
```
2. Install the dependencies.
```
npm install
```
3. Configure the environment variables. Create a file named `.env.local` in the current directory and copy the contents from `.env.example`, then modify the values of these environment variables according to your requirements:
```
# For production release, change this to PRODUCTION
NEXT_PUBLIC_DEPLOY_ENV=DEVELOPMENT
# The deployment edition, SELF_HOSTED or CLOUD
NEXT_PUBLIC_EDITION=SELF_HOSTED
# The base URL of console application, refers to the Console base URL of WEB service if console domain is
# different from api or web app domain.
# example: http://cloud.dify.ai/console/api
NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api
# The URL for Web APP, refers to the Web App base URL of WEB service if web app domain is different from
# console or api domain.
# example: http://udify.app/api
NEXT_PUBLIC_PUBLIC_API_PREFIX=http://localhost:5001/api
# SENTRY
NEXT_PUBLIC_SENTRY_DSN=
NEXT_PUBLIC_SENTRY_ORG=
NEXT_PUBLIC_SENTRY_PROJECT=
```
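Before building, it can help to sanity-check the file. Here is a minimal sketch (the `check_env` helper is hypothetical, not part of Dify) that fails if either required URL prefix is missing or empty:

```shell
# check_env FILE: fail unless both required URL prefixes are set and non-empty.
check_env() {
  for key in NEXT_PUBLIC_API_PREFIX NEXT_PUBLIC_PUBLIC_API_PREFIX; do
    grep -Eq "^${key}=.+" "$1" || { echo "missing: ${key}" >&2; return 1; }
  done
  echo "env looks OK"
}

# Usage, from the web/ directory:
# check_env .env.local
```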
4. Build the code
```
npm run build
```
5. Start the web service
```
npm run start
# or
yarn start
# or
pnpm start
```
After successful startup, the terminal will output the following information:
```
ready - started server on 0.0.0.0:3000, url: http://localhost:3000
warn - You have enabled experimental feature (appDir) in next.config.js.
warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
info - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
```
### Access Dify
Finally, access [http://127.0.0.1:3000](http://127.0.0.1:3000/) to use the locally deployed Dify.

# Start the frontend Docker container separately
When developing the backend on its own, you may only need to run the backend service from source, without building and launching the frontend locally. In that case, you can start the frontend service directly by pulling its Docker image and running a container. Here are the specific steps:
#### Pull and run the Docker image for the frontend service from Docker Hub:
```bash
docker run -it -p 3000:3000 -e CONSOLE_URL=http://127.0.0.1:5001 -e APP_URL=http://127.0.0.1:5001 langgenius/dify-web:latest
```
#### Build Docker Image from Source Code
1. Build the frontend image
```
cd web && docker build . -t dify-web
```
2. Start the frontend image
```
docker run -it -p 3000:3000 -e CONSOLE_URL=http://127.0.0.1:5001 -e APP_URL=http://127.0.0.1:5001 dify-web
```
3. When the console domain and web app domain differ, you can set `CONSOLE_URL` and `APP_URL` separately
4. To access it locally, you can visit [http://127.0.0.1:3000](http://127.0.0.1:3000/)
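As an illustration of step 3, with hypothetical domains standing in for your own:

```shell
# Console and web app served from different domains: set each URL explicitly.
# (console.example.com / app.example.com are placeholders.)
docker run -it -p 3000:3000 \
  -e CONSOLE_URL=https://console.example.com \
  -e APP_URL=https://app.example.com \
  dify-web
```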

---
title: Welcome to Dify
description: "An overview of Dify, an open-source platform for building AI applications"
---
Dify is an open-source platform for building AI applications. We combine Backend-as-a-Service and LLMOps to streamline the development of generative AI solutions, making it accessible to both developers and non-technical innovators.
Our platform integrates:
- Support for mainstream LLMs
- An intuitive Prompt orchestration interface
- High-quality RAG engines
- A flexible AI Agent framework
- An intuitive low-code workflow
- Easy-to-use interfaces and APIs
With Dify, you can skip the complexity and focus on what matters most: creating innovative AI applications that solve real-world problems.
### The Advantage of Dify
While many AI development tools offer individual components, Dify provides a comprehensive, production-ready solution. Think of Dify as a well-designed scaffolding system, not just a toolbox.
As an open-source platform, Dify is co-created by a dedicated professional team and a vibrant community. This collaboration ensures rapid iteration, robust features, and a user-friendly interface.
With Dify, you can:
- Deploy capabilities similar to Assistants API and GPTs using any model
- Maintain full control over your data with flexible security options
- Leverage an intuitive interface for easy management and deployment
### Dify
<Info>
The name Dify comes from "Define + Modify", referring to defining and continuously improving your AI applications. It's made for you.
</Info>
Here's how various groups are leveraging Dify:
1. **Startups**: Rapidly prototype and iterate on AI ideas, accelerating both successes and failures. Numerous teams have used Dify to build MVPs, secure funding, and win customer contracts.
2. **Established Businesses**: Enhance existing applications with LLM capabilities. Use Dify's RESTful APIs to separate prompts from business logic, while utilizing our management interface to track data, costs, and usage.
3. **Enterprise AI infrastructure**: Banks and tech companies are deploying Dify as an internal LLM gateway, facilitating GenAI adoption with centralized governance.
4. **AI Enthusiasts and Learners**: Practice prompt engineering and explore agent technologies with ease. Over 60,000 developers built their first AI app on Dify even before GPTs were introduced. Since then, our community has grown significantly, now boasting over 180,000 developers and supporting 59,000+ end users.
Whether you're a startup founder, an enterprise developer, or an AI enthusiast, Dify is designed to meet your needs and accelerate your AI journey!
### Next Steps
- Read [**Quick Start**](https://docs.dify.ai/application/creating-an-application) for an overview of Dify's application-building workflow.
- Learn how to [**self-deploy Dify**](https://docs.dify.ai/getting-started/install-self-hosted) to your servers and [**integrate open-source models**](https://docs.dify.ai/advanced/model-configuration).
- Understand Dify's [**specifications and roadmap**](https://docs.dify.ai/getting-started/readme/features-and-specifications).
- [**Star us on GitHub**](https://github.com/langgenius/dify) and read our **Contributor Guidelines**.

# List of Model Providers
Dify supports the following model providers out of the box:
<table data-full-width="false">
<thead>
<tr>
<th align="center">Provider</th>
<th align="center">LLM</th>
<th align="center">Text Embedding</th>
<th align="center">Rerank</th>
<th align="center">Speech to text</th>
<th align="center">TTS</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center">OpenAI</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center">✔️</td>
</tr>
<tr>
<td align="center">Anthropic</td>
<td align="center">✔️(🛠️)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Azure OpenAI</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center">✔️</td>
</tr>
<tr>
<td align="center">Gemini</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Google Cloud</td>
<td align="center">✔️(👓)</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Nvidia API Catalog</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Nvidia NIM</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Nvidia Triton Inference Server</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">AWS Bedrock</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">OpenRouter</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Cohere</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">together.ai</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Ollama</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Mistral AI</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">GroqCloud</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Replicate</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Hugging Face</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Xorbits Inference</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
</tr>
<tr>
<td align="center">Zhipu AI</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Baichuan</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Spark</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Minimax</td>
<td align="center">✔️(🛠️)</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Tongyi</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center">✔️</td>
</tr>
<tr>
<td align="center">Wenxin</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Moonshot AI</td>
<td align="center">✔️(🛠️)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Tencent Cloud</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Stepfun</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">VolcanoEngine</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">01.AI</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">360 Zhinao</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Azure AI Studio</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">DeepSeek</td>
<td align="center">✔️(🛠️)</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Tencent Hunyuan</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">SILICONFLOW</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Jina AI</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">ChatGLM</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Xinference</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">OpenLLM</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">LocalAI</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
</tr>
<tr>
<td align="center">OpenAI API-Compatible</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center"></td>
</tr>
<tr>
<td align="center">PerfXCloud</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Lepton AI</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">novita.ai</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Amazon SageMaker</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">Text Embedding Inference</td>
<td align="center"></td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
<tr>
<td align="center">GPUStack</td>
<td align="center">✔️(🛠️)(👓)</td>
<td align="center">✔️</td>
<td align="center">✔️</td>
<td align="center"></td>
<td align="center"></td>
</tr>
</tbody>
</table>
Here, (🛠️) denotes support for function calling and (👓) denotes support for vision.
---
This table is continuously updated. We also keep track of model providers requested by community members [here](https://github.com/langgenius/dify/discussions/categories/ideas). If you'd like to see a model provider not listed above, please consider contributing by making a PR. To learn more, check out our [Contribution Guide](../../community/contribution.md).