| Please refer to the [aaPanel installation guide](https://www.aapanel.com/new/download.html#install) for more information on how to install aaPanel. |
-
-## Deployment
-
-1. Log in to aaPanel and click `Docker` in the menu bar
-
-2. The first time you will be prompted to install the `Docker` and `Docker Compose` services, click Install Now. If it is already installed, please ignore it.
-
-3. After the installation is complete, find `Dify` in `One-Click Install` and click `install`
-
-4. Configure basic information such as the domain name and port to complete the installation:
-
-- Name: the application name, which defaults to `Dify-` followed by random characters
-- Version: defaults to `latest`
-- Domain name: if you need to access Dify directly through a domain name, configure it here and resolve the domain name to the server
-- Allow external access: check this if you need direct access via `IP+Port`; if you have set up a domain name, leave it unchecked
-- Port: defaults to `8088` and can be modified as needed
-
-> \[!IMPORTANT]
->
-> The domain name is optional. If a domain name is filled in, the application can be managed through [Website] --> [Proxy Project], and you do not need to check [Allow external access]; otherwise, you must check it to access the application through the port.
-
-5. After submission, the panel will automatically initialize the application, which takes about `1-3` minutes.
-
-6. Once initialization is complete, enter the domain name or `IP+Port` set in the previous step in your browser to access Dify.
-
-### Access Dify
-
-Visit the administrator initialization page to set up the admin account:
-
-```bash
-# If you have set a domain name
-http://yourdomain/install
-
-# If you choose to access through `IP+Port`
-http://your_server_ip:8088/install
-```
-
-Dify web interface address:
-
-```bash
-# If you have set a domain name
-http://yourdomain/
-
-# If you choose to access through `IP+Port`
-http://your_server_ip:8088/
-```
diff --git a/en-us/getting-started/install-self-hosted/docker-compose.md b/en-us/getting-started/install-self-hosted/docker-compose.md
deleted file mode 100644
index c43d230e..00000000
--- a/en-us/getting-started/install-self-hosted/docker-compose.md
+++ /dev/null
@@ -1,149 +0,0 @@
-# Deploy with Docker Compose
-
-## Prerequisites
-
-> Before installing Dify, make sure your machine meets the following minimum system requirements:
->
-> * CPU >= 2 Core
-> * RAM >= 4 GiB
-
-| Operating System | Software | Explanation |
-| -------------------------- | ------------------------------------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the [Docker Desktop installation guide for Mac](https://docs.docker.com/desktop/mac/install/). |
-| Linux platforms | Docker 19.03 or later, Docker Compose 1.28 or later | Please refer to the [Docker installation guide](https://docs.docker.com/engine/install/) and [the Docker Compose installation guide](https://docs.docker.com/compose/install/) for more information on how to install Docker and Docker Compose, respectively. |
-| Windows with WSL 2 enabled | Docker Desktop | We recommend storing the source code and other data that is bound to Linux containers in the Linux file system rather than the Windows file system. For more information, please refer to the [Docker Desktop installation guide for using the WSL 2 backend on Windows.](https://docs.docker.com/desktop/windows/install/#wsl-2-backend) |
-
-> \[!IMPORTANT]
->
-> Dify 0.6.12 has introduced significant enhancements to Docker Compose deployment, designed to improve your setup and update experience. For more information, read the [README.md](https://github.com/langgenius/dify/blob/main/docker/README.md).
-
-### Clone Dify
-
-Clone the Dify source code to your local machine:
-
-```bash
-# Assuming current latest version is 0.15.3
-git clone https://github.com/langgenius/dify.git --branch 0.15.3
-```
-
-### Starting Dify
-
-1. Navigate to the Docker directory in the Dify source code
-
- ```bash
- cd dify/docker
- ```
-2. Copy the environment configuration file
-
- ```bash
- cp .env.example .env
- ```
-3. Start the Docker containers
-
- Choose the appropriate command to start the containers based on the Docker Compose version on your system. You can use the `$ docker compose version` command to check the version, and refer to the [Docker documentation](https://docs.docker.com/compose/install/) for more information:
-
- * If you have Docker Compose V2, use the following command:
-
- ```bash
- docker compose up -d
- ```
-
- * If you have Docker Compose V1, use the following command:
-
- ```bash
- docker-compose up -d
- ```
-
-After executing the command, you should see output similar to the following, showing the status and port mappings of all containers:
-
-```bash
-[+] Running 11/11
- ✔ Network docker_ssrf_proxy_network Created 0.1s
- ✔ Network docker_default Created 0.0s
- ✔ Container docker-redis-1 Started 2.4s
- ✔ Container docker-ssrf_proxy-1 Started 2.8s
- ✔ Container docker-sandbox-1 Started 2.7s
- ✔ Container docker-web-1 Started 2.7s
- ✔ Container docker-weaviate-1 Started 2.4s
- ✔ Container docker-db-1 Started 2.7s
- ✔ Container docker-api-1 Started 6.5s
- ✔ Container docker-worker-1 Started 6.4s
- ✔ Container docker-nginx-1 Started 7.1s
-```
-
-Finally, check if all containers are running successfully:
-
-```bash
-docker compose ps
-```
-
-This includes 3 core services: `api / worker / web`, and 6 dependent components: `weaviate / db / redis / nginx / ssrf_proxy / sandbox`.
-
-```bash
-NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
-docker-api-1 langgenius/dify-api:0.6.13 "/bin/bash /entrypoi…" api About a minute ago Up About a minute 5001/tcp
-docker-db-1 postgres:15-alpine "docker-entrypoint.s…" db About a minute ago Up About a minute (healthy) 5432/tcp
-docker-nginx-1 nginx:latest "sh -c 'cp /docker-e…" nginx About a minute ago Up About a minute 0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp
-docker-redis-1 redis:6-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp
-docker-sandbox-1 langgenius/dify-sandbox:0.2.1 "/main" sandbox About a minute ago Up About a minute
-docker-ssrf_proxy-1 ubuntu/squid:latest "sh -c 'cp /docker-e…" ssrf_proxy About a minute ago Up About a minute 3128/tcp
-docker-weaviate-1 semitechnologies/weaviate:1.19.0 "/bin/weaviate --hos…" weaviate About a minute ago Up About a minute
-docker-web-1 langgenius/dify-web:0.6.13 "/bin/sh ./entrypoin…" web About a minute ago Up About a minute 3000/tcp
-docker-worker-1 langgenius/dify-api:0.6.13 "/bin/bash /entrypoi…" worker About a minute ago Up About a minute 5001/tcp
-```
-
-With these steps, you should be able to install Dify successfully.
-
-### Upgrade Dify
-
-Enter the `docker` directory of the Dify source code and execute the following commands:
-
-```bash
-cd dify/docker
-docker compose down
-git pull origin main
-docker compose pull
-docker compose up -d
-```
-
-#### Sync Environment Variable Configuration (Important)
-
-* If the `.env.example` file has been updated, be sure to modify your local `.env` file accordingly.
-* Check and modify the configuration items in the `.env` file as needed to ensure they match your actual environment. You may need to add any new variables from `.env.example` to your `.env` file, and update any values that have changed.
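As a quick sanity check, you can list the variable names that exist in `.env.example` but not yet in your `.env`. This is a sketch demonstrated on stand-in files so it can be run anywhere; in a real deployment, point the paths at the two files in `dify/docker`:

```shell
# Stand-in files for demonstration; replace these with
# dify/docker/.env and dify/docker/.env.example in a real deployment
printf 'SECRET_KEY=abc\nDB_HOST=db\n' > /tmp/env_current
printf 'SECRET_KEY=\nDB_HOST=localhost\nNEW_FLAG=false\n' > /tmp/env_example

# Extract the variable names from each file, sorted and deduplicated
grep -oE '^[A-Za-z_][A-Za-z0-9_]*' /tmp/env_current | sort -u > /tmp/vars_current
grep -oE '^[A-Za-z_][A-Za-z0-9_]*' /tmp/env_example | sort -u > /tmp/vars_example

# Print names present only in the example file, i.e. newly added
# variables you still need to copy over
comm -13 /tmp/vars_current /tmp/vars_example
# prints: NEW_FLAG
```

Any name printed by the last command is a variable you should add to your `.env` before restarting the stack.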
-
-### Access Dify
-
-Visit the administrator initialization page to set up the admin account:
-
-```bash
-# Local environment
-http://localhost/install
-
-# Server environment
-http://your_server_ip/install
-```
-
-Dify web interface address:
-
-```bash
-# Local environment
-http://localhost
-
-# Server environment
-http://your_server_ip
-```
-
-### Customize Dify
-
-Edit the environment variable values in your `.env` file directly. Then, restart Dify with:
-
-```
-docker compose down
-docker compose up -d
-```
-
-The full set of annotated environment variables can be found in `docker/.env.example`.
-
-### Read More
-
-If you have any questions, please refer to [FAQs](faqs.md).
diff --git a/en-us/getting-started/install-self-hosted/install-faq.md b/en-us/getting-started/install-self-hosted/install-faq.md
deleted file mode 100644
index 035517b1..00000000
--- a/en-us/getting-started/install-self-hosted/install-faq.md
+++ /dev/null
@@ -1,187 +0,0 @@
-# FAQ
-
-### 1. How to reset the password if the local deployment initialization fails with an incorrect password?
-
-If you deployed with Docker Compose, you can reset the password with the following command: `docker exec -it docker-api-1 flask reset-password`. Enter the account email and the new password twice, and the password will be reset.
-
-### 2. How to resolve the `File not found` error in the log when deploying locally?
-
-```
-ERROR:root:Unknown Error in completion
-Traceback (most recent call last):
- File "/www/wwwroot/dify/dify/api/libs/rsa.py", line 45, in decrypt
- private_key = storage.load(filepath)
- File "/www/wwwroot/dify/dify/api/extensions/ext_storage.py", line 65, in load
- raise FileNotFoundError("File not found")
-FileNotFoundError: File not found
-```
-
-This error may be caused by switching deployment methods, or by deleting the `api/storage/privkeys` file, which is used to encrypt large model keys and cannot be recovered if lost. You can reset the encryption key pair with the following command:
-
-* Docker compose deployment
-
-```
-docker exec -it docker-api-1 flask reset-encrypt-key-pair
-```
-
-* Source code startup
-
-Enter the api directory
-
-```
-flask reset-encrypt-key-pair
-```
-
-Follow the prompts to reset.
-
-### 3. Unable to log in after installation, or login succeeds but subsequent requests return 401?
-
-This may be caused by switching the domain name or website URL, resulting in cross-origin issues between the frontend and the backend. Cross-origin requests and authentication involve the following configuration items:
-
-**CORS configuration**
-
-* `CONSOLE_CORS_ALLOW_ORIGINS`: the console CORS policy, which defaults to `*`, allowing access from all domains.
-* `WEB_API_CORS_ALLOW_ORIGINS`: the WebApp CORS policy, which defaults to `*`, allowing access from all domains.
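If you prefer to restrict these instead of using the `*` wildcard, set them to the exact origins you serve. The domains below are hypothetical placeholders:

```
CONSOLE_CORS_ALLOW_ORIGINS=https://console.dify.example.com
WEB_API_CORS_ALLOW_ORIGINS=https://app.dify.example.com
```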
-
-### 4. After starting, the page keeps loading and the requests report a CORS error?
-
-This may be because the domain name or URL has been switched, resulting in cross-origin requests between the frontend and the backend. Please change all of the following configuration items in `docker-compose.yml` to the new domain name:
-
-* `CONSOLE_API_URL`: the backend URL of the console API
-* `CONSOLE_WEB_URL`: the frontend URL of the console web
-* `SERVICE_API_URL`: the Service API URL
-* `APP_API_URL`: the WebApp API backend URL
-* `APP_WEB_URL`: the WebApp URL
-
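For illustration, assuming a hypothetical domain `dify.example.com`, the relevant entries in `docker-compose.yml` would look like:

```
CONSOLE_API_URL: https://dify.example.com
CONSOLE_WEB_URL: https://dify.example.com
SERVICE_API_URL: https://dify.example.com
APP_API_URL: https://dify.example.com
APP_WEB_URL: https://dify.example.com
```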
-For more information, please check out: [Environments](environments.md)
-
-### 5. How to upgrade version after deployment?
-
-If you start up through images, please pull the latest images to complete the upgrade. If you start up through source code, please pull the latest code and then start up to complete the upgrade.
-
-When deploying and updating from local source code, you need to enter the `api` directory and execute the following command to migrate the database structure to the latest version:
-
-`flask db upgrade`
-
-### 6. How to configure the environment variables when using Notion import
-
-**Q: What is the Notion's Integration configuration address?**
-
-A: [https://www.notion.so/my-integrations](https://www.notion.so/my-integrations)
-
-**Q: Which environment variables need to be configured?**
-
-A: Please set the configuration below when doing a private deployment:
-
-1. **`NOTION_INTEGRATION_TYPE`**: the value should be configured as **public** or **internal**. Since the redirect address of Notion's OAuth only supports HTTPS, use Notion's internal integration for local deployments.
-2. **`NOTION_CLIENT_SECRET`**: the Notion OAuth client secret (used for the public integration type).
-3. **`NOTION_CLIENT_ID`**: the OAuth client ID (used for the public integration type).
-4. **`NOTION_INTERNAL_SECRET`**: the Notion internal integration secret. If the value of `NOTION_INTEGRATION_TYPE` is **internal**, you need to configure this variable.
-
-### 7. How to change the name of the space in the local deployment version?
-
-Modify in the `tenants` table in the database.
-
-### 8. Where can I modify the domain name for accessing the application?
-
-Find the `APP_WEB_URL` configuration in `docker-compose.yaml`.
-
-### 9. If database migration is required, what things need to be backed up?
-
-The database, configured storage, and vector database data need to be backed up. If deployed in Docker Compose mode, all data content in the `dify/docker/volumes` directory can be directly backed up.
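As a sketch, a dated archive of the volumes directory can be created as below. A stand-in directory is used here so the commands run anywhere; in a real deployment, `cd` into `dify/docker` and stop the stack with `docker compose down` first so the files are consistent:

```shell
# Stand-in volumes directory for demonstration purposes only;
# in a real deployment, work inside dify/docker instead
mkdir -p /tmp/dify-demo/volumes/db
echo 'sample data' > /tmp/dify-demo/volumes/db/sample.txt
cd /tmp/dify-demo

# Archive everything under volumes/ into a dated tarball
tar -czf "dify-backup-$(date +%Y%m%d).tgz" volumes
ls dify-backup-*.tgz
```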
-
-### 10. Why can't Dify, deployed with Docker, access an OpenLLM service started locally on `127.0.0.1`?
-
-`127.0.0.1` is the container's internal address; the server address configured in Dify needs to be the host's LAN IP address.
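On a typical Linux host, the LAN IP to use can be found like this (a sketch; `hostname -I` is Linux-specific):

```shell
# Print the host's primary LAN IPv4 address; configure this address
# (not 127.0.0.1) as the server address for OpenLLM or similar services
hostname -I | awk '{print $1}'
```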
-
-### 11. How to solve the size and quantity limitations for uploading knowledge documents in the local deployment version?
-
-You can refer to the official website environment variable description document to configure:
-
-[Environments](environments.md)
-
-### 12. How does the local deployment edition invite members through email?
-
-In the local deployment edition, members can be invited by email. After you enter the invitee's email address, the page displays an invitation link; copy the link and forward it to the user. The invited member can open the link, log in via email, set a password, and join your workspace.
-
-### 13. How to solve listen tcp4 0.0.0.0:80: bind: address already in use?
-
-This is because the port is already occupied. You can use the `netstat -tunlp | grep 80` command to find the process that occupies the port, and then kill it. For example, if Apache or Nginx occupies the port, you can stop it with `service apache2 stop` or `service nginx stop`.
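On newer distributions where `netstat` is not installed, `ss` provides the same information (a sketch; run with `sudo` to see process names for services owned by other users):

```shell
# Show which process is listening on port 80, or report that the
# port is free when nothing is bound to it
ss -tunlp | grep ':80 ' || echo 'port 80 is free'
```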
-
-### 14. What to do if this error occurs in text-to-speech?
-
-```
-[openai] Error: ffmpeg is not installed
-```
-
-Since OpenAI TTS has implemented audio stream segmentation, ffmpeg needs to be installed for normal use when deploying the source code. Here are the detailed steps:
-
-**Windows:**
-
-1. Visit the [FFmpeg official website](https://ffmpeg.org/download.html) and download the precompiled Windows shared library.
-2. Download and unzip the FFmpeg folder, which will generate a folder similar to "ffmpeg-20200715-51db0a4-win64-static".
-3. Move the unzipped folder to a location of your choice, for example, C:\Program Files.
-4. Add the absolute path of the FFmpeg bin directory to the system environment variables.
-5. Open the command prompt and enter "ffmpeg -version" to see if the FFmpeg version information is displayed, indicating successful installation.
-
-**Ubuntu:**
-
-1. Open the terminal.
-2. Enter the following commands to install FFmpeg: `sudo apt-get update`, then enter `sudo apt-get install ffmpeg`.
-3. Enter "ffmpeg -version" to check if it has been successfully installed.
-
-**CentOS:**
-
-1. First, you need to enable the EPEL repository. In the terminal, enter: `sudo yum install epel-release`
-2. Then, enter: `sudo rpm -Uvh http://li.nux.ro/download/nux/dextop/el7/x86_64/nux-dextop-release-0-5.el7.nux.noarch.rpm`
-3. Update the yum package, enter: `sudo yum update`
-4. Finally, install FFmpeg, enter: `sudo yum install ffmpeg ffmpeg-devel`
-5. Enter "ffmpeg -version" to check if it has been successfully installed.
-
-**Mac OS X:**
-
-1. Open the terminal.
-2. If you haven't installed Homebrew yet, you can install it by entering the following command in the terminal: `/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"`
-3. Install FFmpeg with Homebrew, enter: `brew install ffmpeg`
-4. Enter "ffmpeg -version" to check if it has been successfully installed.
-
-### 15. Migrate Vector Database to Another Vector Database
-
-If you want to migrate from Weaviate to another vector database, you need to migrate the data it contains. The following is the migration method:
-
-Steps:
-
-1. If you are starting from local source code, modify the environment variable in the `.env` file to the vector database you want to migrate to, e.g. `VECTOR_STORE=qdrant`.
-2. If you are starting from Docker Compose, modify the environment variable in `docker-compose.yaml` to the vector database you want to migrate to; this is needed for both the `api` and `worker` services, e.g.:
-
-```
-# The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`, `analyticdb`.
-VECTOR_STORE: weaviate
-```
-
-3. Run the command below in your terminal or in the Docker container:
-
-```
-flask vdb-migrate # or docker exec -it docker-api-1 flask vdb-migrate
-```
-
-**Tested target databases:**
-
-- qdrant
-- milvus
-- analyticdb
-
-### 16. Why is SSRF_PROXY Needed?
-
-You may have noticed the `SSRF_PROXY` environment variable in the `docker-compose.yaml` file. This is crucial because the local deployment of Dify uses `SSRF_PROXY` to prevent Server-Side Request Forgery (SSRF) attacks. For more details on SSRF attacks, refer to [this resource](https://portswigger.net/web-security/ssrf).
-
-To reduce potential risks, we have set up a proxy for all services that could be vulnerable to SSRF attacks. This proxy ensures that services like Sandbox can only access external networks through it, thereby protecting your data and services. By default, this proxy does not intercept any local requests. However, you can customize the proxy's behavior by modifying the `squid` configuration file.
-
-#### How to Customize the Proxy Behavior?
-
-In the `docker/volumes/ssrf_proxy/squid.conf` file, you will find the configuration settings for the proxy. For example, if you want to allow the `192.168.101.0/24` network to be accessed by the proxy, but restrict access to an IP address `192.168.101.19` that contains sensitive data, you can add the following rules to `squid.conf`:
-
-```plaintext
-acl restricted_ip dst 192.168.101.19
-acl localnet src 192.168.101.0/24
-
-http_access deny restricted_ip
-http_access allow localnet
-http_access deny all
-```
-
-This is a basic example, and you can customize the rules to fit your specific needs. For more information about configuring `squid`, refer to the [official documentation](http://www.squid-cache.org/Doc/config/).
-
diff --git a/en-us/getting-started/install-self-hosted/local-source-code.md b/en-us/getting-started/install-self-hosted/local-source-code.md
deleted file mode 100644
index c38487c1..00000000
--- a/en-us/getting-started/install-self-hosted/local-source-code.md
+++ /dev/null
@@ -1,248 +0,0 @@
-# Local Source Code Start
-
-## Prerequisites
-
-> Before installing Dify, make sure your machine meets the following minimum system requirements:
-> - CPU >= 2 Core
-> - RAM >= 4 GiB
-
-| Operating System | Software | Explanation |
-| -------------------------- | -------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the [Docker Desktop installation guide for Mac](https://docs.docker.com/desktop/mac/install/). |
-| Linux platforms | Docker 19.03 or later, Docker Compose 1.25.1 or later | Please refer to the [Docker installation guide](https://docs.docker.com/engine/install/) and [the Docker Compose installation guide](https://docs.docker.com/compose/install/) for more information on how to install Docker and Docker Compose, respectively. |
-| Windows with WSL 2 enabled | Docker Desktop | We recommend storing the source code and other data that is bound to Linux containers in the Linux file system rather than the Windows file system. For more information, please refer to the [Docker Desktop installation guide for using the WSL 2 backend on Windows.](https://docs.docker.com/desktop/windows/install/#wsl-2-backend) |
-
-> If you need to use OpenAI TTS, `FFmpeg` must be installed on the system for it to function properly. For more details, refer to: [Link](https://docs.dify.ai/getting-started/install-self-hosted/install-faq#id-14.-what-to-do-if-this-error-occurs-in-text-to-speech).
-
-### Clone Dify
-
-```Bash
-git clone https://github.com/langgenius/dify.git
-```
-
-Before starting the core services, we need to first deploy PostgreSQL / Redis / Weaviate (if not already available locally). We can start them with the following commands:
-
-```Bash
-cd docker
-cp middleware.env.example middleware.env
-docker compose -f docker-compose.middleware.yaml up -d
-```
-
----
-
-### Server Deployment
-
-- API Interface Service
-- Worker Asynchronous Queue Consumption Service
-
-#### Installation of the basic environment:
-
-Server startup requires Python 3.12. It is recommended to use [pyenv](https://github.com/pyenv/pyenv) for quick installation of the Python environment.
-
-To install additional Python versions, use pyenv install.
-
-```Bash
-pyenv install 3.12
-```
-
-To switch to the "3.12" Python environment, use the following command:
-
-```Bash
-pyenv global 3.12
-```
-
-#### Follow these steps:
-
-1. Navigate to the "api" directory:
-
- ```
- cd api
- ```
-
-> For macOS: install libmagic with `brew install libmagic`.
-
-2. Copy the environment variable configuration file:
-
- ```
- cp .env.example .env
- ```
-
-3. Generate a random secret key and replace the value of SECRET_KEY in the .env file:
-
- ```
- awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' .env > temp_env && mv temp_env .env
- ```
-
-4. Install the required dependencies:
-
- Dify API service uses [Poetry](https://python-poetry.org/docs/) to manage dependencies. You can execute `poetry shell` to activate the environment.
-
- ```
- poetry env use 3.12
- poetry install
- ```
-
-5. Perform the database migration:
-
-    Migrate the database structure to the latest version:
-
- ```
- poetry shell
- flask db upgrade
- ```
-
-6. Start the API server:
-
- ```
- flask run --host 0.0.0.0 --port=5001 --debug
- ```
-
- output:
-
- ```
- * Debug mode: on
- INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
- * Running on all addresses (0.0.0.0)
- * Running on http://127.0.0.1:5001
- INFO:werkzeug:Press CTRL+C to quit
- INFO:werkzeug: * Restarting with stat
- WARNING:werkzeug: * Debugger is active!
- INFO:werkzeug: * Debugger PIN: 695-801-919
- ```
-
-7. Start the Worker service
-
- To consume asynchronous tasks from the queue, such as dataset file import and dataset document updates, follow these steps to start the Worker service on Linux or macOS:
-
- ```
- celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
- ```
-
- If you are using a Windows system to start the Worker service, please use the following command instead:
-
- ```
- celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
- ```
-
- output:
-
- ```
- -------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
- --- ***** -----
- -- ******* ---- macOS-10.16-x86_64-i386-64bit 2023-07-31 12:58:08
- - *** --- * ---
- - ** ---------- [config]
- - ** ---------- .> app: app:0x7fb568572a10
- - ** ---------- .> transport: redis://:**@localhost:6379/1
- - ** ---------- .> results: postgresql://postgres:**@localhost:5432/dify
- - *** --- * --- .> concurrency: 1 (gevent)
- -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
- --- ***** -----
- -------------- [queues]
- .> dataset exchange=dataset(direct) key=dataset
- .> generation exchange=generation(direct) key=generation
- .> mail exchange=mail(direct) key=mail
-
- [tasks]
- . tasks.add_document_to_index_task.add_document_to_index_task
- . tasks.clean_dataset_task.clean_dataset_task
- . tasks.clean_document_task.clean_document_task
- . tasks.clean_notion_document_task.clean_notion_document_task
- . tasks.create_segment_to_index_task.create_segment_to_index_task
- . tasks.deal_dataset_vector_index_task.deal_dataset_vector_index_task
- . tasks.document_indexing_sync_task.document_indexing_sync_task
- . tasks.document_indexing_task.document_indexing_task
- . tasks.document_indexing_update_task.document_indexing_update_task
- . tasks.enable_segment_to_index_task.enable_segment_to_index_task
- . tasks.generate_conversation_summary_task.generate_conversation_summary_task
- . tasks.mail_invite_member_task.send_invite_member_mail_task
- . tasks.remove_document_from_index_task.remove_document_from_index_task
- . tasks.remove_segment_from_index_task.remove_segment_from_index_task
- . tasks.update_segment_index_task.update_segment_index_task
- . tasks.update_segment_keyword_index_task.update_segment_keyword_index_task
-
- [2023-07-31 12:58:08,831: INFO/MainProcess] Connected to redis://:**@localhost:6379/1
- [2023-07-31 12:58:08,840: INFO/MainProcess] mingle: searching for neighbors
- [2023-07-31 12:58:09,873: INFO/MainProcess] mingle: all alone
- [2023-07-31 12:58:09,886: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379/1.
- [2023-07-31 12:58:09,890: INFO/MainProcess] celery@TAKATOST.lan ready.
- ```
-
----
-
-## Deploy the frontend page
-
-Start the web frontend client service.
-
-#### Installation of the basic environment:
-
-To start the web frontend service, you will need [Node.js v18.x (LTS)](http://nodejs.org/) and [NPM version 8.x.x](https://www.npmjs.com/) or [Yarn](https://yarnpkg.com/).
-
-- Install NodeJS + NPM
-
-Please visit [https://nodejs.org/en/download](https://nodejs.org/en/download) and choose the installation package for your respective operating system that is v18.x or higher. It is recommended to download the stable version, which includes NPM by default.
-
-#### Follow these steps:
-
-1. Enter the web directory
-
- ```
- cd web
- ```
-
-2. Install the dependencies.
-
- ```
- npm install
- ```
-
-3. Configure the environment variables. Create a file named .env.local in the current directory and copy the contents from .env.example. Modify the values of these environment variables according to your requirements:
-
- ```
- # For production release, change this to PRODUCTION
- NEXT_PUBLIC_DEPLOY_ENV=DEVELOPMENT
- # The deployment edition, SELF_HOSTED or CLOUD
- NEXT_PUBLIC_EDITION=SELF_HOSTED
- # The base URL of console application, refers to the Console base URL of WEB service if console domain is
- # different from api or web app domain.
- # example: http://cloud.dify.ai/console/api
- NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api
- # The URL for Web APP, refers to the Web App base URL of WEB service if web app domain is different from
- # console or api domain.
- # example: http://udify.app/api
- NEXT_PUBLIC_PUBLIC_API_PREFIX=http://localhost:5001/api
-
- # SENTRY
- NEXT_PUBLIC_SENTRY_DSN=
- NEXT_PUBLIC_SENTRY_ORG=
- NEXT_PUBLIC_SENTRY_PROJECT=
- ```
-
-4. Build the code
-
- ```
- npm run build
- ```
-
-5. Start the web service
-
- ```
- npm run start
- # or
- yarn start
- # or
- pnpm start
- ```
-
-After successful startup, the terminal will output the following information:
-
-```
-ready - started server on 0.0.0.0:3000, url: http://localhost:3000
-warn - You have enabled experimental feature (appDir) in next.config.js.
-warn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
-info - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
-```
-
-### Access Dify
-
-Finally, access [http://127.0.0.1:3000](http://127.0.0.1:3000/) to use the locally deployed Dify.
diff --git a/en-us/getting-started/readme/features-and-specifications.md b/en-us/getting-started/readme/features-and-specifications.md
deleted file mode 100644
index 07cdbe5d..00000000
--- a/en-us/getting-started/readme/features-and-specifications.md
+++ /dev/null
@@ -1,17 +0,0 @@
----
-description: >-
- For those already familiar with LLM application tech stacks, this document
- serves as a shortcut to understand Dify's unique advantages
----
-
-# Features and Specifications
-
-We adopt transparent policies around product specifications to ensure decisions are made based on complete understanding. Such transparency not only benefits your technical selection, but also promotes deeper comprehension within the community for active contributions.
-
-### Project Basics
-
-
-| Feature | Specification |
-| -------- | -------- |
-| OpenAI Interface Standard Model Integrations Supported | ∞ |
-| Multimodal Capabilities | ASR models; rich-text models up to GPT-4o specs |
-| Built-in App Types | Text generation, Chatbot, Agent, Workflow, Chatflow |
-| Prompt-as-a-Service Orchestration | Visual orchestration interface widely praised; modify prompts and preview effects in one place |
-| Orchestration Modes | Simple orchestration, Assistant orchestration, Flow orchestration |
-| Prompt Variable Types | String, Radio enum, External API, File (Q3 2024) |
-| Agentic Workflow Features | Industry-leading visual workflow orchestration interface, live-editing node debugging, modular DSL, and native code runtime, designed for building more complex, reliable, and stable LLM applications |
-| Supported Nodes | LLM, Knowledge Retrieval, Question Classifier, IF/ELSE, CODE, Template, HTTP Request, Tool |
-| RAG Features | Industry-first visual knowledge base management interface, supporting snippet previews and recall testing |
-| Indexing Methods | Keywords, text vectors, LLM-assisted question-snippet model |
-| Retrieval Methods | Keywords, text similarity matching, hybrid search, N choose 1 (legacy), multi-path retrieval |
-| Recall Optimization | Rerank models |
-| ETL Capabilities | Automated cleaning for TXT, Markdown, PDF, HTML, DOC, and CSV formats; the Unstructured service enables maximum support. Sync Notion docs and webpages as knowledge bases |
-| Annotated Replies | Based on human-annotated Q&As, used for similarity-based replies; exportable as a data format for model fine-tuning |
-| Content Moderation | OpenAI Moderation or external APIs |
-| Team Collaboration | Workspaces, multi-member management |
-| API Specs | RESTful, most features covered |
-| Deployment Methods | Docker, Helm |
diff --git a/en-us/getting-started/readme/model-providers.md b/en-us/getting-started/readme/model-providers.md
deleted file mode 100644
index 57f832a2..00000000
--- a/en-us/getting-started/readme/model-providers.md
+++ /dev/null
@@ -1,393 +0,0 @@
-# List of Model Providers
-
-Dify supports the following model providers out of the box:
-
-| Provider | LLM | Text Embedding | Rerank | Speech to text | TTS |
-| -------- | -------- | -------- | -------- | -------- | -------- |
-| OpenAI | ✔️(🛠️)(👓) | ✔️ | | ✔️ | ✔️ |
-| Anthropic | ✔️(🛠️) | | | | |
-| Azure OpenAI | ✔️(🛠️)(👓) | ✔️ | | ✔️ | ✔️ |
-| Gemini | ✔️ | | | | |
-| Google Cloud | ✔️(👓) | ✔️ | | | |
-| Nvidia API Catalog | ✔️ | ✔️ | ✔️ | | |
-| Nvidia NIM | ✔️ | | | | |
-| Nvidia Triton Inference Server | ✔️ | | | | |
-| AWS Bedrock | ✔️ | ✔️ | | | |
-| OpenRouter | ✔️ | | | | |
-| Cohere | ✔️ | ✔️ | ✔️ | | |
-| together.ai | ✔️ | | | | |
-| Ollama | ✔️ | ✔️ | | | |
-| Mistral AI | ✔️ | | | | |
-| groqcloud | ✔️ | | | | |
-| Replicate | ✔️ | ✔️ | | | |
-| Hugging Face | ✔️ | ✔️ | | | |
-| Xorbits inference | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
-| Zhipu AI | ✔️(🛠️)(👓) | ✔️ | | | |
-| Baichuan | ✔️ | ✔️ | | | |
-| Spark | ✔️ | | | | |
-| Minimax | ✔️(🛠️) | ✔️ | | | |
-| Tongyi | ✔️ | ✔️ | | | ✔️ |
-| Wenxin | ✔️ | ✔️ | | | |
-| Moonshot AI | ✔️(🛠️) | | | | |
-| Tencent Cloud | | | | ✔️ | |
-| Stepfun | ✔️(🛠️)(👓) | | | | |
-| VolcanoEngine | ✔️ | ✔️ | | | |
-| 01.AI | ✔️ | | | | |
-| 360 Zhinao | ✔️ | | | | |
-| Azure AI Studio | ✔️ | | ✔️ | | |
-| deepseek | ✔️(🛠️) | | | | |
-| Tencent Hunyuan | ✔️ | | | | |
-| SILICONFLOW | ✔️ | ✔️ | | | |
-| Jina AI | | ✔️ | ✔️ | | |
-| ChatGLM | ✔️ | | | | |
-| Xinference | ✔️(🛠️)(👓) | ✔️ | ✔️ | | |
-| OpenLLM | ✔️ | ✔️ | | | |
-| LocalAI | ✔️ | ✔️ | ✔️ | ✔️ | |
-| OpenAI API-Compatible | ✔️ | ✔️ | | ✔️ | |
-| PerfXCloud | ✔️ | ✔️ | | | |
-| Lepton AI | ✔️ | | | | |
-| novita.ai | ✔️ | | | | |
-| Amazon Sagemaker | ✔️ | ✔️ | ✔️ | | |
-| Text Embedding Inference | | ✔️ | ✔️ | | |
-| GPUStack | ✔️(🛠️)(👓) | ✔️ | ✔️ | | |
-
-where (🛠️) denotes "function calling" and (👓) denotes "support for vision".
-
----
-
-This table is continuously updated. We also keep track of model providers requested by community members [here](https://github.com/langgenius/dify/discussions/categories/ideas). If you'd like to see a model provider not listed above, please consider contributing by making a PR. To learn more, check out our [contribution.md](../../community/contribution.md "mention") Guide.
diff --git a/en-us/getting-started/cloud.md b/en/getting-started/cloud.mdx
similarity index 97%
rename from en-us/getting-started/cloud.md
rename to en/getting-started/cloud.mdx
index 6c02a275..0b864e68 100644
--- a/en-us/getting-started/cloud.md
+++ b/en/getting-started/cloud.mdx
@@ -1,10 +1,10 @@
-# Dify Cloud
+---
+title: Dify Cloud
+---
-
-
-{% hint style="info" %}
+
Note: Dify is currently in the Beta testing phase. If there are inconsistencies between the documentation and the product, please refer to the actual product experience.
-{% endhint %}
+
Dify can be used [out-of-the-box](https://cloud.dify.ai/apps) as a cloud service by anyone. Explore the flexible [Plans and Pricing](https://dify.ai/pricing) and select the plan that best suits your needs and requirements.
diff --git a/en-us/getting-started/dify-premium-on-aws.md b/en/getting-started/dify-premium-on-aws.mdx
similarity index 97%
rename from en-us/getting-started/dify-premium-on-aws.md
rename to en/getting-started/dify-premium-on-aws.mdx
index 6c8b986c..4201b9bc 100644
--- a/en-us/getting-started/dify-premium-on-aws.md
+++ b/en/getting-started/dify-premium-on-aws.mdx
@@ -1,4 +1,6 @@
-# Dify Premium on AWS
+---
+title: Dify Premium on AWS
+---
Dify Premium is our AWS AMI offering that allows custom branding and is one-click deployable to your AWS VPC as an EC2 instance. Head to [AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) to subscribe. It's useful in a couple of scenarios:
@@ -27,9 +29,7 @@ docker-compose -f docker-compose.yaml -f docker-compose.override.yaml up -d
> To upgrade to version v1.0.0, please refer to [Migrating Community Edition to v1.0.0](https://docs.dify.ai/development/migration/migrate-to-v1).
-
-
-Upgrading Community Edition to v1.0.0
+
The upgrade process involves the following steps:
@@ -98,13 +98,14 @@ poetry run flask install-plugins --workers=2
```
This command will download and install all necessary plugins into the latest Community Edition. When the terminal shows `Install plugins completed.`, the migration is complete.
-
+
+
## Customizing
Just like self-hosted deploy, you may modify the environment variables under `.env` in your EC2 instance as you see fit. Then, restart Dify with:
-```
+```bash
docker-compose down
docker-compose -f docker-compose.yaml -f docker-compose.override.yaml up -d
```
diff --git a/en/getting-started/install-self-hosted/bt-panel.mdx b/en/getting-started/install-self-hosted/bt-panel.mdx
new file mode 100644
index 00000000..32c826f7
--- /dev/null
+++ b/en/getting-started/install-self-hosted/bt-panel.mdx
@@ -0,0 +1,75 @@
+---
+title: Deploy with aaPanel
+---
+
+
+## Prerequisites
+
+> Before installing Dify, make sure your machine meets the following minimum system requirements:
+>
+> * CPU >= 2 Core
+> * RAM >= 4 GiB
+
+
+## Deployment
+
+1. Log in to aaPanel and click `Docker` in the menu bar
+
+2. The first time you will be prompted to install the `Docker` and `Docker Compose` services, click Install Now. If it is already installed, please ignore it.
+
+3. After the installation is complete, find `Dify` in `One-Click Install` and click `install`
+
+4. Configure basic information such as the domain name and ports to complete the installation
+> \[!IMPORTANT]
+>
+> The domain name is optional. If a domain name is filled in, the application can be managed through [Website] --> [Proxy Project], and you do not need to check [Allow external access]; otherwise, you must check it to access the application through the port
+
+5. After installation, enter the domain name or IP + port set in the previous step in the browser to access.
+- Name: application name, default `Dify-characters`
+- Version selection: default `latest`
+- Domain name: If you need to access directly through the domain name, please configure the domain name here and resolve the domain name to the server
+- Allow external access: If you need direct access through `IP+Port`, please check. If you have set up a domain name, please do not check here.
+- Port: Default `8088`, can be modified by yourself
+
+
+6. After submission, the panel will automatically initialize the application, which will take about `1-3` minutes. It can be accessed after the initialization is completed.
+
+### Access Dify
+
+Access the administrator initialization page to set up the admin account:
+
+```bash
+# If you have set domain
+http://yourdomain/install
+
+# If you choose to access through `IP+Port`
+http://your_server_ip:8088/install
+```
+
+Dify web interface address:
+
+```bash
+# If you have set domain
+http://yourdomain/
+
+# If you choose to access through `IP+Port`
+http://your_server_ip:8088/
+```
\ No newline at end of file
diff --git a/en/getting-started/install-self-hosted/docker-compose.mdx b/en/getting-started/install-self-hosted/docker-compose.mdx
new file mode 100644
index 00000000..ea0a8cd3
--- /dev/null
+++ b/en/getting-started/install-self-hosted/docker-compose.mdx
@@ -0,0 +1,133 @@
+---
+title: Deploy with Docker Compose
+---
+
+
+## Prerequisites
+
+> Before installing Dify, make sure your machine meets the following minimum system requirements:
+>
+> * CPU >= 2 Core
+> * RAM >= 4 GiB
+
+
+| Operating System | Software | Explanation |
+| --- | --- | --- |
+| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the Docker Desktop installation guide for Mac. |
+
+> \[!IMPORTANT]
+>
+> Dify 0.6.12 has introduced significant enhancements to Docker Compose deployment, designed to improve your setup and update experience. For more information, read the [README.md](https://github.com/langgenius/dify/blob/main/docker/README.md).
+
+### Clone Dify
+
+Clone the Dify source code to your local machine:
+
+```bash
+# Assuming current latest version is 0.15.3
+git clone https://github.com/langgenius/dify.git --branch 0.15.3
+```
+
+### Starting Dify
+
+1. Navigate to the Docker directory in the Dify source code
+
+ ```bash
+ cd dify/docker
+ ```
+2. Copy the environment configuration file
+
+   ```bash
+   cp .env.example .env
+   ```
+
+3. Start the Docker containers
+
+   Choose the appropriate command to start the containers based on the Docker Compose version on your system. You can use the `docker compose version` command to check the version, and refer to the [Docker documentation](https://docs.docker.com/compose/install/) for more information:
+
+   * If you have Docker Compose V2, use the following command:
+
+   ```bash
+   docker compose up -d
+   ```
+
+   * If you have Docker Compose V1, use the following command:
+
+   ```bash
+   docker-compose up -d
+   ```
+
+After executing the command, you should see output similar to the following, showing the status and port mappings of all containers:
+
+```bash
+[+] Running 11/11
+ ✔ Network docker_ssrf_proxy_network  Created
+...
+```
+
+Finally, check that all containers are running normally, including the dependent components `weaviate / db / redis / nginx / ssrf_proxy / sandbox`:
+
+```bash
+docker compose ps
+```
+
+```bash
+NAME                  IMAGE                              COMMAND                   SERVICE      CREATED              STATUS                        PORTS
+docker-api-1          langgenius/dify-api:0.6.13         "/bin/bash /entrypoi…"    api          About a minute ago   Up About a minute             5001/tcp
+docker-db-1           postgres:15-alpine                 "docker-entrypoint.s…"    db           About a minute ago   Up About a minute (healthy)   5432/tcp
+docker-nginx-1        nginx:latest                       "sh -c 'cp /docker-e…"    nginx        About a minute ago   Up About a minute             0.0.0.0:80->80/tcp, :::80->80/tcp, 0.0.0.0:443->443/tcp, :::443->443/tcp
+docker-redis-1        redis:6-alpine                     "docker-entrypoint.s…"    redis        About a minute ago   Up About a minute (healthy)   6379/tcp
+docker-sandbox-1      langgenius/dify-sandbox:0.2.1      "/main"                   sandbox      About a minute ago   Up About a minute
+docker-ssrf_proxy-1   ubuntu/squid:latest                "sh -c 'cp /docker-e…"    ssrf_proxy   About a minute ago   Up About a minute             3128/tcp
+docker-weaviate-1     semitechnologies/weaviate:1.19.0   "/bin/weaviate --hos…"    weaviate     About a minute ago   Up About a minute
+docker-web-1          langgenius/dify-web:0.6.13         "/bin/sh ./entrypoin…"    web          About a minute ago   Up About a minute             3000/tcp
+docker-worker-1       langgenius/dify-api:0.6.13         "/bin/bash /entrypoi…"    worker       About a minute ago   Up About a minute             5001/tcp
+```
+
+### Upgrade Dify
+
+```bash
+cd dify/docker
+docker compose down
+git pull origin main
+docker compose pull
+docker compose up -d
+```
+
+### Access Dify
+
+Access the administrator initialization page to set up the admin account:
+
+```bash
+# Local environment
+http://localhost/install
+
+# Server environment
+http://your_server_ip/install
+```
\ No newline at end of file
diff --git a/en-us/getting-started/install-self-hosted/environments.md b/en/getting-started/install-self-hosted/environments.mdx
similarity index 79%
rename from en-us/getting-started/install-self-hosted/environments.md
rename to en/getting-started/install-self-hosted/environments.mdx
index 35d270da..22821a2d 100644
--- a/en-us/getting-started/install-self-hosted/environments.md
+++ b/en/getting-started/install-self-hosted/environments.mdx
@@ -1,4 +1,7 @@
-# Environments
+---
+title: Environments
+---
+
### Common Variables
@@ -10,11 +13,11 @@ The backend URL for the console API. This is used to construct the authorization
The front-end URL of the console web interface. This is used to construct front-end addresses and for CORS configuration. If left empty, it defaults to the same domain as the application. Example: `https://console.dify.ai`
-#### SERVICE_API_URL
+## SERVICE_API_URL
The Service API URL, used to display Service API Base URL in the front-end. If left empty, it defaults to the same domain as the application. Example: `https://api.dify.ai`
-#### APP_API_URL
+## APP_API_URL
The WebApp API backend URL, used to specify the backend URL for the front-end API. If left empty, it defaults to the same domain as the application. Example: `https://app.dify.ai`
@@ -54,9 +57,7 @@ Flask debug mode: When enabled, it outputs trace information in the API response
A secret key used for securely signing session cookies and encrypting sensitive information in the database.
-This variable must be set before the first launch.
-
-Run `openssl rand -base64 42` to generate a strong key for it.
+This variable must be set before the first launch.
+
+Run `openssl rand -base64 42` to generate a strong key for it.
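As a quick sketch (variable name aside, where and how you store the key is up to your deployment), you can generate a key and inspect it like this:

```shell
# Generate a strong secret key; openssl's -base64 output for 42 random bytes
# is a single 56-character line with no padding.
SECRET_KEY="$(openssl rand -base64 42)"
echo "Generated key length: ${#SECRET_KEY}"
```

Paste the resulting value into the `SECRET_KEY` entry of your `.env` file before the first launch.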
#### DEPLOY_ENV
@@ -78,11 +79,11 @@ The log output level. Default is INFO. For production environments, it's recomme
When set to true, database migrations are automatically executed on container startup. This is only available when launched using docker and does not apply when running from source code.
-For source code launches, you need to manually run `flask db upgrade` in the api directory.
+For source code launches, you need to manually run `flask db upgrade` in the api directory.
#### CHECK_UPDATE_URL
-Controls the version checking policy. If set to false, the system will not call `https://updates.dify.ai` to check for updates.
+Controls the version checking policy. If set to false, the system will not call `https://updates.dify.ai` to check for updates.
Currently, the version check interface based on CloudFlare Worker is not directly accessible in China. Setting this variable to an empty value will disable this API call.
@@ -110,7 +111,7 @@ Only effective when starting with docker image or docker-compose.
- SERVER_WORKER_AMOUNT
- The number of API server workers, i.e., the number of gevent workers. Formula: `number of cpu cores x 2 + 1`
+  The number of API server workers, i.e., the number of gevent workers. Formula: `number of cpu cores x 2 + 1`
Reference: [https://docs.gunicorn.org/en/stable/design.html#how-many-workers](https://docs.gunicorn.org/en/stable/design.html#how-many-workers)
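A minimal sketch of applying the formula (using `nproc`, which assumes a Linux host; on macOS use `sysctl -n hw.ncpu` instead):

```shell
# Compute a worker count from the "cpu cores x 2 + 1" formula above.
CORES="$(nproc)"
WORKERS=$(( CORES * 2 + 1 ))
echo "SERVER_WORKER_AMOUNT=${WORKERS}"
```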
@@ -124,7 +125,7 @@ Only effective when starting with docker image or docker-compose.
- CELERY_WORKER_CLASS
- Similar to `SERVER_WORKER_CLASS`. Default is gevent. If using windows, it can be switched to sync or solo.
+  Similar to `SERVER_WORKER_CLASS`. Default is gevent. If using windows, it can be switched to sync or solo.
- CELERY_WORKER_AMOUNT
@@ -154,7 +155,7 @@ This Redis configuration is used for caching and for pub/sub during conversation
- REDIS_PASSWORD: Redis password, default is empty. It is strongly recommended to set a password.
- REDIS_USE_SSL: Whether to use SSL protocol for connection, default is false
- REDIS_USE_SENTINEL: Use Redis Sentinel to connect to Redis servers
-- REDIS_SENTINELS: Sentinel nodes, format: `:,:,:`
+- REDIS_SENTINELS: Sentinel nodes, format: `<sentinel1_ip>:<sentinel1_port>,<sentinel2_ip>:<sentinel2_port>,<sentinel3_ip>:<sentinel3_port>`
- REDIS_SENTINEL_SERVICE_NAME: Sentinel service name, same as Master Name
- REDIS_SENTINEL_USERNAME: Username for Sentinel
- REDIS_SENTINEL_PASSWORD: Password for Sentinel
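Put together, a hedged sketch of the Sentinel-related entries in `.env` (hostnames, ports, master name, and credentials are all illustrative):

```
REDIS_USE_SENTINEL=true
REDIS_SENTINELS=sentinel-1:26379,sentinel-2:26379,sentinel-3:26379
REDIS_SENTINEL_SERVICE_NAME=mymaster
REDIS_SENTINEL_USERNAME=default
REDIS_SENTINEL_PASSWORD=difyai123456
```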
@@ -173,17 +174,6 @@ This Redis configuration is used for caching and for pub/sub during conversation
Example: `redis://:difyai123456@redis:6379/1`
- Sentinel mode:
-
- ```
- sentinel://:@:/
- ```
-
- Example: `sentinel://localhost:26379/1;sentinel://localhost:26380/1;sentinel://localhost:26381/1`
-
-- BROKER_USE_SSL
-
- If set to true, use SSL protocol for connection, default is false
- CELERY_USE_SENTINEL
@@ -209,9 +199,9 @@ Used to set the front-end cross-domain access policy.
WebAPP CORS cross-domain policy, default is `*`, that is, all domains can access.
-#### File Storage Configuration
+#### File Storage Configuration
-Used to store uploaded data set files, team/tenant encryption keys, and other files.
+Used to store uploaded data set files, team/tenant encryption keys, and other files.
- STORAGE_TYPE
@@ -223,7 +213,7 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
- s3
- S3 object storage, if this option is selected, the following S3\_ prefixed configurations need to be set.
+  S3 object storage, if this option is selected, the following S3\_ prefixed configurations need to be set.
- azure-blob
@@ -241,7 +231,7 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
Default is storage, that is, it is stored in the storage directory of the current directory.
- If you are deploying with docker or docker-compose, be sure to mount the `/app/api/storage` directory in both containers to the same local directory, otherwise, you may encounter file not found errors.
+  If you are deploying with docker or docker-compose, be sure to mount the `/app/api/storage` directory in both containers to the same local directory, otherwise, you may encounter file not found errors.
- S3_ENDPOINT: S3 endpoint address
- S3_BUCKET_NAME: S3 bucket name
@@ -277,13 +267,10 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
- `qdrant`
- `milvus`
- `zilliz` (share the same configuration as `milvus`)
- - `myscale`
- - `pinecone` (not yet open)
- - `analyticdb`
+  - `myscale`
+  - `pinecone` (not yet open)
+  - `analyticdb`
- `couchbase`
-- WEAVIATE_ENDPOINT
-
- Weaviate endpoint address, such as: `http://weaviate:8080`.
+- WEAVIATE_ENDPOINT
+
+  Weaviate endpoint address, such as: `http://weaviate:8080`.
- WEAVIATE_API_KEY
@@ -297,11 +284,7 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
- WEAVIATE_GRPC_ENABLED
- Whether to use the gRPC method to interact with Weaviate, performance will greatly increase when enabled, may not be usable locally, default is true.
-
-- QDRANT_URL
-
- Qdrant endpoint address, such as: `https://your-qdrant-cluster-url.qdrant.tech/`
+  Whether to use the gRPC method to interact with Weaviate, performance will greatly increase when enabled, may not be usable locally, default is true.
+
+- QDRANT_URL
+
+  Qdrant endpoint address, such as: `https://your-qdrant-cluster-url.qdrant.tech/`
- QDRANT_API_KEY
@@ -313,11 +296,9 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
- PINECONE_ENVIRONMENT
- The environment where Pinecone is located, such as: `us-east4-gcp`
+  The environment where Pinecone is located, such as: `us-east4-gcp`
-- MILVUS_URI
-
- Milvus uri configuration. e.g. `http://host.docker.internal:19530`. For [Zilliz Cloud](https://docs.zilliz.com/docs/free-trials), adjust the uri and token to the Public Endpoint and API Key.
+- MILVUS_URI
+
+  Milvus uri configuration. e.g. `http://host.docker.internal:19530`. For [Zilliz Cloud](https://docs.zilliz.com/docs/free-trials), adjust the uri and token to the Public Endpoint and API Key.
- MILVUS_TOKEN
@@ -341,15 +322,13 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
- MYSCALE_USER
- MyScale user configuration, default is `default`.
+  MyScale user configuration, default is `default`.
- MYSCALE_PASSWORD
MyScale password configuration, default is empty.
-- MYSCALE_DATABASE
-
- MyScale database configuration, default is `default`.
+- MYSCALE_DATABASE
+
+  MyScale database configuration, default is `default`.
- MYSCALE_FTS_PARAMS
@@ -363,13 +342,9 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
The access key secret used for Aliyun OpenAPI authentication.
-- ANALYTICDB_INSTANCE_ID
+- ANALYTICDB_INSTANCE_ID
+
+  The unique identifier for your AnalyticDB instance, such as: `gp-xxxxxx`. Read the [Analyticdb documentation](https://help.aliyun.com/zh/analyticdb/analyticdb-for-postgresql/getting-started/create-an-instance-1) to create your instance.
- The unique identifier for your AnalyticDB instance, such as : `gp-xxxxxx`. Read the [Analyticdb documentation](https://help.aliyun.com/zh/analyticdb/analyticdb-for-postgresql/getting-started/create-an-instance-1) to create your instance.
-
-- ANALYTICDB_REGION_ID
-
- The region identifier where the AnalyticDB instance is located, such as: `cn-hangzhou`.
+- ANALYTICDB_REGION_ID
+
+  The region identifier where the AnalyticDB instance is located, such as: `cn-hangzhou`.
- ANALYTICDB_ACCOUNT
@@ -379,9 +354,7 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
The password for the account used to connect to the AnalyticDB instance.
-- ANALYTICDB_NAMESPACE
-
- The namespace(schema) within the AnalyticDB instance that you wish to interact with, such as `dify`. If this namespace does not exist, it will be created automatically.
+- ANALYTICDB_NAMESPACE
+
+  The namespace(schema) within the AnalyticDB instance that you wish to interact with, such as `dify`. If this namespace does not exist, it will be created automatically.
- ANALYTICDB_NAMESPACE_PASSWORD
@@ -429,17 +402,13 @@ Used to store uploaded data set files, team/tenant encryption keys, and other fi
Unstructured.io file extraction scheme
-- UNSTRUCTURED_API_URL
-
- Unstructured API path, needs to be configured when ETL_TYPE is Unstructured.
+- UNSTRUCTURED_API_URL
+
+  Unstructured API path, needs to be configured when ETL_TYPE is Unstructured.
For example: `http://unstructured:8000/general/v0/general`
#### Multi-modal Configuration
-- MULTIMODAL_SEND_IMAGE_FORMAT
-
- The format of the image sent when the multi-modal model is input, the default is `base64`, optional `url`. The delay of the call in `url` mode will be lower than that in `base64` mode. It is generally recommended to use the more compatible `base64` mode. If configured as `url`, you need to configure `FILES_URL` as an externally accessible address so that the multi-modal model can access the image.
+- MULTIMODAL_SEND_IMAGE_FORMAT
+
+  The format of the image sent when the multi-modal model is input, the default is `base64`, optional `url`. The delay of the call in `url` mode will be lower than that in `base64` mode. It is generally recommended to use the more compatible `base64` mode. If configured as `url`, you need to configure `FILES_URL` as an externally accessible address so that the multi-modal model can access the image.
- UPLOAD_IMAGE_FILE_SIZE_LIMIT
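The difference between the two modes can be sketched as follows (file name and payload are made up for illustration): `base64` mode inlines the image bytes directly into the request, so no externally reachable `FILES_URL` is needed, at the cost of a larger payload.

```shell
# Illustrative only: encode a local file the way base64 mode embeds image data.
printf 'fake-image-bytes' > /tmp/demo.png
IMG_B64="$(base64 < /tmp/demo.png | tr -d '\n')"
echo "data:image/png;base64,${IMG_B64}"   # → data:image/png;base64,ZmFrZS1pbWFnZS1ieXRlcw==
```

In `url` mode, only a link such as `${FILES_URL}/files/...` is sent instead, so the model provider must be able to reach that address over the network.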
@@ -466,7 +435,7 @@ Used for application monitoring and error log tracking.
Notion integration configuration variables can be obtained by applying for Notion integration: [https://www.notion.so/my-integrations](https://www.notion.so/my-integrations)
- NOTION_INTEGRATION_TYPE: Configure as "public" or "internal". Since Notion's OAuth redirect URL only supports HTTPS, if deploying locally, please use Notion's internal integration.
-- NOTION_CLIENT_SECRET: Notion OAuth client secret (used for public integration type)
+- NOTION_CLIENT_SECRET: Notion OAuth client secret (used for public integration type)
- NOTION_CLIENT_ID: OAuth client ID (used for public integration type)
- NOTION_INTERNAL_SECRET: Notion internal integration secret. If the value of `NOTION_INTEGRATION_TYPE` is "internal", you need to configure this variable.
@@ -496,7 +465,7 @@ Notion integration configuration variables can be obtained by applying for Notio
Used to specify the model providers and tools that can be used in the app. These settings allow you to customize which tools and model providers are available, as well as their order and inclusion/exclusion in the app's interface.
-For a list of available [tools](https://github.com/langgenius/dify/blob/main/api/core/tools/provider/_position.yaml) and [model providers](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/model_providers/_position.yaml), please refer to the provided links.
+For a list of available [tools](https://github.com/langgenius/dify/blob/main/api/core/tools/provider/_position.yaml) and [model providers](https://github.com/langgenius/dify/blob/main/api/core/model_runtime/model_providers/_position.yaml), please refer to the provided links.
- POSITION_TOOL_PINS
@@ -506,31 +475,29 @@ For a list of available [tools](https://github.com/langgenius/dify/blob/main/api
- POSITION_TOOL_INCLUDES
- Specify the tools to be included in the app. Only the tools listed here will be available for use. If not set, all tools will be included unless specified in POSITION_TOOL_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
+  Specify the tools to be included in the app. Only the tools listed here will be available for use. If not set, all tools will be included unless specified in POSITION_TOOL_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_TOOL_INCLUDES=bing,google`
- POSITION_TOOL_EXCLUDES
- Exclude specific tools from being displayed or used in the app. Tools listed here will be omitted from the available options, except for pinned tools. (Use comma-separated values with **no spaces** between items.)
+  Exclude specific tools from being displayed or used in the app. Tools listed here will be omitted from the available options, except for pinned tools. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_TOOL_EXCLUDES=yahoo,wolframalpha`
-- POSITION_PROVIDER_PINS
-
- Pin specific model providers to the top of the list, ensuring they appear first in the interface. (Use comma-separated values with **no spaces** between items.)
+- POSITION_PROVIDER_PINS
+
+  Pin specific model providers to the top of the list, ensuring they appear first in the interface. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_PINS=openai,openllm`
- POSITION_PROVIDER_INCLUDES
- Specify the model providers to be included in the app. Only the providers listed here will be available for use. If not set, all providers will be included unless specified in POSITION_PROVIDER_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
+  Specify the model providers to be included in the app. Only the providers listed here will be available for use. If not set, all providers will be included unless specified in POSITION_PROVIDER_EXCLUDES. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_INCLUDES=cohere,upstage`
- POSITION_PROVIDER_EXCLUDES
- Exclude specific model providers from being displayed or used in the app. Providers listed here will be omitted from the available options, except for pinned providers. (Use comma-separated values with **no spaces** between items.)
+  Exclude specific model providers from being displayed or used in the app. Providers listed here will be omitted from the available options, except for pinned providers. (Use comma-separated values with **no spaces** between items.)
Example: `POSITION_PROVIDER_EXCLUDES=openrouter,ollama`
@@ -546,25 +513,23 @@ For a list of available [tools](https://github.com/langgenius/dify/blob/main/api
#### SENTRY_DSN
-Sentry DSN address, default is empty, when empty, all monitoring information is not reported to Sentry.
+Sentry DSN address, default is empty, when empty, all monitoring information is not reported to Sentry.
## Deprecated
#### CONSOLE_URL
-> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by: `CONSOLE_API_URL` and `CONSOLE_WEB_URL`.
-
-Console URL, used to concatenate the authorization callback, console front-end address, and CORS configuration use. If empty, it is the same domain. Example: `https://console.dify.ai`.
+> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by: `CONSOLE_API_URL` and `CONSOLE_WEB_URL`.
+
+Console URL, used to concatenate the authorization callback, console front-end address, and CORS configuration use. If empty, it is the same domain. Example: `https://console.dify.ai`.
#### API_URL
-> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `SERVICE_API_URL`.
+> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `SERVICE_API_URL`.
-API URL, used to display Service API Base URL to the front-end. If empty, it is the same domain. Example: `https://api.dify.ai`
+API URL, used to display Service API Base URL to the front-end. If empty, it is the same domain. Example: `https://api.dify.ai`
#### APP_URL
-> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `APP_API_URL` and `APP_WEB_URL`.
+> ⚠️ Modified in 0.3.8, will be deprecated in 0.4.9, replaced by `APP_API_URL` and `APP_WEB_URL`.
WebApp Url, used to display WebAPP API Base Url to the front-end. If empty, it is the same domain. Example: `https://udify.app/`
diff --git a/en-us/getting-started/install-self-hosted/faqs.md b/en/getting-started/install-self-hosted/faqs.mdx
similarity index 76%
rename from en-us/getting-started/install-self-hosted/faqs.md
rename to en/getting-started/install-self-hosted/faqs.mdx
index 23a94b42..deef7567 100644
--- a/en-us/getting-started/install-self-hosted/faqs.md
+++ b/en/getting-started/install-self-hosted/faqs.mdx
@@ -1,8 +1,11 @@
-# FAQs
+---
+title: FAQs
+---
+
### 1. Not receiving reset password emails
-You need to configure the `Mail` parameters in the `.env` file. For detailed instructions, please refer to ["Environment Variables Explanation: Mail-related configuration"](https://docs.dify.ai/getting-started/install-self-hosted/environments#mail-related-configuration).
+You need to configure the `Mail` parameters in the `.env` file. For detailed instructions, please refer to ["Environment Variables Explanation: Mail-related configuration"](https://docs.dify.ai/getting-started/install-self-hosted/environments#mail-related-configuration).
After modifying the configuration, run the following commands to restart the service:
@@ -15,23 +18,24 @@ If you still haven't received the email, please check if the email service is wo
### 2. How to handle if the workflow is too complex and exceeds the node limit?
-In the community edition, you can manually adjust the MAX\_TREE\_DEPTH limit for single branch depth in `web/app/components/workflow/constants.ts.` Our default value is 50, and it's important to note that excessively deep branches may affect performance in self-hosted scenarios.
+In the community edition, you can manually adjust the MAX\_TREE\_DEPTH limit for single branch depth in `web/app/components/workflow/constants.ts`. Our default value is 50, and it's important to note that excessively deep branches may affect performance in self-hosted scenarios.
### 3. How to specify the runtime for each workflow node?
-
-You can modify the `TEXT_GENERATION_TIMEOUT_MS` variable in the `.env` file to adjust the runtime for each node. This helps prevent overall application service unavailability caused by certain processes timing out.
+You can modify the `TEXT_GENERATION_TIMEOUT_MS` variable in the `.env` file to adjust the runtime for each node. This helps prevent overall application service unavailability caused by certain processes timing out.
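For example, an illustrative `.env` entry (the value is an assumption, not a recommended default) that allows each node up to 5 minutes:

```
TEXT_GENERATION_TIMEOUT_MS=300000
```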
### 4. How to reset the password of the admin account?
-If you deployed using Docker Compose, you can reset the password with the following command while your Docker Compose is running:
-
-```
+If you deployed using Docker Compose, you can reset the password with the following command while your Docker Compose is running:
+
+```bash
docker exec -it docker-api-1 flask reset-password
+```
+
+It will prompt you to enter the email address and the new password. Example:
+
+```
dify@my-pc:~/hello/dify/docker$ docker compose up -d
[+] Running 9/9
✔ Container docker-web-1 Started 0.1s
@@ -51,13 +55,7 @@ Email: hello@dify.ai
New password: newpassword4567
Password confirm: newpassword4567
Password reset successfully.
 ```
 
 ### 5. How to Change the Port
 
 If you're using Docker Compose, you can customize the access port by modifying the `.env` configuration file.
 
 You need to modify the Nginx configuration:
 
```json
EXPOSE_NGINX_PORT=80
@@ -65,4 +63,14 @@ EXPOSE_NGINX_SSL_PORT=443
```
-Other self-host issue please check this document [Self-Host Related](../../learn-more/faq/install-faq.md)。
+
+For other self-host issues, please check this document: [Self-Host Related](../../learn-more/faq/install-faq.md).
\ No newline at end of file
diff --git a/en/getting-started/install-self-hosted/install-faq.mdx b/en/getting-started/install-self-hosted/install-faq.mdx
new file mode 100644
index 00000000..f0ccc5ea
--- /dev/null
+++ b/en/getting-started/install-self-hosted/install-faq.mdx
@@ -0,0 +1,347 @@
+---
+title: FAQ
+---
+
+
+### 1. How to reset the password if the local deployment initialization fails with an incorrect password?
+
+If deployed using docker compose, you can execute the following command to reset the password: `docker exec -it docker-api-1 flask reset-password`. Enter the account email and the new password twice, and it will be reset.
+
+### 2. How to resolve File not found error in the log when deploying locally?
+
+```
+ERROR:root:Unknown Error in completion
+Traceback (most recent call last):
+ File "/www/wwwroot/dify/dify/api/libs/rsa.py", line 45, in decrypt
+ private_key = storage.load(filepath)
+ File "/www/wwwroot/dify/dify/api/extensions/ext_storage.py", line 65, in load
+ raise FileNotFoundError("File not found")
+FileNotFoundError: File not found
+```
+
+This error may be caused by switching deployment methods, or by deleting the `api/storage/privkeys` file. This file is used to encrypt large model keys, and the encryption cannot be reversed if it is lost. You can reset the encryption public and private keys with the following commands:
+
+* Docker compose deployment
+
+```
+docker exec -it docker-api-1 flask reset-encrypt-key-pair
+```
+
+* Source code startup
+
+Enter the api directory
+
+```
+flask reset-encrypt-key-pair
+```
+Follow the prompts to reset.
+
+### 3. Unable to log in after installation, or login succeeds but subsequent interface calls return 401?
+
+This may be due to switching the domain name or URL, resulting in cross-domain issues between the front-end and the server-side. Cross-domain and identity involve two configuration items:
+
+**CORS cross-domain configuration**
+
+1. `CONSOLE_CORS_ALLOW_ORIGINS` : Console CORS cross-domain strategy, defaults to `*`, which allows access from all domain names.
+2. `WEB_API_CORS_ALLOW_ORIGINS` : WebAPP CORS cross-domain strategy, defaults to `*`, which allows access from all domain names.
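+For example, to lock both down to specific origins instead of `*`; the domain names below are hypothetical:
+
+```
+CONSOLE_CORS_ALLOW_ORIGINS=https://console.example.com
+WEB_API_CORS_ALLOW_ORIGINS=https://app.example.com
+```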
+
+### 4. After starting, the page keeps loading and checking the request prompts CORS error?
+
+This may be because the domain name/URL has been switched, resulting in cross-domain issues between the front end and the back end. Please change all of the following configuration items in `docker-compose.yml` to the new domain name: `CONSOLE_API_URL:` the backend URL of the console API. `CONSOLE_WEB_URL:` the front-end URL of the console web. `SERVICE_API_URL:` the Service API URL. `APP_API_URL:` the WebApp API backend URL. `APP_WEB_URL:` the WebApp URL.
+
+For more information, please check out: [Environments](environments.md)
+
+### 5. How to upgrade to a new version after local deployment?
+
+If you start up through an image, please pull the latest image to complete the upgrade. If you start up through source code, please pull the latest code and then start it to complete the upgrade.
+
+When deploying and updating local source code, you need to enter the API directory and execute the following command to migrate the database structure to the latest version:
+
+`flask db upgrade`
+
+### 6. How to configure the environment variables when using Notion import?
+
+**Q: What is the Notion's Integration configuration address?**
+
+A: [https://www.notion.so/my-integrations](https://www.notion.so/my-integrations)
+
+**Q: Which environment variables need to be configured?**
+
+A: Please configure the following environment variables when doing the privatized deployment:
+
+1. **`NOTION_INTEGRATION_TYPE`** : The value should be configured as (**public/internal**). Since the redirect address of Notion's OAuth only supports https, if it is deployed locally, please use Notion's internal integration.
+2. **`NOTION_CLIENT_SECRET`** : Notion OAuth client secret (used for the public integration type).
+3. **`NOTION_CLIENT_ID`** : OAuth client ID (used for the public integration type).
+4. **`NOTION_INTERNAL_SECRET`** : Notion internal integration secret. If the value of `NOTION_INTEGRATION_TYPE` is **internal**, you need to configure this variable.
+
+### 7. How to change the name of the space in the local deployment version?
+
+Modify it in the `tenants` table in the database.
+
+### 8. How to modify the domain name for accessing the application?
+
+Find the domain name APP\_WEB\_URL in `docker_compose.yaml` and change it.
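+For example; the domain name below is hypothetical:
+
+```
+APP_WEB_URL: https://app.example.com
+```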
+
+### 9. If database migration is required, what things need to be backed up?
+
+The database, configuration files, storage, and vector database data need to be backed up. If deployed in Docker Compose mode, all data content in the `dify/docker/volumes` directory can be directly backed up.
+
+### 10. Why is Docker deploying Dify and starting OpenLLM locally using 127.0.0.1, but unable to access the local port?
+
+`127.0.0.1` is the internal address of the container; the server address configured in Dify needs to be the host machine's LAN IP address.
+
+### 11. How to solve the size and quantity limitations for uploading knowledge documents in the local deployment version?
+
+Please refer to the official website environment variable description document to configure them:
+
+[Environments](environments.md)
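+For example, the upload limits can be raised via `.env`; the values below are illustrative, and `UPLOAD_FILE_SIZE_LIMIT` is in MB:
+
+```
+UPLOAD_FILE_SIZE_LIMIT=15
+UPLOAD_FILE_BATCH_COUNT=5
+```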
+
+### 12. How does the local deployment edition invite members through email?
+
+In the local deployment edition, members can be invited through email. After you enter an email address and send the invitation, the page displays the invitation link; copy it and forward it to the user. The user can open the link, log in via email, set a password, and then access your space.
+
+### 13. How to solve listen tcp4 0.0.0.0:80: bind: address already in use?
+
+This is because the port is occupied. You can use the `netstat -tunlp | grep 80` command to view the process occupying the port and then kill it. For example, if the apache or nginx process occupies the port, you can use the `service apache2 stop` or `service nginx stop` commands to stop it.
+
+### 14. What to do if this error occurs in text-to-speech?
+
+```
+[openai] Error: ffmpeg is not installed
+```
+
+Since OpenAI TTS implements audio stream segmentation, `ffmpeg` needs to be installed for text-to-speech to work when deploying from source code. Here are the detailed steps:
+
+**Windows:**
+
+1. Visit [https://ffmpeg.org/download.html](https://ffmpeg.org/download.html), download an FFmpeg build for Windows, extract it, add the path of its `bin` directory to the system `PATH` environment variable, and then enter "ffmpeg -version" in a new terminal to verify the installation.
+
+**CentOS:**
+
+1. First, install the EPEL repository, enter: `sudo yum install epel-release`
+2. Then, enter: `sudo rpm -Uvh http://li.nux.ro/download/nux/dextop/el7/x86_64/nux-dextop-release-0-5.el7.nux.noarch.rpm`
+3. Update the yum package, enter: `sudo yum update`
+4. Finally, install FFmpeg, enter: `sudo yum install ffmpeg ffmpeg-devel`
+5. Enter "ffmpeg -version" to check if it has been successfully installed.
+
+**Mac OS X:**
+
+1. Open the terminal.
+2. If you haven't installed Homebrew yet, you can install it by entering the following command in the terminal: `/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"`
+3. Install FFmpeg with Homebrew, enter: `brew install ffmpeg`
+4. Enter "ffmpeg -version" to check if it has been successfully installed.
+
+### 15. Migrate Vector Database to Another Vector Database
+
+If you want to migrate the vector database from weaviate to another vector database, you need to migrate the data in the vector database. The method is as follows:
+
+1. If you are starting from local source code, modify the environment variable in the `.env` file to the vector database you want to migrate to, e.g.: `VECTOR_STORE=qdrant`
+2. If you are starting from docker-compose, modify the environment variable in the `docker-compose.yaml` file to the vector database you want to migrate to, e.g.:
+
+```
+# The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`, `analyticdb`.
+VECTOR_STORE: weaviate
+```
+
+3. Execute the following command:
+
+```
+flask vdb-migrate # or docker exec -it docker-api-1 flask vdb-migrate
+```
+
+### 16. Why is SSRF_PROXY Needed?
+
+You may have noticed the `SSRF_PROXY` environment variable in the configuration. It is used to prevent Server-Side Request Forgery (SSRF) attacks. For more details on SSRF attacks, refer to [this resource](https://portswigger.net/web-security/ssrf).
+
+To reduce the potential SSRF attack surface, services like Sandbox can only access external networks through the proxy. For example, if you want to allow the `192.168.101.0/24` network to be accessed by the proxy, but restrict access to an IP address `192.168.101.19` that contains sensitive data, you can add the following rules to `squid.conf`:
+
+```plaintext
+acl restricted_ip dst 192.168.101.19
+acl localnet src 192.168.101.0/24
+
+http_access deny restricted_ip
+http_access allow localnet
+http_access deny all
+```
+
+This is a basic example; you can customize the rules as needed. For more information about configuring `squid`, refer to the official squid documentation.
\ No newline at end of file
diff --git a/en/getting-started/install-self-hosted/local-source-code.mdx b/en/getting-started/install-self-hosted/local-source-code.mdx
new file mode 100644
index 00000000..b279017e
--- /dev/null
+++ b/en/getting-started/install-self-hosted/local-source-code.mdx
@@ -0,0 +1,337 @@
+---
+title: Local Source Code Start
+---
+
+
+## Prerequisites
+
+> Before installing Dify, make sure your machine meets the following minimum system requirements:
+> - CPU >= 2 Core
+> - RAM >= 4 GiB
+
+
+| Operating System | Software | Explanation |
+| --- | --- | --- |
+| macOS 10.14 or later | Docker Desktop | Set the Docker virtual machine (VM) to use a minimum of 2 virtual CPUs (vCPUs) and 8 GB of initial memory. Otherwise, the installation may fail. For more information, please refer to the Docker Desktop installation guide for Mac. |
+> If you need to use OpenAI TTS, `FFmpeg` must be installed on the system for it to function properly. For more details, refer to: [Link](https://docs.dify.ai/getting-started/install-self-hosted/install-faq#id-14.-what-to-do-if-this-error-occurs-in-text-to-speech).
+
+### Clone Dify
+
+```Bash
+git clone https://github.com/langgenius/dify.git
+```
+
+Before enabling the business services, we need to first deploy PostgreSQL / Redis / Weaviate (if not locally available). We can start them with the following commands:
+
+```Bash
+cd docker
+cp middleware.env.example middleware.env
+docker compose -f docker-compose.middleware.yaml up -d
+```
+
+### Server Deployment
+
+- API Interface Service
+- Worker Asynchronous Queue Consumption Service
+
+#### Installation of the basic environment:
+
+Server startup requires Python 3.12. It is recommended to use [pyenv](https://github.com/pyenv/pyenv) for quick installation of the Python environment.
+
+To install additional Python versions, use pyenv install.
+
+```Bash
+pyenv install 3.12
+```
+
+To switch to the "3.12" Python environment, use the following command:
+
+```Bash
+pyenv global 3.12
+```
+
+Then set up and start the API service with the following steps:
+
+1. Navigate to the "api" directory:
+
+   ```
+   cd api
+   ```
+
+> For macOS, install libmagic first: `brew install libmagic`.
+
+2. Copy the environment variable configuration file:
+
+   ```
+   cp .env.example .env
+   ```
+
+3. Generate a random secret key and replace the value of SECRET_KEY in the .env file:
+
+   ```
+   awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' .env > temp_env && mv temp_env .env
+   ```
+
+4. Install the required dependencies, then use `poetry shell` to activate the virtual environment:
+
+   ```
+   poetry install
+   ```
+
+5. Perform database migration to the latest version:
+
+   ```
+   poetry shell
+   flask db upgrade
+   ```
+
+6. Start the API server:
+
+   ```
+   flask run --host 0.0.0.0 --port=5001 --debug
+   ```
+
+   output:
+
+   ```
+    * Debug mode: on
+   INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
+    * Running on http://127.0.0.1:5001
+   INFO:werkzeug: * Restarting with stat
+   WARNING:werkzeug: * Debugger is active!
+   ```
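The `awk` substitution used to set SECRET_KEY can be sanity-checked on a scratch `.env` first; the path and the extra `DB_HOST` line below are illustrative only:

```shell
# Build a scratch .env with an empty SECRET_KEY line (illustrative)
printf 'SECRET_KEY=\nDB_HOST=localhost\n' > /tmp/env_demo
# Same substitution pattern as the setup step: rewrite only the SECRET_KEY line
awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' /tmp/env_demo > /tmp/env_demo.new && mv /tmp/env_demo.new /tmp/env_demo
cat /tmp/env_demo
```

After running it, the `SECRET_KEY=` line carries a random base64 key while every other line is untouched.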
+
+7. Start the Worker service
+
+   The Worker service is used to consume asynchronous queue tasks, such as dataset file import and dataset document updates. For macOS or Linux, start it with:
+
+   ```
+   celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
+   ```
+
+   For the Windows system to start the Worker service, please use the command instead:
+
+   ```
+   celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail --loglevel INFO
+   ```
+
+   output:
+
+   ```
+    -------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
+   --- ***** -----
+   -- ******* ---- ... 2023-07-31 12:58:08
+   - *** --- * ---
+   - ** ---------- [config]
+   - ** ---------- .> app:         app:0x7fb568572a10
+   ...
+
+   [tasks]
+     . tasks.add_document_to_index_task
+     . tasks.clean_dataset_task.clean_dataset_task
+     ...
+   ```
+
+### Deploy the frontend page
+
+Start the web frontend client page service.
+
+#### Installation of the basic environment:
+
+To start the web frontend service, [Node.js v18.x (LTS)](http://nodejs.org/) and [NPM](https://www.npmjs.com/) are required.
+
+1. Enter the web directory:
+
+   ```
+   cd web
+   ```
+
+2. Install the dependencies:
+
+   ```
+   npm install
+   ```
+
+3. Configure the environment variables. Create a file named `.env.local` in the current directory and copy the contents from `.env.example`. Modify the values of these environment variables according to your requirements:
+
+ ```
+ # For production release, change this to PRODUCTION
+ NEXT_PUBLIC_DEPLOY_ENV=DEVELOPMENT
+ # The deployment edition, SELF_HOSTED or CLOUD
+ NEXT_PUBLIC_EDITION=SELF_HOSTED
+   # The base URL of console application, refers to the Console base URL of WEB service
+   NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api
+   # The URL for Web APP, refers to the Web App base URL of WEB service
+   NEXT_PUBLIC_PUBLIC_API_PREFIX=http://localhost:5001/api
+   ```
+
+4. Start the web service:
+
+   ```
+   npm run dev
+   ```
+
+   output:
+
+   ```
+   ready - started server on 0.0.0.0:3000, url: http://localhost:3000
+   warn  - You have enabled experimental feature (appDir) in next.config.js.
+   warn  - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
+   info  - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
+   ```
+```
+
+### Access Dify
+
+Finally, access [http://127.0.0.1:3000](http://127.0.0.1:3000/) to use the locally deployed Dify.
\ No newline at end of file
diff --git a/en-us/getting-started/install-self-hosted/README.md b/en/getting-started/install-self-hosted/readme.mdx
similarity index 96%
rename from en-us/getting-started/install-self-hosted/README.md
rename to en/getting-started/install-self-hosted/readme.mdx
index 789bf834..e81375e8 100644
--- a/en-us/getting-started/install-self-hosted/README.md
+++ b/en/getting-started/install-self-hosted/readme.mdx
@@ -1,4 +1,7 @@
-# Dify Community
+---
+title: Introduction
+---
+
Dify, an open-source project on [GitHub](https://github.com/langgenius/dify), can be self-hosted using either of these methods:
diff --git a/en-us/getting-started/install-self-hosted/start-the-frontend-docker-container.md b/en/getting-started/install-self-hosted/start-the-frontend-docker-container.mdx
similarity index 63%
rename from en-us/getting-started/install-self-hosted/start-the-frontend-docker-container.md
rename to en/getting-started/install-self-hosted/start-the-frontend-docker-container.mdx
index 3e9a8a03..f0ce7313 100644
--- a/en-us/getting-started/install-self-hosted/start-the-frontend-docker-container.md
+++ b/en/getting-started/install-self-hosted/start-the-frontend-docker-container.mdx
@@ -1,4 +1,7 @@
-# Start the frontend Docker container separately
+---
+title: Start the frontend Docker container separately
+---
+
When developing the backend separately, you may only need to start the backend service from source code without building and launching the frontend locally. In this case, you can directly start the frontend service by pulling the Docker image and running the container. Here are the specific steps:
@@ -16,11 +19,12 @@ docker run -it -p 3000:3000 -e CONSOLE_URL=http://127.0.0.1:5001 -e APP_URL=http
cd web && docker build . -t dify-web
```
2. Start the frontend image
- ```
- docker run -it -p 3000:3000 -e CONSOLE_URL=http://127.0.0.1:5001 -e APP_URL=http://127.0.0.1:5001 dify-web
- ```
-
-3. When the console domain and web app domain are different, you can set the CONSOLE_URL and APP_URL separately
-4. To access it locally, you can visit [http://127.0.0.1:3000](http://127.0.0.1:3000/)
+   ```
+   docker run -it -p 3000:3000 -e CONSOLE_URL=http://127.0.0.1:5001 -e APP_URL=http://127.0.0.1:5001 dify-web
+   ```
+
+3. When the console domain and web app domain are different, you can set the CONSOLE_URL and APP_URL separately
+4. To access it locally, you can visit [http://127.0.0.1:3000](http://127.0.0.1:3000/)
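+For example, when the console and the web app are served from different domains, pass them separately; the domain names below are hypothetical:
+
+```
+docker run -it -p 3000:3000 -e CONSOLE_URL=https://console.example.com -e APP_URL=https://app.example.com dify-web
+```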
diff --git a/en/getting-started/readme/features-and-specifications.mdx b/en/getting-started/readme/features-and-specifications.mdx
new file mode 100644
index 00000000..9fdf9854
--- /dev/null
+++ b/en/getting-started/readme/features-and-specifications.mdx
@@ -0,0 +1,210 @@
+---
+title: Features and Specifications
+description: For those already familiar with LLM application tech stacks, this document serves as a shortcut to understand Dify's unique advantages
+---
+
+
+We adopt transparent policies around product specifications to ensure decisions are made based on complete understanding. Such transparency not only benefits your technical selection, but also promotes deeper comprehension within the community for active contributions.
+
+### Project Basics
+
+| Feature | Specification |
+| --- | --- |
+| OpenAI Interface Standard Model Integration Supported | ∞ |
+| Multimodal Capabilities | ASR models; rich-text models up to GPT-4o specs |
+| Built-in App Types | Text generation, Chatbot, Agent, Workflow, Chatflow |
+| Prompt-as-a-Service Orchestration | Visual orchestration interface widely praised, modify Prompts and preview effects in one place. |
+| Orchestration Modes | Simple orchestration; Assistant orchestration; Flow orchestration |
+| Prompt Variable Types | String; Radio enum; External API; File (Q3 2024) |
+| Agentic Workflow Features | Industry-leading visual workflow orchestration interface, live-editing node debugging, modular DSL, and native code runtime, designed for building more complex, reliable, and stable LLM applications. |
+| Supported Nodes | LLM; Knowledge Retrieval; Question Classifier; IF/ELSE; CODE; Template; HTTP Request; Tool |
+| RAG Features | Industry-first visual knowledge base management interface, supporting snippet previews and recall testing. |
+| Indexing Methods | Keywords; Text vectors; LLM-assisted question-snippet model |
+| Retrieval Methods | Keywords; Text similarity matching; Hybrid Search; N choose 1 (Legacy); Multi-path retrieval |
+| Recall Optimization | Rerank models |
+| ETL Capabilities | Automated cleaning for TXT, Markdown, PDF, HTML, DOC, CSV formats. Unstructured service enables maximum support. |
+| Annotated Reply | Based on human-annotated Q&As, used for similarity-based replies. Exportable as data format for model fine-tuning. |
+| Content Moderation | OpenAI Moderation or external APIs |
+| Team Collaboration | Workspaces, multi-member management |
+| API Specs | RESTful, most features covered |
+| Deployment Methods | Docker, Helm |
+
diff --git a/en/getting-started/readme/model-providers.mdx b/en/getting-started/readme/model-providers.mdx
new file mode 100644
index 00000000..602b9bff
--- /dev/null
+++ b/en/getting-started/readme/model-providers.mdx
@@ -0,0 +1,394 @@
+---
+title: List of Model Providers
+---
+
+Dify supports the below model providers out-of-box:
+
+| Provider | LLM | Text Embedding | Rerank | Speech to text | TTS |
+| --- | --- | --- | --- | --- | --- |
+| OpenAI | ✔️(🛠️)(👓) | ✔️ |  | ✔️ | ✔️ |
+| Anthropic | ✔️(🛠️) |  |  |  |  |
+| Azure OpenAI | ✔️(🛠️)(👓) | ✔️ |  | ✔️ | ✔️ |
+| Gemini | ✔️ |  |  |  |  |
+| Google Cloud | ✔️(👓) | ✔️ |  |  |  |
+| Nvidia API Catalog | ✔️ | ✔️ | ✔️ |  |  |
+| Nvidia NIM | ✔️ |  |  |  |  |
+| Nvidia Triton Inference Server | ✔️ |  |  |  |  |
+| AWS Bedrock | ✔️ | ✔️ |  |  |  |
+| OpenRouter | ✔️ |  |  |  |  |
+| Cohere | ✔️ | ✔️ | ✔️ |  |  |
+| together.ai | ✔️ |  |  |  |  |
+| Ollama | ✔️ | ✔️ |  |  |  |
+| Mistral AI | ✔️ |  |  |  |  |
+| groqcloud | ✔️ |  |  |  |  |
+| Replicate | ✔️ | ✔️ |  |  |  |
+| Hugging Face | ✔️ | ✔️ |  |  |  |
+| Xorbits inference | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
+| Zhipu AI | ✔️(🛠️)(👓) | ✔️ |  |  |  |
+| Baichuan | ✔️ | ✔️ |  |  |  |
+| Spark | ✔️ |  |  |  |  |
+| Minimax | ✔️(🛠️) | ✔️ |  |  |  |
+| Tongyi | ✔️ | ✔️ |  |  | ✔️ |
+| Wenxin | ✔️ | ✔️ |  |  |  |
+| Moonshot AI | ✔️(🛠️) |  |  |  |  |
+| Tencent Cloud |  |  |  | ✔️ |  |
+| Stepfun | ✔️(🛠️)(👓) |  |  |  |  |
+| VolcanoEngine | ✔️ | ✔️ |  |  |  |
+| 01.AI | ✔️ |  |  |  |  |
+| 360 Zhinao | ✔️ |  |  |  |  |
+| Azure AI Studio | ✔️ |  | ✔️ |  |  |
+| deepseek | ✔️(🛠️) |  |  |  |  |
+| Tencent Hunyuan | ✔️ |  |  |  |  |
+| SILICONFLOW | ✔️ | ✔️ |  |  |  |
+| Jina AI |  | ✔️ | ✔️ |  |  |
+| ChatGLM | ✔️ |  |  |  |  |
+| Xinference | ✔️(🛠️)(👓) | ✔️ | ✔️ |  |  |
+| OpenLLM | ✔️ | ✔️ |  |  |  |
+| LocalAI | ✔️ | ✔️ | ✔️ | ✔️ |  |
+| OpenAI API-Compatible | ✔️ | ✔️ |  | ✔️ |  |
+| PerfXCloud | ✔️ | ✔️ |  |  |  |
+| Lepton AI | ✔️ |  |  |  |  |
+| novita.ai | ✔️ |  |  |  |  |
+| Amazon Sagemaker | ✔️ | ✔️ | ✔️ |  |  |
+| Text Embedding Inference |  | ✔️ | ✔️ |  |  |
+| GPUStack | ✔️(🛠️)(👓) | ✔️ | ✔️ |  |  |
+
+where (🛠️) denotes "function calling" and (👓) denotes "support for vision".
+
+---
+
+This table is continuously updated. We also keep track of model providers requested by community members [here](https://github.com/langgenius/dify/discussions/categories/ideas). If you'd like to see a model provider not listed above, please consider contributing by making a PR. To learn more, check out our [Contribution Guide](../../community/contribution.md).
diff --git a/en-us/introduction.mdx b/en/introduction.mdx
similarity index 99%
rename from en-us/introduction.mdx
rename to en/introduction.mdx
index 2b1be80f..5c65f307 100644
--- a/en-us/introduction.mdx
+++ b/en/introduction.mdx
@@ -1,5 +1,5 @@
---
-title: Welcome to Dify
+title: Introduction
description: "Welcome to the home of your new documentation"
---
diff --git a/en-us/management/app-management.mdx b/en/management/app-management.mdx
similarity index 100%
rename from en-us/management/app-management.mdx
rename to en/management/app-management.mdx
diff --git a/en-us/management/personal-account-management.mdx b/en/management/personal-account-management.mdx
similarity index 100%
rename from en-us/management/personal-account-management.mdx
rename to en/management/personal-account-management.mdx
diff --git a/en-us/management/subscription-management.mdx b/en/management/subscription-management.mdx
similarity index 100%
rename from en-us/management/subscription-management.mdx
rename to en/management/subscription-management.mdx
diff --git a/en-us/management/team-members-management.mdx b/en/management/team-members-management.mdx
similarity index 100%
rename from en-us/management/team-members-management.mdx
rename to en/management/team-members-management.mdx
diff --git a/en-us/management/version-control.mdx b/en/management/version-control.mdx
similarity index 100%
rename from en-us/management/version-control.mdx
rename to en/management/version-control.mdx
diff --git a/en-us/user-guide/build-app/agent.mdx b/en/user-guide/build-app/agent.mdx
similarity index 100%
rename from en-us/user-guide/build-app/agent.mdx
rename to en/user-guide/build-app/agent.mdx
diff --git a/en-us/user-guide/build-app/chatbot.mdx b/en/user-guide/build-app/chatbot.mdx
similarity index 100%
rename from en-us/user-guide/build-app/chatbot.mdx
rename to en/user-guide/build-app/chatbot.mdx
diff --git a/en-us/user-guide/build-app/flow-app/additional-features.mdx b/en/user-guide/build-app/flow-app/additional-features.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/additional-features.mdx
rename to en/user-guide/build-app/flow-app/additional-features.mdx
diff --git a/en-us/user-guide/build-app/flow-app/application-publishing.mdx b/en/user-guide/build-app/flow-app/application-publishing.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/application-publishing.mdx
rename to en/user-guide/build-app/flow-app/application-publishing.mdx
diff --git a/en-us/user-guide/build-app/flow-app/concepts.mdx b/en/user-guide/build-app/flow-app/concepts.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/concepts.mdx
rename to en/user-guide/build-app/flow-app/concepts.mdx
diff --git a/en-us/user-guide/build-app/flow-app/create-flow-app.mdx b/en/user-guide/build-app/flow-app/create-flow-app.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/create-flow-app.mdx
rename to en/user-guide/build-app/flow-app/create-flow-app.mdx
diff --git a/en-us/user-guide/build-app/flow-app/file-upload.mdx b/en/user-guide/build-app/flow-app/file-upload.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/file-upload.mdx
rename to en/user-guide/build-app/flow-app/file-upload.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/README.md b/en/user-guide/build-app/flow-app/nodes/README.md
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/README.md
rename to en/user-guide/build-app/flow-app/nodes/README.md
diff --git a/en-us/user-guide/build-app/flow-app/nodes/answer.mdx b/en/user-guide/build-app/flow-app/nodes/answer.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/answer.mdx
rename to en/user-guide/build-app/flow-app/nodes/answer.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/code.mdx b/en/user-guide/build-app/flow-app/nodes/code.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/code.mdx
rename to en/user-guide/build-app/flow-app/nodes/code.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/doc-extractor.mdx b/en/user-guide/build-app/flow-app/nodes/doc-extractor.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/doc-extractor.mdx
rename to en/user-guide/build-app/flow-app/nodes/doc-extractor.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/end.mdx b/en/user-guide/build-app/flow-app/nodes/end.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/end.mdx
rename to en/user-guide/build-app/flow-app/nodes/end.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/http-request.mdx b/en/user-guide/build-app/flow-app/nodes/http-request.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/http-request.mdx
rename to en/user-guide/build-app/flow-app/nodes/http-request.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/ifelse.mdx b/en/user-guide/build-app/flow-app/nodes/ifelse.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/ifelse.mdx
rename to en/user-guide/build-app/flow-app/nodes/ifelse.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/iteration.mdx b/en/user-guide/build-app/flow-app/nodes/iteration.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/iteration.mdx
rename to en/user-guide/build-app/flow-app/nodes/iteration.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/knowledge-retrieval.mdx b/en/user-guide/build-app/flow-app/nodes/knowledge-retrieval.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/knowledge-retrieval.mdx
rename to en/user-guide/build-app/flow-app/nodes/knowledge-retrieval.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/list-operator.mdx b/en/user-guide/build-app/flow-app/nodes/list-operator.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/list-operator.mdx
rename to en/user-guide/build-app/flow-app/nodes/list-operator.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/llm.mdx b/en/user-guide/build-app/flow-app/nodes/llm.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/llm.mdx
rename to en/user-guide/build-app/flow-app/nodes/llm.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/parameter-extractor.mdx b/en/user-guide/build-app/flow-app/nodes/parameter-extractor.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/parameter-extractor.mdx
rename to en/user-guide/build-app/flow-app/nodes/parameter-extractor.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/question-classifier.mdx b/en/user-guide/build-app/flow-app/nodes/question-classifier.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/question-classifier.mdx
rename to en/user-guide/build-app/flow-app/nodes/question-classifier.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/start.mdx b/en/user-guide/build-app/flow-app/nodes/start.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/start.mdx
rename to en/user-guide/build-app/flow-app/nodes/start.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/template.mdx b/en/user-guide/build-app/flow-app/nodes/template.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/template.mdx
rename to en/user-guide/build-app/flow-app/nodes/template.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/tools.mdx b/en/user-guide/build-app/flow-app/nodes/tools.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/tools.mdx
rename to en/user-guide/build-app/flow-app/nodes/tools.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/variable-aggregator.mdx b/en/user-guide/build-app/flow-app/nodes/variable-aggregator.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/variable-aggregator.mdx
rename to en/user-guide/build-app/flow-app/nodes/variable-aggregator.mdx
diff --git a/en-us/user-guide/build-app/flow-app/nodes/variable-assigner.mdx b/en/user-guide/build-app/flow-app/nodes/variable-assigner.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/nodes/variable-assigner.mdx
rename to en/user-guide/build-app/flow-app/nodes/variable-assigner.mdx
diff --git a/en-us/user-guide/build-app/flow-app/orchestrate-node.mdx b/en/user-guide/build-app/flow-app/orchestrate-node.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/orchestrate-node.mdx
rename to en/user-guide/build-app/flow-app/orchestrate-node.mdx
diff --git a/en-us/user-guide/build-app/flow-app/shotcut-key.mdx b/en/user-guide/build-app/flow-app/shotcut-key.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/shotcut-key.mdx
rename to en/user-guide/build-app/flow-app/shotcut-key.mdx
diff --git a/en-us/user-guide/build-app/flow-app/variables.mdx b/en/user-guide/build-app/flow-app/variables.mdx
similarity index 100%
rename from en-us/user-guide/build-app/flow-app/variables.mdx
rename to en/user-guide/build-app/flow-app/variables.mdx
diff --git a/en-us/user-guide/build-app/text-generator.mdx b/en/user-guide/build-app/text-generator.mdx
similarity index 100%
rename from en-us/user-guide/build-app/text-generator.mdx
rename to en/user-guide/build-app/text-generator.mdx
diff --git a/en-us/user-guide/knowledge-base/api-documentation/external-knowledge-api-documentation.mdx b/en/user-guide/knowledge-base/api-documentation/external-knowledge-api-documentation.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/api-documentation/external-knowledge-api-documentation.mdx
rename to en/user-guide/knowledge-base/api-documentation/external-knowledge-api-documentation.mdx
diff --git a/en-us/user-guide/knowledge-base/api-documentation/external-knowledge-api.mdx b/en/user-guide/knowledge-base/api-documentation/external-knowledge-api.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/api-documentation/external-knowledge-api.mdx
rename to en/user-guide/knowledge-base/api-documentation/external-knowledge-api.mdx
diff --git a/en-us/user-guide/knowledge-base/connect-external-knowledge-base.mdx b/en/user-guide/knowledge-base/connect-external-knowledge-base.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/connect-external-knowledge-base.mdx
rename to en/user-guide/knowledge-base/connect-external-knowledge-base.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/chunking-and-cleaning-text.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/chunking-and-cleaning-text.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/chunking-and-cleaning-text.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/chunking-and-cleaning-text.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/readme.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/readme.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/readme.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/readme.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-notion.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-notion.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-notion.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-notion.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-website.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-website.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-website.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/import-content-data/sync-from-website.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/readme.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/readme.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/readme.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/readme.mdx
diff --git a/en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/setting-indexing-methods.mdx b/en/user-guide/knowledge-base/create-knowledge-and-upload-documents/setting-indexing-methods.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/create-knowledge-and-upload-documents/setting-indexing-methods.mdx
rename to en/user-guide/knowledge-base/create-knowledge-and-upload-documents/setting-indexing-methods.mdx
diff --git a/en-us/user-guide/knowledge-base/external-knowledge-api.mdx b/en/user-guide/knowledge-base/external-knowledge-api.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/external-knowledge-api.mdx
rename to en/user-guide/knowledge-base/external-knowledge-api.mdx
diff --git a/en-us/user-guide/knowledge-base/faq.mdx b/en/user-guide/knowledge-base/faq.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/faq.mdx
rename to en/user-guide/knowledge-base/faq.mdx
diff --git a/en-us/user-guide/knowledge-base/indexing-and-retrieval/hybrid-search.mdx b/en/user-guide/knowledge-base/indexing-and-retrieval/hybrid-search.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/indexing-and-retrieval/hybrid-search.mdx
rename to en/user-guide/knowledge-base/indexing-and-retrieval/hybrid-search.mdx
diff --git a/en-us/user-guide/knowledge-base/indexing-and-retrieval/rerank.mdx b/en/user-guide/knowledge-base/indexing-and-retrieval/rerank.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/indexing-and-retrieval/rerank.mdx
rename to en/user-guide/knowledge-base/indexing-and-retrieval/rerank.mdx
diff --git a/en-us/user-guide/knowledge-base/indexing-and-retrieval/retrieval-augment.mdx b/en/user-guide/knowledge-base/indexing-and-retrieval/retrieval-augment.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/indexing-and-retrieval/retrieval-augment.mdx
rename to en/user-guide/knowledge-base/indexing-and-retrieval/retrieval-augment.mdx
diff --git a/en-us/user-guide/knowledge-base/indexing-and-retrieval/retrieval.mdx b/en/user-guide/knowledge-base/indexing-and-retrieval/retrieval.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/indexing-and-retrieval/retrieval.mdx
rename to en/user-guide/knowledge-base/indexing-and-retrieval/retrieval.mdx
diff --git a/en-us/user-guide/knowledge-base/integrate-knowledge-within-application.mdx b/en/user-guide/knowledge-base/integrate-knowledge-within-application.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/integrate-knowledge-within-application.mdx
rename to en/user-guide/knowledge-base/integrate-knowledge-within-application.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/external-knowledge-api.mdx b/en/user-guide/knowledge-base/knowledge-and-documents-maintenance/external-knowledge-api.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/external-knowledge-api.mdx
rename to en/user-guide/knowledge-base/knowledge-and-documents-maintenance/external-knowledge-api.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/introduction.mdx b/en/user-guide/knowledge-base/knowledge-and-documents-maintenance/introduction.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/introduction.mdx
rename to en/user-guide/knowledge-base/knowledge-and-documents-maintenance/introduction.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-dataset-via-api.mdx b/en/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-dataset-via-api.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-dataset-via-api.mdx
rename to en/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-dataset-via-api.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-knowledge-documents.mdx b/en/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-knowledge-documents.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-knowledge-documents.mdx
rename to en/user-guide/knowledge-base/knowledge-and-documents-maintenance/maintain-knowledge-documents.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-base-creation/introduction.mdx b/en/user-guide/knowledge-base/knowledge-base-creation/introduction.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-base-creation/introduction.mdx
rename to en/user-guide/knowledge-base/knowledge-base-creation/introduction.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-base-creation/sync-from-notion.mdx b/en/user-guide/knowledge-base/knowledge-base-creation/sync-from-notion.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-base-creation/sync-from-notion.mdx
rename to en/user-guide/knowledge-base/knowledge-base-creation/sync-from-notion.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-base-creation/sync-from-website.mdx b/en/user-guide/knowledge-base/knowledge-base-creation/sync-from-website.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-base-creation/sync-from-website.mdx
rename to en/user-guide/knowledge-base/knowledge-base-creation/sync-from-website.mdx
diff --git a/en-us/user-guide/knowledge-base/knowledge-base-creation/upload-documents.mdx b/en/user-guide/knowledge-base/knowledge-base-creation/upload-documents.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/knowledge-base-creation/upload-documents.mdx
rename to en/user-guide/knowledge-base/knowledge-base-creation/upload-documents.mdx
diff --git a/en-us/user-guide/knowledge-base/metadata.mdx b/en/user-guide/knowledge-base/metadata.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/metadata.mdx
rename to en/user-guide/knowledge-base/metadata.mdx
diff --git a/en-us/user-guide/knowledge-base/readme.mdx b/en/user-guide/knowledge-base/readme.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/readme.mdx
rename to en/user-guide/knowledge-base/readme.mdx
diff --git a/en-us/user-guide/knowledge-base/retrieval-test-and-citation.mdx b/en/user-guide/knowledge-base/retrieval-test-and-citation.mdx
similarity index 100%
rename from en-us/user-guide/knowledge-base/retrieval-test-and-citation.mdx
rename to en/user-guide/knowledge-base/retrieval-test-and-citation.mdx
diff --git a/scripts/md-to-mdx-3.18-backup.py b/scripts/md-to-mdx-3.18-backup.py
new file mode 100644
index 00000000..82cdfce0
--- /dev/null
+++ b/scripts/md-to-mdx-3.18-backup.py
@@ -0,0 +1,340 @@
+#!/usr/bin/env python3
+# -*- coding: utf-8 -*-
+
+import os
+import re
+import shutil
+from pathlib import Path
+import logging
+
+# Set up logging
+logging.basicConfig(
+    level=logging.INFO,
+    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
+    handlers=[
+        logging.FileHandler("conversion.log"),
+        logging.StreamHandler()
+    ]
+)
+logger = logging.getLogger("md-to-mdx")
+
+class MarkdownToMDXConverter:
+    def __init__(self, backup=True):
+        self.backup = backup
+        self.conversion_count = 0
+        self.error_count = 0
+        self.base_output_dir = None
+
+    def process_directory(self, input_dir, output_dir=None, recursive=True):
+        """Process all Markdown files in the given directory."""
+        input_path = Path(input_dir)
+
+        if not input_path.exists():
+            logger.error(f"Input directory does not exist: {input_dir}")
+            return
+
+        # Remember the base output directory so output paths for
+        # subdirectories can be derived from it
+        if self.base_output_dir is None and output_dir:
+            self.base_output_dir = Path(output_dir)
+            self.base_input_dir = input_path
+            self.base_output_dir.mkdir(parents=True, exist_ok=True)
+            logger.info(f"Created base output directory: {self.base_output_dir}")
+
+        # Process every .md file in the current directory
+        for file in input_path.glob("*.md"):
+            # Compute the path relative to the base input directory
+            if self.base_output_dir:
+                rel_path = file.parent.relative_to(self.base_input_dir) if file.parent != self.base_input_dir else Path('')
+                target_dir = self.base_output_dir / rel_path
+                target_dir.mkdir(parents=True, exist_ok=True)
+                self._process_file(file, target_dir)
+            else:
+                # No base output directory: convert files in place
+                self._process_file(file, file.parent)
+
+        # Recurse into subdirectories if requested
+        if recursive:
+            for subdir in [d for d in input_path.iterdir() if d.is_dir()]:
+                # Skip the output directory and hidden directories to avoid reprocessing
+                if subdir.name == "output" or subdir.name.startswith('.'):
+                    continue
+
+                self.process_directory(subdir, output_dir, recursive)
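The relative-path bookkeeping in `process_directory` mirrors the source tree into the output tree. In isolation, and using an illustrative file path rather than one from the script, the computation works like this:

```python
from pathlib import Path

# Illustrative paths; the real directories are whatever the script is invoked with
base_input = Path("en-us")
base_output = Path("en")
file = Path("en-us/user-guide/build-app/agent.md")

# Mirror the file's parent directory under the output root,
# then swap the .md suffix for .mdx
rel_path = file.parent.relative_to(base_input)
target_dir = base_output / rel_path
output_file = target_dir / (file.stem + ".mdx")
print(output_file.as_posix())  # en/user-guide/build-app/agent.mdx
```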
+
+    def _process_file(self, file_path, output_dir):
+        """Convert a single Markdown file."""
+        try:
+            logger.info(f"Processing file: {file_path}")
+
+            # Back up the original file
+            if self.backup:
+                backup_file = str(file_path) + ".bak"
+                if not os.path.exists(backup_file):
+                    shutil.copy2(file_path, backup_file)
+                    logger.info(f"Created backup: {backup_file}")
+
+            # Read the file contents
+            with open(file_path, 'r', encoding='utf-8') as f:
+                content = f.read()
+
+            # Run the conversion
+            converted_content = self.convert_content(content)
+
+            # Determine the output file path
+            output_file = output_dir / (file_path.stem + ".mdx")
+
+            # Write the converted content
+            with open(output_file, 'w', encoding='utf-8') as f:
+                f.write(converted_content)
+
+            logger.info(f"Conversion complete: {output_file}")
+            self.conversion_count += 1
+
+        except Exception as e:
+            logger.error(f"Error while processing {file_path}: {str(e)}")
+            self.error_count += 1
+
+    def convert_content(self, content):
+        """Convert Gitbook Markdown content to Mintlify MDX format."""
+
+        # 1. Turn the leading h1 heading into frontmatter
+        h1_pattern = re.compile(r'^#\s+(.+?)$', re.MULTILINE)
+        match = h1_pattern.search(content)
+        if match:
+            title = match.group(1).strip()
+            content = h1_pattern.sub(f'---\ntitle: {title}\n---\n', content, count=1)
+
+        # 2. Convert hint callout blocks
+        hint_pattern = re.compile(
+            r'{%\s*hint\s+style="(\w+)"\s*%}(.*?){%\s*endhint\s*%}',
+            re.DOTALL
+        )
+
+        def hint_replacer(match):
+            style = match.group(1)
+            text = match.group(2).strip()
+            component_name = style.capitalize() if style != "info" else "Info"
+            return f'<{component_name}>\n{text}\n</{component_name}>'
+
+        content = hint_pattern.sub(hint_replacer, content)
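As a self-contained sketch of what the frontmatter and hint conversions produce (the sample inputs here are invented for illustration):

```python
import re

# Frontmatter conversion: the first h1 heading becomes a YAML title block
h1_pattern = re.compile(r'^#\s+(.+?)$', re.MULTILINE)
content = "# Hello\n\nBody text"
m = h1_pattern.search(content)
title = m.group(1).strip()
content = h1_pattern.sub(f'---\ntitle: {title}\n---\n', content, count=1)
print(content.splitlines()[1])  # title: Hello

# Hint conversion: a Gitbook hint block becomes an MDX callout component
hint_pattern = re.compile(r'{%\s*hint\s+style="(\w+)"\s*%}(.*?){%\s*endhint\s*%}', re.DOTALL)
doc = '{% hint style="warning" %}\nBe careful.\n{% endhint %}'

def hint_replacer(match):
    style, text = match.group(1), match.group(2).strip()
    name = style.capitalize()  # e.g. "warning" -> <Warning>
    return f'<{name}>\n{text}\n</{name}>'

print(hint_pattern.sub(hint_replacer, doc))
```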
+
+        # 3. Convert card links
+        card_pattern = re.compile(
+            r'{%\s*content-ref\s+url="([^"]+)"\s*%}\s*\[([^\]]+)\]\(([^)]+)\)\s*{%\s*endcontent-ref\s*%}',
+            re.DOTALL
+        )
+
+        def card_replacer(match):
+            url = match.group(1)
+            title = match.group(2)
+            # Assumption: emit an MDX <Card> component; the markup in the
+            # original return value was stripped, so the tag is reconstructed
+            return f'<Card href="{url}">\n  {title}\n</Card>'
+
+        content = card_pattern.sub(card_replacer, content)
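The content-ref pattern in step 3 captures the target URL and the link text. A minimal standalone check, using a sample block invented for illustration:

```python
import re

# Same pattern as in the converter above
card_pattern = re.compile(
    r'{%\s*content-ref\s+url="([^"]+)"\s*%}\s*\[([^\]]+)\]\(([^)]+)\)\s*{%\s*endcontent-ref\s*%}',
    re.DOTALL
)

# Sample Gitbook content-ref block (illustrative)
sample = '{% content-ref url="install.md" %}\n[Install guide](install.md)\n{% endcontent-ref %}'
m = card_pattern.search(sample)
print(m.group(1), m.group(2))  # install.md Install guide
```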
+
+        # 4. Convert side-by-side image styling
+        # Find consecutive images and convert them to a side-by-side layout
+        img_pattern = re.compile(r'!\[(.*?)\]\((.*?)\)\s*!\[(.*?)\]\((.*?)\)', re.DOTALL)
+
+        def img_side_replacer(match):
+            alt1 = match.group(1) or "Image 1"
+            src1 = match.group(2)
+            alt2 = match.group(3) or "Image 2"
+            src2 = match.group(4)
+
+            return f'''