update local source develop guide (#596)

* update local source develop guide

* improve formatting and capitalization

* Update en/self-host/advanced-deployments/local-source-code.mdx

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* update the zh and jp docs

---------

Co-authored-by: Riskey <riskey47@dify.ai>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
非法操作
2025-12-12 16:01:33 +08:00
committed by GitHub
parent 57e04ab0b1
commit 18b4e1683d
3 changed files with 48 additions and 12 deletions


@@ -30,8 +30,12 @@ A series of middlewares for storage (e.g. PostgreSQL / Redis / Weaviate (if not
```Bash
cd docker
cp middleware.env.example middleware.env
-docker compose -f docker-compose.middleware.yaml up -d
+# change the profile to mysql if you are not using postgresql
+# change the profile to another vector database if you are not using weaviate
+docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
```
---
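If you run a different middleware stack, only the `--profile` flags change. A minimal sketch, assuming a `mysql` profile exists and that your vector store's profile is named after it (`qdrant` here is an illustrative choice) — the real profile names live in `docker-compose.middleware.yaml`:

```shell
# Build the middleware start command for a different stack.
# DB_PROFILE / VDB_PROFILE are illustrative shell variables, not names
# that Dify reads; verify the profiles in docker-compose.middleware.yaml.
DB_PROFILE=mysql
VDB_PROFILE=qdrant   # assumption: adjust to the vector store you actually use
CMD="docker compose -f docker-compose.middleware.yaml --profile ${DB_PROFILE} --profile ${VDB_PROFILE} -p dify up -d"
echo "$CMD"
```

Echoing the composed command first lets you confirm the profiles before any containers start.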
@@ -75,7 +79,7 @@ The backend services include
[uv](https://docs.astral.sh/uv/getting-started/installation/) is used to manage dependencies.
Install the required dependencies with `uv` by running:
```
-uv sync
+uv sync --dev
```
> For macOS: install libmagic with `brew install libmagic`.
@@ -111,14 +115,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
If you are starting the Worker service on Windows, use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
Expected output:
@@ -189,6 +193,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
2025-04-28 17:07:15,742 INFO [pidbox.py:111] pidbox: Connected to redis://:**@localhost:6379/1.
```
+### Start the Beat Service
+Additionally, if you want to debug Celery scheduled tasks or run the Schedule Trigger node, run the following command in another terminal to start the Beat service:
+```bash
+uv run celery -A app.celery beat
+```
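Side note, not part of this change: for single-machine debugging, Celery's standard `-B` flag can embed the beat scheduler inside the worker process, avoiding a second terminal. The separate Beat process added above remains the right shape for anything production-like.

```shell
# Dev-only sketch: one process running both the worker and the beat scheduler.
# -B is Celery's embedded-beat flag; the queue list is shortened here for brevity.
uv run celery -A app.celery worker -P gevent -c 1 -B --loglevel INFO \
  -Q dataset,schedule_poller,schedule_executor
```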
---
## Setup Web Service
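The macOS/Linux and Windows worker commands in this change share one long `-Q` value. A sketch of keeping it in a single shell variable so the two invocations cannot drift apart (`QUEUES` is an illustrative name, not something the docs define):

```shell
# One source of truth for the queue list used by both worker commands.
QUEUES="dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor"
# macOS / Linux:
#   uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q "$QUEUES"
# Windows:
#   uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q "$QUEUES"
echo "$QUEUES" | awk -F',' '{print NF}'   # prints 15, the number of queues
```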


@@ -35,8 +35,12 @@ git clone https://github.com/langgenius/dify.git
```Bash
cd docker
cp middleware.env.example middleware.env
-docker compose -f docker-compose.middleware.yaml up -d
+# change the profile to mysql if you are not using postgresql
+# change the profile to another vector database if you are not using weaviate
+docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
```
***
@@ -80,7 +84,7 @@ docker compose -f docker-compose.middleware.yaml up -d
[uv](https://docs.astral.sh/uv/getting-started/installation/) is used to manage dependencies.
Install the required dependencies with `uv` by running:
```
-uv sync
+uv sync --dev
```
> For macOS: install libmagic with `brew install libmagic`.
@@ -116,14 +120,14 @@ docker compose -f docker-compose.middleware.yaml up -d
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
If you are starting the Worker service on Windows, use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
Expected output:
@@ -194,6 +198,14 @@ docker compose -f docker-compose.middleware.yaml up -d
2025-04-28 17:07:15,742 INFO [pidbox.py:111] pidbox: Connected to redis://:**@localhost:6379/1.
```
+### Start the Beat Service
+Additionally, if you want to debug Celery scheduled tasks or run the Schedule Trigger node, run the following command in another terminal to start the Beat service:
+```bash
+uv run celery -A app.celery beat
+```
---
## Setup Web Service


@@ -34,8 +34,12 @@ The Dify backend services require a set of middlewares for storage (e.g. PostgreSQL / Redis / Weaviate
```Bash
cd docker
cp middleware.env.example middleware.env
-docker compose -f docker-compose.middleware.yaml up -d
+# change the profile to mysql if you are not using postgresql
+# change the profile to another vector database if you are not using weaviate
+docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
```
---
@@ -79,7 +83,7 @@ docker compose -f docker-compose.middleware.yaml up -d
[uv](https://docs.astral.sh/uv/getting-started/installation/) is used to manage dependencies.
Install the required dependencies with `uv` by running:
```
-uv sync
+uv sync --dev
```
> For macOS: install libmagic with `brew install libmagic`.
@@ -115,14 +119,14 @@ docker compose -f docker-compose.middleware.yaml up -d
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
If you are starting the Worker service on Windows, use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
Expected output:
@@ -193,6 +197,14 @@ docker compose -f docker-compose.middleware.yaml up -d
2025-04-28 17:07:15,742 INFO [pidbox.py:111] pidbox: Connected to redis://:**@localhost:6379/1.
```
+### Start the Beat Service
+Additionally, if you want to debug Celery scheduled tasks or run the Schedule Trigger node, run the following command in another terminal to start the Beat service:
+```bash
+uv run celery -A app.celery beat
+```
---
## Setup Web Service