docs(self-host): supplement missing workflow_based_app_execution queue for Worker service in local startup docs (#694)

* docs(self-host): supplement missing `workflow_based_app_execution` queue for Worker service in local startup docs

Update the Chinese, English, and Japanese versions of the self-hosted documentation by adding the missing `workflow_based_app_execution` queue to the Celery worker startup command.

* add missing `dataset_summary` and `retention` queues to worker startup command

---------

Co-authored-by: Riskey <36894937+RiskeyL@users.noreply.github.com>
Author: lexmin0412
Date: 2026-03-06 22:22:57 +08:00
Committed by: GitHub
Parent: f84d7ab662
Commit: ed97c4837e
3 changed files with 8 additions and 8 deletions

@@ -115,14 +115,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
If you are using a Windows system to start the Worker service, please use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
Expected output:
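This very change (two platform-specific commands drifting out of sync on the `-Q` list) suggests keeping the queue list in one place. A minimal sketch, assuming a POSIX shell; the `QUEUES` variable is a convention of this sketch, not part of Dify's documented setup:

```shell
# Keep the queue list in a single variable so the macOS/Linux and Windows
# commands cannot drift apart. QUEUES is a local convention for this sketch.
QUEUES="dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution"

# Fail fast if a queue name was accidentally listed twice.
dupes=$(echo "$QUEUES" | tr ',' '\n' | sort | uniq -d)
test -z "$dupes" || { echo "duplicate queues: $dupes" >&2; exit 1; }

echo "$QUEUES" | tr ',' '\n' | wc -l   # 18 queues expected
# Then launch (macOS/Linux):
#   uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q "$QUEUES"
```

The same `-Q "$QUEUES"` substitution works for the Windows `-P solo` variant, so both commands stay in lockstep when a queue is added.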

@@ -120,14 +120,14 @@ docker compose -f docker-compose.middleware.yaml --profile postgresql --profile
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
If you are using a Windows system to start the Worker service, please use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
Expected output:
@@ -293,4 +293,4 @@ To start the web frontend service, [Node.js v22 (LTS)]
### Access Dify
Visit [http://127.0.0.1:3000](http://127.0.0.1:3000/) in your browser to enjoy all of Dify's features.
Cheers! 🍻

@@ -119,14 +119,14 @@ docker compose -f docker-compose.middleware.yaml --profile postgresql --profile
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
If you are starting the Worker service on a Windows system, please use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
```
Expected output:
@@ -292,4 +292,4 @@ uv run celery -A app.celery beat
### Access Dify
Visit [http://127.0.0.1:3000](http://127.0.0.1:3000/) in your browser to enjoy all of Dify's exciting features.
Cheers! 🍻