From ed97c4837eda7b22bc013db7cd8b157055b1e63d Mon Sep 17 00:00:00 2001
From: lexmin0412
Date: Fri, 6 Mar 2026 22:22:57 +0800
Subject: [PATCH] docs(self-host): supplement missing
 `workflow_based_app_execution` queue for Worker service in local startup
 docs (#694)

* docs(self-host): supplement missing `workflow_based_app_execution` queue
  for Worker service in local startup docs

  Update the Chinese, English, and Japanese versions of the self-hosted
  documentation by adding the missing `workflow_based_app_execution` queue
  to the Celery worker startup command.

* add missing `dataset_summary` and `retention` queues to worker startup
  command

---------

Co-authored-by: Riskey <36894937+RiskeyL@users.noreply.github.com>
---
 en/self-host/advanced-deployments/local-source-code.mdx | 4 ++--
 ja/self-host/advanced-deployments/local-source-code.mdx | 6 +++---
 zh/self-host/advanced-deployments/local-source-code.mdx | 6 +++---
 3 files changed, 8 insertions(+), 8 deletions(-)

diff --git a/en/self-host/advanced-deployments/local-source-code.mdx b/en/self-host/advanced-deployments/local-source-code.mdx
index 3e836000..74419aa2 100644
--- a/en/self-host/advanced-deployments/local-source-code.mdx
+++ b/en/self-host/advanced-deployments/local-source-code.mdx
@@ -115,14 +115,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
 - for macOS or Linux
 
 ```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 If you are using a Windows system to start the Worker service, please use the following command instead:
 
 - for Windows
 
 ```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 Expected output:
diff --git a/ja/self-host/advanced-deployments/local-source-code.mdx b/ja/self-host/advanced-deployments/local-source-code.mdx
index 8237e29e..3209e7cb 100644
--- a/ja/self-host/advanced-deployments/local-source-code.mdx
+++ b/ja/self-host/advanced-deployments/local-source-code.mdx
@@ -120,14 +120,14 @@ docker compose -f docker-compose.middleware.yaml --profile postgresql --profile
 - macOS または Linux の場合
 
 ```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 Windows システムで Worker サービスを起動する場合は、代わりに次のコマンドを使用してください:
 
 - Windows の場合
 
 ```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 期待される出力:
@@ -293,4 +293,4 @@ Web フロントエンドサービスを起動するには、[Node.js v22 (LTS)]
 ### Dify にアクセスする
 
 ブラウザで [http://127.0.0.1:3000](http://127.0.0.1:3000/) にアクセスして、Dify のすべての機能をお楽しみください。
-乾杯!🍻
\ No newline at end of file
+乾杯!🍻
diff --git a/zh/self-host/advanced-deployments/local-source-code.mdx b/zh/self-host/advanced-deployments/local-source-code.mdx
index a706ef70..bda0152c 100644
--- a/zh/self-host/advanced-deployments/local-source-code.mdx
+++ b/zh/self-host/advanced-deployments/local-source-code.mdx
@@ -119,14 +119,14 @@ docker compose -f docker-compose.middleware.yaml --profile postgresql --profile
 - 对于 macOS 或 Linux
 
 ```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 如果你使用 Windows 系统启动 Worker 服务,请使用以下命令:
 
 - 对于 Windows
 
 ```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,dataset_summary,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention,workflow_based_app_execution
 ```
 
 预期输出:
@@ -292,4 +292,4 @@ uv run celery -A app.celery beat
 ### 访问 Dify
 
 通过浏览器访问 [http://127.0.0.1:3000](http://127.0.0.1:3000/) 即可享受 Dify 所有激动人心的功能。
-干杯!🍻
\ No newline at end of file
+干杯!🍻