update local source develop guide (#596)

* update local source develop guide

* improve formatting and capitalization

* Update en/self-host/advanced-deployments/local-source-code.mdx

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* update the zh and jp docs

---------

Co-authored-by: Riskey <riskey47@dify.ai>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Author: 非法操作
Date: 2025-12-12 16:01:33 +08:00
Committed by: GitHub
Parent commit: 57e04ab0b1
Commit: 18b4e1683d
3 changed files with 48 additions and 12 deletions


@@ -30,8 +30,12 @@ A series of middlewares for storage (e.g. PostgreSQL / Redis / Weaviate (if not
```Bash
cd docker
cp middleware.env.example middleware.env
-docker compose -f docker-compose.middleware.yaml up -d
+# change the profile to mysql if you are not using postgresql
+# change the profile to other vector database if you are not using weaviate
+docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
```
---
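As the comments in the changed lines above note, the `--profile` flags select which middleware containers start. A hedged illustration of swapping profiles, assuming the compose file defines a `mysql` profile (the available profile names depend on `docker-compose.middleware.yaml` itself):

```bash
# Illustrative variation only: use the mysql profile instead of postgresql,
# keeping weaviate as the vector database. Profile names here are taken
# from the comments above, not verified against the compose file.
docker compose -f docker-compose.middleware.yaml --profile mysql --profile weaviate -p dify up -d
```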
@@ -75,7 +79,7 @@ The backend services include
[uv](https://docs.astral.sh/uv/getting-started/installation/) is used to manage dependencies.
Install the required dependencies with `uv` by running:
```
-uv sync
+uv sync --dev
```
> For macOS: install libmagic with `brew install libmagic`.
@@ -111,14 +115,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
- for macOS or Linux
```
-uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace
+uv run celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
If you are using a Windows system to start the Worker service, please use the following command instead:
- for Windows
```
-uv run celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO
+uv run celery -A app.celery worker -P solo --without-gossip --without-mingle --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
Expected output:
@@ -189,6 +193,14 @@ To consume asynchronous tasks from the queue, such as dataset file import and da
2025-04-28 17:07:15,742 INFO [pidbox.py:111] pidbox: Connected to redis://:**@localhost:6379/1.
```
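Once the worker is running, you can confirm that it is consuming the intended queues with Celery's built-in inspection command. A sketch, assuming the worker started above is still running and can reach the Redis broker:

```bash
# Ask live workers which queues they are consuming; this only returns
# results when a worker is running and connected to the broker.
uv run celery -A app.celery inspect active_queues
```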
+### Start the Beat Service
+Additionally, if you want to debug the Celery scheduled tasks or run the Schedule Trigger node, run the following command in another terminal to start the Beat service:
+```bash
+uv run celery -A app.celery beat
+```
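A hedged variation of the Beat command above, adding standard `celery beat` flags; the schedule-file path is only an illustrative assumption, not taken from the Dify docs:

```bash
# Raise log verbosity and keep the schedule state file in a writable location.
uv run celery -A app.celery beat --loglevel INFO -s /tmp/celerybeat-schedule
```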
---
## Setup Web Service