From e7c80da6024a56d97a321c62ca8381ade5766084 Mon Sep 17 00:00:00 2001
From: myhloli
Date: Wed, 19 Nov 2025 20:08:50 +0800
Subject: [PATCH] fix: update Python version support details for Windows and
 clarify dependency limitations

---
 README.md                    | 5 +++--
 README_zh-CN.md              | 5 +++--
 docs/en/quick_start/index.md | 6 ++++--
 docs/zh/quick_start/index.md | 4 +++-
 4 files changed, 13 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 87625912..b5531998 100644
--- a/README.md
+++ b/README.md
@@ -684,7 +684,7 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
     Python Version
-    3.10-3.13
+    3.10-3.13<sup>7</sup>
@@ -694,7 +694,8 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
 <sup>3</sup> MLX requires macOS 13.5 or later, recommended for use with version 14.0 or higher.
 <sup>4</sup> Windows vLLM support via WSL2 (Windows Subsystem for Linux).
 <sup>5</sup> Windows LMDeploy can only use the `turbomind` backend, which is slightly slower than the `pytorch` backend. If performance is critical, it is recommended to run it via WSL2.
-<sup>6</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
+<sup>6</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
+<sup>7</sup> Windows + LMDeploy only supports Python versions 3.10–3.12, as the critical dependency `ray` does not yet support Python 3.13 on Windows.
 
 ### Install MinerU

diff --git a/README_zh-CN.md b/README_zh-CN.md
index cecc891c..b0a492a1 100644
--- a/README_zh-CN.md
+++ b/README_zh-CN.md
@@ -671,7 +671,7 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
     python版本
-    3.10-3.13
+    3.10-3.13<sup>7</sup>
@@ -681,7 +681,8 @@ https://github.com/user-attachments/assets/4bea02c9-6d54-4cd6-97ed-dff14340982c
 <sup>3</sup> MLX需macOS 13.5及以上版本支持,推荐14.0以上版本使用
 <sup>4</sup> Windows vLLM通过WSL2(适用于 Linux 的 Windows 子系统)实现支持
 <sup>5</sup> Windows LMDeploy只能使用`turbomind`后端,速度比`pytorch`后端稍慢,如对速度有要求建议通过WSL2运行
-<sup>6</sup> 兼容OpenAI API的服务器,如通过`vLLM`/`SGLang`/`LMDeploy`等推理框架部署的本地模型服务器或远程模型服务
+<sup>6</sup> 兼容OpenAI API的服务器,如通过`vLLM`/`SGLang`/`LMDeploy`等推理框架部署的本地模型服务器或远程模型服务
+<sup>7</sup> Windows + LMDeploy 由于关键依赖`ray`未能在Windows平台支持Python 3.13,故仅支持至3.10~3.12版本
 
 > [!TIP]
 > 除以上主流环境与平台外,我们也收录了一些社区用户反馈的其他平台支持情况,详情请参考[其他加速卡适配](https://opendatalab.github.io/MinerU/zh/usage/)。

diff --git a/docs/en/quick_start/index.md b/docs/en/quick_start/index.md
index 0be3b2ae..c2a94399 100644
--- a/docs/en/quick_start/index.md
+++ b/docs/en/quick_start/index.md
@@ -83,7 +83,7 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
     Python Version
-    3.10-3.13
+    3.10-3.13<sup>7</sup>
@@ -93,7 +93,9 @@ A WebUI developed based on Gradio, with a simple interface and only core parsing
 <sup>3</sup> MLX requires macOS 13.5 or later, recommended for use with version 14.0 or higher.
 <sup>4</sup> Windows vLLM support via WSL2 (Windows Subsystem for Linux).
 <sup>5</sup> Windows LMDeploy can only use the `turbomind` backend, which is slightly slower than the `pytorch` backend. If performance is critical, it is recommended to run it via WSL2.
-<sup>6</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
+<sup>6</sup> Servers compatible with the OpenAI API, such as local or remote model services deployed via inference frameworks like `vLLM`, `SGLang`, or `LMDeploy`.
+<sup>7</sup> Windows + LMDeploy only supports Python versions 3.10–3.12, as the critical dependency `ray` does not yet support Python 3.13 on Windows.
+
 ### Install MinerU

diff --git a/docs/zh/quick_start/index.md b/docs/zh/quick_start/index.md
index 657184a1..b56c0635 100644
--- a/docs/zh/quick_start/index.md
+++ b/docs/zh/quick_start/index.md
@@ -83,7 +83,7 @@
     python版本
-    3.10-3.13
+    3.10-3.13<sup>7</sup>
@@ -94,6 +94,8 @@
 <sup>4</sup> Windows vLLM通过WSL2(适用于 Linux 的 Windows 子系统)实现支持
 <sup>5</sup> Windows LMDeploy只能使用`turbomind`后端,速度比`pytorch`后端稍慢,如对速度有要求建议通过WSL2运行
 <sup>6</sup> 兼容OpenAI API的服务器,如通过`vLLM`/`SGLang`/`LMDeploy`等推理框架部署的本地模型服务器或远程模型服务
+<sup>7</sup> Windows + LMDeploy 由于关键依赖`ray`未能在Windows平台支持Python 3.13,故仅支持至3.10~3.12版本
+
 > [!TIP]
 > 除以上主流环境与平台外,我们也收录了一些社区用户反馈的其他平台支持情况,详情请参考[其他加速卡适配](https://opendatalab.github.io/MinerU/zh/usage/)。