mirror of
https://github.com/open-webui/docs.git
synced 2026-03-26 13:18:42 +07:00
@@ -31,7 +31,7 @@ You can use:
- **Google Gemini** (via their [OpenAI-compatible endpoint](https://generativelanguage.googleapis.com/v1beta/openai/))
- **DeepSeek** (https://platform.deepseek.com/)
- **MiniMax** (https://platform.minimax.io/)
-- **Proxies & Aggregators**: OpenRouter, LiteLLM, Helicone.
+- **Proxies & Aggregators**: OpenRouter, LiteLLM, Helicone, Vercel AI Gateway.
- **Local Servers**: Ollama, Llama.cpp, LM Studio, vLLM, LocalAI.
---
@@ -50,7 +50,7 @@ Once Open WebUI is running:
<Tabs>
<TabItem value="standard" label="Standard / Compatible" default>
-Use this for **OpenAI**, **DeepSeek**, **MiniMax**, **OpenRouter**, **LocalAI**, **FastChat**, **Helicone**, **LiteLLM**, etc.
+Use this for **OpenAI**, **DeepSeek**, **MiniMax**, **OpenRouter**, **LocalAI**, **FastChat**, **Helicone**, **LiteLLM**, **Vercel AI Gateway**, etc.

* **Connection Type**: External
* **URL**: `https://api.openai.com/v1` (or your provider's endpoint)
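
A "Standard / Compatible" connection boils down to a base URL plus a bearer key. As a rough sketch (hypothetical model name and key; the request shape follows the OpenAI chat-completions convention these providers emulate), this is the kind of request such a connection issues:

```python
import json

# Hypothetical example values; substitute your provider's endpoint and key.
base_url = "https://api.openai.com/v1"
api_key = "YOUR_API_KEY"

# OpenAI-compatible providers accept requests like
# POST {base_url}/chat/completions with a Bearer token.
url = f"{base_url}/chat/completions"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o-mini",  # hypothetical; use any model your provider serves
    "messages": [{"role": "user", "content": "Hello!"}],
}

print(url)
print(json.dumps(payload, indent=2))
```

Swapping providers only changes `base_url`, the key, and the model name; the payload stays the same.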
@@ -56,7 +56,7 @@ Now, link MiniMax to your Open WebUI instance.
3. Enter the following details:
- **API Base URL**: `https://api.minimax.io/v1`
- **API Key**: `YOUR_CODING_PLAN_API_KEY`
-4. **Important**: MiniMax does not expose models via a `/models` endpoint, so you must whitelist the model manually.
+4. **Important**: MiniMax does not expose models via a `/models` endpoint, so you must whitelist the model manually. Because of this, the **Verify Connection** button will also return an "OpenAI: Network Problem" error until the model is whitelisted.
5. In the **Model Whitelist**, type `MiniMax-M2.1` and click the **+** icon.
6. Click **Verify Connection** (you should see a success alert).
7. Click **Save** on the connection popup, then scroll down and click **Save** on the main Connections page.
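
Since MiniMax lacks a `/models` endpoint, you can also sanity-check the saved connection yourself. A minimal sketch (hypothetical prompt; the model name must match the whitelist entry exactly) of the OpenAI-style request this connection ultimately issues:

```python
import json

# Hypothetical sketch of the request Open WebUI sends once the connection
# and whitelist are saved; substitute a real key before actually sending it
# (e.g. with urllib.request or curl).
base_url = "https://api.minimax.io/v1"
api_key = "YOUR_CODING_PLAN_API_KEY"

body = {
    "model": "MiniMax-M2.1",  # must match the whitelisted model name exactly
    "messages": [{"role": "user", "content": "Say hello"}],
}
print(f"POST {base_url}/chat/completions")
print(json.dumps(body, indent=2))
```

If a manually sent request like this succeeds while chats in the UI fail, double-check that the whitelist entry matches the `model` string character for character.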
@@ -77,3 +77,4 @@ You are now ready to use MiniMax M2.1!
---
Enjoy using one of the best and most affordable coding-focused models! 🚀