librechat.ai/content/docs/configuration/librechat_yaml/ai_endpoints/openrouter.mdx
Marco Beretta c8dfc77b19 docs: rewrite high-traffic pages as actionable step-by-step guides (#530), 2026-03-20 00:23:49 +01:00

---
title: OpenRouter
icon: OpenRouter
description: Complete setup guide for using OpenRouter as a custom endpoint in LibreChat
---
[OpenRouter](https://openrouter.ai/) provides access to hundreds of AI models through a single API. This guide walks you through setting up OpenRouter as a custom endpoint in LibreChat from scratch.
<Callout type="info" title="Prerequisites">
Before starting, make sure you have:
- LibreChat installed and running (see [Docker setup](/docs/local/docker))
- A `librechat.yaml` file created and mounted (see [librechat.yaml guide](/docs/configuration/librechat_yaml))
</Callout>
## Setup
<Steps>
<Step>
### Get an API Key
Create an account at [openrouter.ai](https://openrouter.ai/) and generate an API key from the [Keys page](https://openrouter.ai/keys).
Copy the key -- it starts with `sk-or-v1-`.
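Optionally, you can confirm the key works before touching LibreChat. A minimal sketch, assuming OpenRouter's key-info endpoint (`GET /api/v1/auth/key`, as documented by OpenRouter at the time of writing):
```bash
# Substitute your actual key. A valid key returns a JSON object
# describing its limits; a 401 means the key was copied incorrectly.
curl -s https://openrouter.ai/api/v1/auth/key \
  -H "Authorization: Bearer sk-or-v1-your-key-here"
```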
</Step>
<Step>
### Add the Key to Your .env File
Open your `.env` file in the project root and add your OpenRouter API key:
```bash filename=".env"
OPENROUTER_KEY=sk-or-v1-your-key-here
```
<Callout type="error" title="Use OPENROUTER_KEY, Not OPENROUTER_API_KEY">
You must use `OPENROUTER_KEY` as the variable name. Using `OPENROUTER_API_KEY` will override the built-in OpenAI endpoint to use OpenRouter as well, which is almost certainly not what you want.
</Callout>
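A quick way to catch the naming mistake above is to grep `.env` for the expected variable name. This is just a sanity-check sketch, assuming `.env` sits in the directory you run it from:
```bash
# Prints the OPENROUTER_KEY line if present; otherwise reports the problem
# (e.g. the variable is missing or was named OPENROUTER_API_KEY instead).
grep -E '^OPENROUTER_KEY=' .env || echo "OPENROUTER_KEY missing or misnamed"
```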
</Step>
<Step>
### Add the Endpoint to librechat.yaml
Add the following to your `librechat.yaml` file. If the file already has content, merge the `endpoints` section with your existing configuration:
```yaml filename="librechat.yaml"
version: 1.3.5
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"
```
Key fields explained:
| Field | Purpose |
|-------|---------|
| `apiKey: "${OPENROUTER_KEY}"` | References the env var from Step 2. The `${}` syntax tells LibreChat to read the value from `.env`. |
| `models.fetch: true` | Fetches the full model list from OpenRouter's API, so new models appear automatically. |
| `dropParams: ["stop"]` | Removes the `stop` parameter from requests. OpenRouter models use varied stop tokens, so dropping this avoids compatibility issues. |
| `modelDisplayLabel: "OpenRouter"` | The name shown in LibreChat's endpoint selector. |
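If your `librechat.yaml` already defines other custom endpoints, append the new entry to the same `custom:` list rather than adding a second `endpoints` key. A sketch, using a hypothetical existing "Groq" endpoint for illustration:
```yaml
endpoints:
  custom:
    - name: "Groq"            # existing endpoint (example only)
      # ...existing fields...
    - name: "OpenRouter"      # new entry appended to the same list
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      # ...fields from the snippet above...
```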
</Step>
<Step>
### Restart LibreChat
<Tabs items={['Docker', 'Local']}>
<Tabs.Tab>
```bash
docker compose down && docker compose up -d
```
</Tabs.Tab>
<Tabs.Tab>
Stop the running process (Ctrl+C) and restart:
```bash
npm run backend
```
</Tabs.Tab>
</Tabs>
</Step>
<Step>
### Verify It Works
Open LibreChat in your browser. You should see **OpenRouter** in the endpoint selector dropdown. Select it to see the available models.
If OpenRouter does not appear, check the server logs for configuration errors:
```bash
docker compose logs api | grep -iE "error|openrouter"
```
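If the endpoint appears but chats fail, it helps to isolate whether the problem is the key or the LibreChat config. A sketch that calls OpenRouter's OpenAI-compatible chat completions API directly, bypassing LibreChat (run on the host with `OPENROUTER_KEY` exported):
```bash
# A JSON reply containing a "choices" array means the key and model work,
# which points the problem back at the LibreChat configuration.
curl -s https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "meta-llama/llama-3-70b-instruct", "messages": [{"role": "user", "content": "ping"}]}'
```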
</Step>
</Steps>
## Customization
### Using a `user_provided` API Key
Instead of storing the key in `.env`, you can let each user provide their own key through the LibreChat UI:
```yaml
apiKey: "user_provided"
```
Users will see a key input field when selecting the OpenRouter endpoint.
### Limiting Available Models
Instead of fetching all models, you can specify a fixed list:
```yaml
models:
  default: ["anthropic/claude-3.5-sonnet", "openai/gpt-4o", "meta-llama/llama-3-70b-instruct"]
  fetch: false
```
## Reference
<Cards num={2}>
<Cards.Card title="Custom Endpoint Fields" href="/docs/configuration/librechat_yaml/object_structure/custom_endpoint" arrow>
All available fields for custom endpoint configuration
</Cards.Card>
<Cards.Card title="Full Example Config" href="/docs/configuration/librechat_yaml/example" arrow>
Complete annotated librechat.yaml with multiple endpoints
</Cards.Card>
</Cards>