---
title: "Custom Endpoints"
icon: Plug
description: Add custom AI providers like OpenRouter and Ollama to LibreChat using librechat.yaml
---
LibreChat supports any OpenAI API-compatible service as a custom endpoint. You configure endpoints in `librechat.yaml`, store API keys in `.env`, and mount the config via `docker-compose.override.yml` for Docker deployments.
<Callout type="info" title="Which File Does What?">
Custom endpoint setup involves three files, each with a specific role:
1. **`librechat.yaml`** -- Defines your custom endpoints (name, API URL, models, display settings)
2. **`.env`** -- Stores sensitive values like API keys (referenced from `librechat.yaml` using `${VAR_NAME}` syntax)
3. **`docker-compose.override.yml`** -- Mounts `librechat.yaml` into the Docker container (Docker users only)
For a full overview of how these files work together, see the [Configuration Overview](/docs/configuration).
</Callout>
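As a minimal illustration of the `${VAR_NAME}` pattern, here is a hypothetical endpoint fragment (the provider name and variable name are examples, not real values):

```yaml
# librechat.yaml -- the ${...} placeholder is resolved from .env at startup
endpoints:
  custom:
    - name: "MyProvider"            # hypothetical provider name
      apiKey: "${MY_PROVIDER_KEY}"  # expects MY_PROVIDER_KEY=... in .env
      baseURL: "https://api.example.com/v1"
```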
<Callout type="warning" title="Before You Start">
This guide assumes you have LibreChat installed and running. If not, complete the [Docker setup](/docs/local/docker) first.
</Callout>
## Step 1. Mount librechat.yaml (Docker Only)
Docker users need to mount `librechat.yaml` as a volume so the container can read it. Skip this step if you are running LibreChat locally without Docker.
```bash
cp docker-compose.override.yml.example docker-compose.override.yml
```
Edit `docker-compose.override.yml` and ensure the volume mount is uncommented:
```yaml filename="docker-compose.override.yml"
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```
Learn more: [Docker Override Guide](/docs/configuration/docker_override)
## Step 2. Configure librechat.yaml
Create a `librechat.yaml` file in the project root (if it does not exist) and add your endpoint configuration. See the [librechat.yaml guide](/docs/configuration/librechat_yaml) for detailed setup instructions.
Here is an example with **OpenRouter** and **Ollama**:
```yaml filename="librechat.yaml"
version: 1.3.5
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["llama3:latest", "command-r", "mixtral", "phi3"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
```
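Before wiring a provider into `librechat.yaml`, it can help to confirm that the `baseURL` really serves the OpenAI-compatible API. A stdlib-only sketch (the helper names here are ours, not part of LibreChat; OpenAI-compatible services expose model listings at `<baseURL>/models`):

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    """Build the OpenAI-compatible model-listing URL from a baseURL."""
    return base_url.rstrip("/") + "/models"

def list_models(base_url: str, api_key: str) -> list:
    """Fetch model IDs from an OpenAI-compatible /models endpoint."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": "Bearer " + api_key},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return [m["id"] for m in json.load(resp).get("data", [])]

# Usage (with a real key):
# list_models("https://openrouter.ai/api/v1", "<your OpenRouter key>")
```

If this call fails, the problem is with the provider URL or key, not with your LibreChat configuration.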
Browse all compatible providers in the [AI Endpoints](/docs/configuration/librechat_yaml/ai_endpoints) section. For the full field reference, see [Custom Endpoint Object Structure](/docs/configuration/librechat_yaml/object_structure/custom_endpoint).
<Callout type="warning" title="API Key Configuration">
When configuring API keys in custom endpoints, you have three options:
1. **Environment variable** (recommended): `apiKey: "${OPENROUTER_KEY}"` -- reads from `.env`
2. **User provided**: `apiKey: "user_provided"` -- users enter their own key in the UI
3. **Direct value** (not recommended): `apiKey: "sk-your-actual-key"` -- stored in plain text
</Callout>
## Step 3. Set Environment Variables
Add the API keys referenced in your `librechat.yaml` to the `.env` file:
```bash filename=".env"
OPENROUTER_KEY=your_openrouter_api_key
```
Each `${VARIABLE_NAME}` placeholder in `librechat.yaml` must have a matching entry in `.env`.
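That matching rule can be checked mechanically. A stdlib-only sketch (the function name is ours; it treats `.env` as simple `KEY=value` lines and ignores comments):

```python
import re

def missing_env_vars(yaml_text: str, env_text: str) -> set:
    """Return ${VAR} names referenced in librechat.yaml with no entry in .env."""
    referenced = set(re.findall(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}", yaml_text))
    defined = {
        line.split("=", 1)[0].strip()
        for line in env_text.splitlines()
        if "=" in line and not line.lstrip().startswith("#")
    }
    return referenced - defined

# Usage:
# with open("librechat.yaml") as y, open(".env") as e:
#     print(missing_env_vars(y.read(), e.read()))  # empty set means all keys are defined
```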
## Step 4. Restart and Verify
After editing configuration files, you must restart LibreChat for changes to take effect.
<Tabs items={['Docker', 'Local']}>
<Tabs.Tab>
```bash
docker compose down && docker compose up -d
```
</Tabs.Tab>
<Tabs.Tab>
Stop the running process (Ctrl+C) and restart:
```bash
npm run backend
```
</Tabs.Tab>
</Tabs>
Open LibreChat in your browser. Your custom endpoints should appear in the endpoint selector dropdown.
<Callout type="info" title="Not Seeing Your Endpoint?">
Check the server logs for configuration errors:
```bash
docker compose logs api
```
Common issues: YAML syntax errors, missing env vars, or `librechat.yaml` not mounted in Docker. Validate your YAML with the [YAML Validator](/toolkit/yaml_checker).
</Callout>
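One YAML pitfall worth checking for directly: the YAML spec forbids tab characters in indentation, and a stray tab is a common cause of parse failures. A quick stdlib-only check (a convenience sketch, not a substitute for a real YAML parser):

```python
def find_tab_indents(yaml_text: str) -> list:
    """Return (line_number, line) pairs whose indentation contains a tab."""
    bad = []
    for num, line in enumerate(yaml_text.splitlines(), start=1):
        indent = line[: len(line) - len(line.lstrip())]
        if "\t" in indent:
            bad.append((num, line))
    return bad

# Usage:
# with open("librechat.yaml") as f:
#     print(find_tab_indents(f.read()))  # empty list means no tab-indented lines
```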
## Next Steps
<Cards num={2}>
<Cards.Card title="AI Endpoints" href="/docs/configuration/librechat_yaml/ai_endpoints" arrow>
Browse all compatible AI providers with example configurations
</Cards.Card>
<Cards.Card title="librechat.yaml Guide" href="/docs/configuration/librechat_yaml" arrow>
Full setup guide and reference for the config file
</Cards.Card>
</Cards>