---
title: "Custom Endpoints"
icon: Plug
description: Add custom AI providers like OpenRouter and Ollama to LibreChat using librechat.yaml
---
LibreChat supports any OpenAI API-compatible service as a custom endpoint. You configure endpoints in `librechat.yaml`, store API keys in `.env`, and mount the config via `docker-compose.override.yml` for Docker deployments.
Custom endpoint setup involves three files, each with a specific role:
1. **`librechat.yaml`** -- Defines your custom endpoints (name, API URL, models, display settings)
2. **`.env`** -- Stores sensitive values like API keys (referenced from librechat.yaml using `${VAR_NAME}` syntax)
3. **`docker-compose.override.yml`** -- Mounts `librechat.yaml` into the Docker container (Docker users only)
For a full overview of how these files work together, see the [Configuration Overview](/docs/configuration).
This guide assumes you have LibreChat installed and running. If not, complete the [Docker setup](/docs/local/docker) first.
## Step 1. Mount librechat.yaml (Docker Only)
Docker users need to mount `librechat.yaml` as a volume so the container can read it. Skip this step if you are running LibreChat locally without Docker. First, copy the example override file:
```bash
cp docker-compose.override.yml.example docker-compose.override.yml
```
Edit `docker-compose.override.yml` and ensure the volume mount is uncommented:
```yaml filename="docker-compose.override.yml"
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```
Learn more: [Docker Override Guide](/docs/configuration/docker_override)
## Step 2. Configure librechat.yaml
Create a `librechat.yaml` file in the project root (if it does not exist) and add your endpoint configuration. See the [librechat.yaml guide](/docs/configuration/librechat_yaml) for detailed setup instructions.
Here is an example with **OpenRouter** and **Ollama**:
```yaml filename="librechat.yaml"
version: 1.3.5
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["llama3:latest", "command-r", "mixtral", "phi3"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
```
Browse all compatible providers in the [AI Endpoints](/docs/configuration/librechat_yaml/ai_endpoints) section. For the full field reference, see [Custom Endpoint Object Structure](/docs/configuration/librechat_yaml/object_structure/custom_endpoint).
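At minimum, a custom endpoint needs a `name`, an `apiKey` setting, a `baseURL`, and a `models` list; the remaining fields in the example above (title generation, `dropParams`, display label) are optional refinements. A minimal entry looks like this (the provider name, URL, and model ID here are placeholders):

```yaml filename="librechat.yaml"
endpoints:
  custom:
    - name: "MyProvider"                     # placeholder display name
      apiKey: "${MY_PROVIDER_KEY}"           # placeholder env var
      baseURL: "https://api.example.com/v1"  # placeholder API URL
      models:
        default: ["my-model"]                # placeholder model ID
```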
When configuring API keys in custom endpoints, you have three options:
1. **Environment variable** (recommended): `apiKey: "${OPENROUTER_KEY}"` -- reads from `.env`
2. **User provided**: `apiKey: "user_provided"` -- users enter their own key in the UI
3. **Direct value** (not recommended): `apiKey: "sk-your-actual-key"` -- stored in plain text
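For example, to let each user supply their own key in the UI instead of reading one from `.env`:

```yaml filename="librechat.yaml"
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "user_provided"   # each user is prompted for a key in the UI
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
```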
## Step 3. Set Environment Variables
Add the API keys referenced in your `librechat.yaml` to the `.env` file:
```bash filename=".env"
OPENROUTER_KEY=your_openrouter_api_key
```
Each `${VARIABLE_NAME}` in librechat.yaml must have a matching entry in `.env`.
## Step 4. Restart and Verify
After editing configuration files, restart LibreChat for the changes to take effect.

**Docker:**

```bash
docker compose down && docker compose up -d
```

**Local (without Docker):** stop the running process (Ctrl+C) and restart:

```bash
npm run backend
```
Open LibreChat in your browser. Your custom endpoints should appear in the endpoint selector dropdown.
Check the server logs for configuration errors:
```bash
docker compose logs api
```
Common issues: YAML syntax errors, missing env vars, or `librechat.yaml` not mounted in Docker. Validate your YAML with the [YAML Validator](/toolkit/yaml_checker).
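For a quick programmatic sanity check before restarting, the sketch below catches two of those issues: tab-indented lines (YAML forbids tabs in indentation) and `${VAR}` references with no matching `.env` entry. This is an illustrative helper, not part of LibreChat's tooling:

```python
import re

def quick_lint(yaml_text: str, env_vars: set) -> list:
    """Flag tab indentation and ${VAR} references missing from env_vars."""
    problems = []
    for lineno, line in enumerate(yaml_text.splitlines(), 1):
        if line.startswith("\t"):
            problems.append(f"line {lineno}: tab indentation (YAML requires spaces)")
        # Collect every ${NAME} reference and check it against the known env vars
        for var in re.findall(r"\$\{(\w+)\}", line):
            if var not in env_vars:
                problems.append(f"line {lineno}: ${{{var}}} has no matching .env entry")
    return problems

# Example: OPENROUTER_KEY is referenced but absent from the env set
print(quick_lint('apiKey: "${OPENROUTER_KEY}"', set()))
```

Run it against the contents of your `librechat.yaml` and the variable names parsed from `.env`; an empty list means neither check found a problem.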
## Next Steps
- [AI Endpoints](/docs/configuration/librechat_yaml/ai_endpoints) -- Browse all compatible AI providers with example configurations
- [librechat.yaml Guide](/docs/configuration/librechat_yaml) -- Full setup guide and reference for the config file