---
title: "Custom Endpoints"
icon: Plug
---
import AdditionalLinks from '@/components/repeated/AdditionalLinks.mdx';
LibreChat supports OpenAI API-compatible services through the `librechat.yaml` configuration file.
This guide assumes you have already set up LibreChat using Docker, as shown in the **[Local Setup Guide](/docs/quick_start/local_setup).**
LibreChat uses several configuration files, each with specific purposes:
1. **librechat.yaml** - Used for custom endpoints configuration and other application settings
2. **.env file** - Used for server configuration, pre-configured endpoint API keys, and authentication settings
3. **docker-compose.override.yml** - Used for Docker-specific configurations and mounting volumes
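For reference, after completing the steps below your project root should contain at least the following files (a typical layout; your checkout may contain more):

```
./
├── docker-compose.yml          # base compose file shipped with LibreChat
├── docker-compose.override.yml # your local Docker customizations (Step 1)
├── librechat.yaml              # custom endpoints configuration (Step 2)
└── .env                        # server settings and API keys (Step 3)
```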
## Step 1. Create or Edit a Docker Override File
- Create a file named `docker-compose.override.yml` at the project root (if it doesn't already exist).
- Add the following content to the file:
```yaml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```
> Learn more about the [Docker Compose Override File here](/docs/configuration/docker_override).
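After saving the file, you can check that Compose merges the override as intended by rendering the combined configuration (assuming Docker Compose v2):

```bash
docker compose config
```

The merged output should list the `./librechat.yaml` bind mount under the `api` service's volumes.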
## Step 2. Configure `librechat.yaml`
- **Create a file named `librechat.yaml`** at the project root (if it doesn't already exist).
- **Add your custom endpoints:** you can view compatible endpoints in the [AI Endpoints section](/docs/configuration/librechat_yaml/ai_endpoints).
- The list is not exhaustive and generally every OpenAI API-compatible service should work.
- There are many options for Custom Endpoints. View them all here: [Custom Endpoint Object Structure](/docs/configuration/librechat_yaml/object_structure/custom_endpoint).
- As an example, here is a configuration for both **OpenRouter** and **Ollama**:
```yaml
version: 1.3.4
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["gpt-3.5-turbo"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: [
          "llama3:latest",
          "command-r",
          "mixtral",
          "phi3"
        ]
        fetch: false # fetching the list of models is not supported
      titleConvo: true
      titleModel: "current_model"
```
When configuring API keys in custom endpoints, you have three options:
1. **Environment Variable Reference**: Use `${VARIABLE_NAME}` syntax to reference a variable from your .env file (recommended for security)
```yaml
apiKey: "${OPENROUTER_KEY}"
```
2. **User Provided**: Set to `"user_provided"` (without the ${} syntax) to allow users to enter their own API key through the web interface
```yaml
apiKey: "user_provided"
```
3. **Direct Value**: Directly include the API key in the configuration file (not recommended for security reasons)
```yaml
apiKey: "your-actual-api-key"
```
This is different from pre-configured endpoints in the .env file where you would set `ENDPOINT_KEY=user_provided` (e.g., `OPENAI_API_KEY=user_provided`).
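As a fuller illustration of option 2, here is a hypothetical endpoint entry where each user enters their own key through the web interface (the Groq values are only an example; substitute any OpenAI API-compatible service and model names):

```yaml
endpoints:
  custom:
    - name: "Groq"
      apiKey: "user_provided"
      baseURL: "https://api.groq.com/openai/v1/"
      models:
        default: ["llama-3.1-8b-instant"]
        fetch: false
      titleConvo: true
      titleModel: "current_model"
```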
## Step 3. Configure .env File
- **Edit your existing `.env` file** at the project root.
  - If it doesn't already exist, copy `.env.example` and rename the copy to `.env`.
- According to the config above, the environment variable `OPENROUTER_KEY` is expected and should be set:
```bash
OPENROUTER_KEY=your_openrouter_api_key
```
**Notes:**
- By way of example, this guide assumes you have set up Ollama independently and that it is accessible at `http://host.docker.internal:11434`
- "host.docker.internal" is a special DNS name that resolves to the internal IP address used by the host.
- You may need to change this to the actual IP address of your Ollama instance.
- In a future guide, we will go into setting up Ollama along with LibreChat.
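One caveat for Linux users: `host.docker.internal` is not defined by default on Docker Engine for Linux. A common workaround (assuming Docker Engine 20.10 or later) is to map it to the host gateway in your `docker-compose.override.yml`:

```yaml
services:
  api:
    extra_hosts:
      # Maps host.docker.internal to the host's gateway IP inside the container
      - "host.docker.internal:host-gateway"
```

You can also verify that Ollama is reachable from the host with `curl http://localhost:11434/api/tags`, which lists the locally installed models.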
## Step 4. Run the App
- Now that your files are configured, you can run the app:
```bash
docker compose up
```
Or, if the app was already running, restart it so the updated configuration is loaded:
```bash
docker compose restart
```
> Note: Make sure Docker Desktop or the Docker Engine is running before executing these commands. Also note that `docker compose restart` does not apply changes made to `docker-compose.override.yml` itself; if you just added the volume mount, run `docker compose up -d` to recreate the containers.
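If the new endpoints don't appear in the UI after startup, the server logs are the first place to look (assuming the service is named `api`, as in the override above):

```bash
docker compose logs api
```

Look for a line indicating that the custom config file was loaded, or for YAML parsing errors pointing at `librechat.yaml`.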
## Conclusion
**That's it!** You have now configured **Custom Endpoints** for your LibreChat instance.