🦙 docs: ollama.mdx (#278)

The baseURL option should not include the /chat/completions path, otherwise Ollama models will not work with Agents.
This commit is contained in:
Odrec
2025-05-08 20:41:26 +02:00
committed by GitHub
parent c42a41810e
commit f7dc16f095


```diff
@@ -20,7 +20,7 @@ description: Example configuration for Ollama
 - name: "Ollama"
   apiKey: "ollama"
   # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
-  baseURL: "http://localhost:11434/v1/chat/completions"
+  baseURL: "http://localhost:11434/v1/"
   models:
     default: [
       "llama2",
```
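The reason for the fix can be sketched in a few lines: OpenAI-compatible clients typically append the endpoint path (e.g. `/chat/completions`) to the configured baseURL themselves, so including it in the config doubles the path. A minimal illustration, assuming the client joins the URL as the OpenAI SDK does (the `chat_completions_url` helper is hypothetical, for demonstration only):

```python
def chat_completions_url(base_url: str) -> str:
    """Join a configured baseURL with the chat-completions endpoint path,
    the way an OpenAI-compatible client typically does."""
    return base_url.rstrip("/") + "/chat/completions"

# Misconfigured: the path segment ends up duplicated.
print(chat_completions_url("http://localhost:11434/v1/chat/completions"))
# → http://localhost:11434/v1/chat/completions/chat/completions

# Correct: the client appends the endpoint path itself.
print(chat_completions_url("http://localhost:11434/v1/"))
# → http://localhost:11434/v1/chat/completions
```

This is why the corrected baseURL stops at `/v1/`.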