mirror of
https://github.com/n8n-io/n8n-docs.git
synced 2026-03-27 09:28:43 +07:00
Docs: updated outdated references to Ollama docs (#4210)
@@ -38,7 +38,7 @@ The default **Base URL** is `http://localhost:11434`, but if you've set the `OLL
If you're connecting to Ollama through authenticated proxy services (such as [Open WebUI](https://docs.openwebui.com/getting-started/api-endpoints/#-ollama-api-proxy-support)) you must include an API key. If you don't need authentication, leave this field empty. When provided, the API key is sent as a Bearer token in the `Authorization` header of the request to the Ollama API.
-Refer to [How do I configure Ollama server?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server) for more information.
+Refer to [How do I configure Ollama server?](https://github.com/ollama/ollama/blob/main/docs/faq.mdx#how-do-i-configure-ollama-server) for more information.
### Ollama and self-hosted n8n
@@ -46,4 +46,4 @@ If you're self-hosting n8n on the same machine as Ollama, you may run into issue
For this setup, open a specific port for n8n to communicate with Ollama by setting the `OLLAMA_ORIGINS` variable or adjusting `OLLAMA_HOST` to an address the other container can access.
-Refer to Ollama's [How can I allow additional web origins to access Ollama?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama) for more information.
+Refer to Ollama's [How can I allow additional web origins to access Ollama?](https://github.com/ollama/ollama/blob/main/docs/faq.mdx#how-can-i-allow-additional-web-origins-to-access-ollama) for more information.
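The Bearer-token behavior described in the first changed section can be sketched with curl. This is a sketch only: the proxy URL and API key below are placeholders, not values from the docs, and `/api/tags` is used simply as a convenient Ollama endpoint to test against.

```shell
# Sketch: query an Ollama API behind an authenticated proxy such as Open WebUI.
# Both values below are hypothetical placeholders; replace them with your own.
OLLAMA_PROXY_URL="https://your-proxy.example.com/ollama"   # hypothetical
OLLAMA_API_KEY="sk-your-key-here"                          # hypothetical

# When an API key is set in the credential, n8n sends it as a Bearer token
# in the Authorization header, equivalent to:
curl -H "Authorization: Bearer ${OLLAMA_API_KEY}" "${OLLAMA_PROXY_URL}/api/tags"
```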
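The `OLLAMA_ORIGINS`/`OLLAMA_HOST` setup from the second changed section might look like the following. This is a minimal sketch assuming a Docker setup where the n8n container reaches Ollama by the hostname `n8n` on port `5678`; those names are assumptions, not values from the docs.

```shell
# Sketch only: expose Ollama to another container on the same Docker network.
# Bind the server to all interfaces instead of localhost so other
# containers can reach it on port 11434.
export OLLAMA_HOST=0.0.0.0:11434
# Allow requests originating from the n8n container (hypothetical host/port).
export OLLAMA_ORIGINS="http://n8n:5678"
ollama serve
```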