---
sidebar_position: 3
title: "🕸️ Network Diagrams"
---
Here, we provide clear and structured diagrams to help you understand how the various components of the network interact within different setups. This documentation covers both macOS/Windows and Linux users. Each scenario is illustrated with a Mermaid diagram showing how the components interact under different system configurations and deployment strategies.
## macOS/Windows Setup Options 🖥️
### Ollama on Host, Open WebUI in Container
In this scenario, `Ollama` runs directly on the host machine while `Open WebUI` operates within a Docker container.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - macOS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
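For reference, a minimal `docker run` sketch matching this layout, assuming the standard `ghcr.io/open-webui/open-webui:main` image and the `OLLAMA_BASE_URL` environment variable (adjust names and tags to your deployment):
```bash
# Ollama already runs on the host and listens on port 11434.
# host.docker.internal resolves to the host from inside Docker Desktop's VM.
docker run -d \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
The UI is then reachable at http://localhost:3000 on the host.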
### Ollama and Open WebUI in Compose Stack
Both `Ollama` and `Open WebUI` are configured within the same Docker Compose stack, simplifying network communications.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - macOS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Boundary(b2, "Compose Stack") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
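A minimal compose sketch for this stack (service names, volume names, and image tags are illustrative). The service name `ollama` doubles as the hostname Open WebUI uses on the stack's default network:
```bash
# Write a minimal docker-compose.yml and bring the stack up.
cat > docker-compose.yml <<'EOF'
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
EOF
docker compose up -d
```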
### Ollama and Open WebUI, Separate Networks
Here, `Ollama` and `Open WebUI` are deployed on separate Docker networks. Containers on different networks cannot reach each other by default, so Open WebUI cannot connect to Ollama.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - macOS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Boundary(b2, "Network A") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
    Boundary(b3, "Network B") {
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Unable to connect")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
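A sketch of how this situation arises and one way out of it, using hypothetical network names; `docker network connect` attaches a running container to an additional network so Docker's embedded DNS can resolve it by name:
```bash
# Two user-defined networks; each container lands on a different one.
docker network create net-a
docker network create net-b
docker run -d --network net-b --name ollama ollama/ollama
docker run -d --network net-a -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://ollama:11434 \
  --name open-webui ghcr.io/open-webui/open-webui:main

# Open WebUI cannot resolve "ollama" here. Attaching the Ollama container
# to Network A restores name resolution and connectivity:
docker network connect net-a ollama
```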
### Open WebUI in Host Network
In this configuration, `Open WebUI` uses Docker's host network. On macOS/Windows the "host" is Docker Desktop's Linux VM rather than your machine, so the UI is not reachable from the host's browser.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - macOS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
}
Rel(user, openwebui, "Unable to connect, host network is the VM's network")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
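A sketch of the failing command and a working alternative on Docker Desktop, assuming the standard Open WebUI image and an illustrative port mapping:
```bash
# On macOS/Windows, --network host attaches to Docker Desktop's Linux VM,
# so the UI is not reachable at http://localhost:8080 from the host browser.
docker run -d --network host ghcr.io/open-webui/open-webui:main

# Publish the port explicitly instead:
docker run -d -p 3000:8080 ghcr.io/open-webui/open-webui:main
```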
## Linux Setup Options 🐧
### Ollama on Host, Open WebUI in Container (Linux)
This diagram is specific to the Linux platform, with `Ollama` running on the host and `Open WebUI` deployed inside a Docker container.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b1, "Container Network") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
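On Linux, `host.docker.internal` is not defined by default, so a sketch like the following maps it to the host gateway (requires Docker 20.10 or newer; image name and port mapping as in the earlier examples):
```bash
# --add-host maps host.docker.internal to the host's gateway IP,
# letting the container reach Ollama listening on the host at 11434.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
If Ollama on the host only listens on 127.0.0.1, you may also need to bind it to an address the container can reach (for example by setting `OLLAMA_HOST=0.0.0.0`).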
### Ollama and Open WebUI in Compose Stack (Linux)
A setup where both `Ollama` and `Open WebUI` reside within the same Docker Compose stack, allowing for straightforward networking on Linux.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b1, "Container Network") {
    Boundary(b2, "Compose Stack") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
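The compose file sketched in the macOS/Windows section above works unchanged here, since service names resolve over the stack's default network on any platform:
```bash
docker compose up -d
docker compose logs -f open-webui   # follow startup logs for the Open WebUI service
```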
### Ollama and Open WebUI, Separate Networks (Linux)
A scenario in which `Ollama` and `Open WebUI` sit on different Docker networks on Linux. As above, containers on separate networks cannot reach each other by default, so the connection fails.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b2, "Container Network A") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Boundary(b3, "Container Network B") {
    Component(ollama, "Ollama", "Listening on port 11434")
  }
}
Rel(openwebui, ollama, "Unable to connect")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
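To diagnose and fix this on Linux, a sketch along these lines can help (container and network names are hypothetical; substitute your own):
```bash
# List the networks each container is attached to.
docker inspect open-webui --format '{{json .NetworkSettings.Networks}}'
docker inspect ollama --format '{{json .NetworkSettings.Networks}}'

# If they are on different networks, attach both to a common one:
docker network connect network-a ollama
```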
### Open WebUI in Host Network, Ollama on Host (Linux)
An optimal layout where both `Open WebUI` and `Ollama` use the host's network, facilitating seamless interaction on Linux systems.
```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Component(openwebui, "Open WebUI", "Listening on port 8080")
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via localhost", "http://localhost:11434")
Rel(user, openwebui, "Makes requests via listening port", "http://localhost:8080")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
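A minimal sketch of this layout, assuming Ollama is already running on the host and using the standard Open WebUI image. With `--network=host`, `-p` mappings are ignored and the UI is served directly on port 8080:
```bash
# Shares the host's network namespace, so plain localhost works in both directions.
docker run -d \
  --network=host \
  -e OLLAMA_BASE_URL=http://localhost:11434 \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
The UI is then reachable at http://localhost:8080 on the host.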
Each setup addresses different deployment strategies and networking configurations to help you choose the best layout for your requirements.