docs add truefoundry.mdx

This commit is contained in:
Rishiraj1719
2025-08-07 15:42:29 +05:30
parent 856db68be5
commit b1cd964e8f
6 changed files with 72 additions and 1 deletion


@@ -387,7 +387,8 @@
"en/development/models-integration/ollama",
"en/development/models-integration/litellm",
"en/development/models-integration/gpustack",
"en/development/models-integration/aws-bedrock-deepseek"
"en/development/models-integration/aws-bedrock-deepseek",
"en/development/models-integration/truefoundry"
]
},
{
@@ -2941,6 +2942,10 @@
"source": "/development/models-integration/aws-bedrock-deepseek",
"destination": "/en/development/models-integration/aws-bedrock-deepseek"
},
+ {
+   "source": "/development/models-integration/truefoundry",
+   "destination": "/en/development/models-integration/truefoundry"
+ },
{
"source": "/development/migration",
"destination": "/en/development/migration/migrate-to-v1"


@@ -0,0 +1,66 @@
---
title: "Integration with TrueFoundry AI Gateway"
---
TrueFoundry provides an enterprise-ready [AI Gateway](https://www.truefoundry.com/ai-gateway) that integrates seamlessly with Dify, adding enterprise-grade features such as cost tracking, security guardrails, and access controls.
Routing every LLM call through the Gateway keeps your AI applications secure, compliant, and cost-effective.
## Prerequisites
Before integrating Dify with TrueFoundry, ensure you have:
1. **TrueFoundry Account**: Create a [TrueFoundry account](https://www.truefoundry.com/register) and follow the instructions in TrueFoundry's [Gateway Quick Start Guide](https://docs.truefoundry.com/gateway/quick-start)
2. **Dify Installation**: Set up Dify using either the [cloud version](https://dify.ai/) or a [self-hosted deployment](https://github.com/langgenius/dify) with Docker
## Integration Steps
This guide assumes you have Dify installed and running, and have obtained your TrueFoundry AI Gateway base URL and authentication token.
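Optionally, confirm that the base URL and token work before configuring Dify. The snippet below is a minimal sketch that calls the Gateway's OpenAI-compatible chat completions endpoint with the official `openai` Python SDK; the base URL and model ID are the example values used later in this guide, so substitute your own.
```python
from openai import OpenAI

# Example values only: replace the API key, base URL, and model ID
# with the ones from your TrueFoundry account.
client = OpenAI(
    api_key="<your-truefoundry-api-key>",
    base_url="https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai",
)

response = client.chat.completions.create(
    model="openai-main/gpt-4o",
    messages=[{"role": "user", "content": "Reply with 'ok' if you can read this."}],
)
print(response.choices[0].message.content)
```
If this call returns a response, the same values will work in the Dify form below.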
### Step 1: Access Dify Model Provider Settings
1. Log into your Dify workspace (cloud or self-hosted).
2. Navigate to **Settings** > **Model Provider**:
<Frame>
<img src="/images/model provider select diffy.png" />
</Frame>
### Step 2: Install OpenAI-API-Compatible Provider
1. In the Model Provider section, look for **OpenAI-API-compatible** and click **Install**.
2. Configure the OpenAI-API-compatible provider with your TrueFoundry details:
<Frame>
<img src="/images/open api diffy.png" />
</Frame>
Fill in the following configuration:
- **Model Name**: Enter your TrueFoundry model ID (e.g., `openai-main/gpt-4o`)
- **Model display name**: Enter a display name (e.g., `GPT-4o`)
- **API Key**: Enter your TrueFoundry API key
- **API endpoint URL**: Enter your TrueFoundry Gateway base URL (e.g., `https://internal.devtest.truefoundry.tech/api/llm/api/inference/openai`)
- **Model name for API endpoint**: Enter the model name that will be sent to the endpoint (e.g., `openai-main/gpt-4o`)
<Frame>
<img src="/images/new-code-snippet.png" />
</Frame>
### Step 3: Save and Test Your Configuration
1. Click **Save** to apply your configuration in Dify.
2. Create a new application or workflow that uses the model you just configured.
3. Run a simple LLM prompt in that workflow to verify that Dify is communicating with TrueFoundry's AI Gateway.
<Frame>
<img src="/images/result diffy.png" />
</Frame>
Your Dify workspace is now integrated with TrueFoundry's AI Gateway and ready for building AI applications, workflows, and agents.
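If you also want to confirm the end-to-end path from code rather than the UI, the sketch below calls a Dify chat application's service API via Dify's standard `/v1/chat-messages` endpoint. The base URL, app API key, and user ID are placeholders, and the exact payload can vary by application type, so check Dify's API reference for your app.
```python
import requests

# Placeholder values: use https://api.dify.ai for Dify Cloud, or your
# self-hosted host, plus a chat application's API key.
DIFY_API_BASE = "https://api.dify.ai"
DIFY_APP_API_KEY = "<your-dify-app-api-key>"

resp = requests.post(
    f"{DIFY_API_BASE}/v1/chat-messages",
    headers={
        "Authorization": f"Bearer {DIFY_APP_API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "inputs": {},
        "query": "Say hello through the TrueFoundry AI Gateway.",
        "response_mode": "blocking",
        "user": "integration-test",
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["answer"])
```
The answer should come back through the TrueFoundry-backed model you configured above.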

BIN  images/model provider select diffy.png  (new file, 416 KiB, binary file not shown)
BIN  images/new-code-snippet.png  (new file, 554 KiB, binary file not shown)
BIN  images/open api diffy.png  (new file, 118 KiB, binary file not shown)
BIN  images/result diffy.png  (new file, 376 KiB, binary file not shown)