diff --git a/docs/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/tools-agent.md b/docs/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/tools-agent.md
index 9d27a81bc..e9f4b4057 100644
--- a/docs/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/tools-agent.md
+++ b/docs/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/tools-agent.md
@@ -181,6 +181,14 @@ Refine the Tools Agent node's behavior using these options:
 
 --8<-- "_snippets/integrations/builtin/cluster-nodes/langchain-root-nodes/binary-images.md"
 
+### Enable Streaming
+
+When enabled, the AI Agent sends data back to the user in real time as it generates the answer. This is useful for long-running generations and is enabled by default.
+
+/// info | Streaming requirements
+For streaming to work, your workflow must use a trigger that supports streaming responses, such as the [Chat Trigger](/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md) or [Webhook](/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md) node with **Response Mode** set to **Streaming**.
+///
+
 ## Templates and examples
 
 Refer to the main AI Agent node's [Templates and examples](/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/index.md#templates-and-examples) section.
diff --git a/docs/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md b/docs/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md
index 4b40e266f..a40199f87 100644
--- a/docs/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md
+++ b/docs/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md
@@ -46,6 +46,7 @@ Select **Add Option** to view and set the options.
 - **Response Code**: Set the [response code](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status) to use.
 - **Response Headers**: Define the response headers to send.
 - **Put Response in Field**: Available when you respond with **All Incoming Items** or **First Incoming Item**. Set the field name for the field containing the response data.
+- **Enable Streaming**: When enabled, the node streams the response back to the user as it's generated. Requires a trigger with its **Response Mode** set to **Streaming**.
 
 ## How n8n secures HTML responses
 
diff --git a/docs/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md b/docs/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md
index ed4059858..40b937d65 100644
--- a/docs/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md
+++ b/docs/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md
@@ -88,6 +88,7 @@ Refer to [Webhook credentials](/integrations/builtin/credentials/webhook.md) for
 * **Immediately**: The Webhook node returns the response code and the message **Workflow got started**.
 * **When Last Node Finishes**: The Webhook node returns the response code and the data output from the last node executed in the workflow.
 * **Using 'Respond to Webhook' Node**: The Webhook node responds as defined in the [Respond to Webhook](/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md) node.
+* **Streaming response**: Streams data back to the user in real time as the workflow runs. Requires at least one node with streaming support in the workflow (for example, the [AI Agent](/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/index.md) node).
 
 ### Response Code
 
diff --git a/docs/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md b/docs/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md
index 7ea957268..926ef2208 100644
--- a/docs/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md
+++ b/docs/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md
@@ -85,6 +85,7 @@ Use this option when building a workflow with steps after the agent or chain tha
 /// note | Using Response Nodes
 This mode replaces the 'Using Respond to Webhook Node' mode from version 1.2 of the Chat Trigger node.
 ///
+* **Streaming response**: Streams data back to the user in real time as the workflow runs. Requires nodes with streaming support in the workflow (for example, the [AI Agent](/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/index.md) node).
 
 #### Require Button Click to Start Chat
 
@@ -121,6 +122,7 @@ Use this option when building a workflow with steps after the agent or chain tha
 /// note | Using Response Nodes
 This mode replaces the 'Using Respond to Webhook Node' mode from version 1.2 of the Chat Trigger node.
 ///
+* **Streaming response**: Streams data back to the user in real time as the workflow runs. Requires nodes with streaming support enabled.
 
 ## Templates and examples
 
diff --git a/docs/workflows/streaming.md b/docs/workflows/streaming.md
new file mode 100644
index 000000000..4b386addb
--- /dev/null
+++ b/docs/workflows/streaming.md
@@ -0,0 +1,37 @@
+---
+#https://www.notion.so/n8n/Frontmatter-432c2b8dff1f43d4b1c8d20075510fe4
+title: Streaming responses
+description: Build a workflow with streaming responses
+contentType: howto
+---
+
+# Streaming responses
+
+/// info | Feature availability
+Available on all plans from [version 1.105.2](/release-notes.md#n8n11052).
+///
+
+Streaming responses let you send data back to users as an AI Agent node generates it. This is useful for chatbots, where showing the user the answer as it's generated provides a better user experience.
+
+You can enable streaming using either:
+
+- The [Chat Trigger](/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/index.md)
+- The [Webhook node](/integrations/builtin/core-nodes/n8n-nodes-base.webhook/index.md)
+
+In both cases, set the node's **Response Mode** to **Streaming**.
+
+## Configure nodes for streaming
+
+To stream data, add nodes that support streaming output to the workflow. Not all nodes support this feature.
+
+1. Choose a node that supports streaming, such as:
+    - [AI Agent](/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/index.md)
+    - [Respond to Webhook](/integrations/builtin/core-nodes/n8n-nodes-base.respondtowebhook.md)
+2. You can disable streaming in these nodes' options. By default, they stream data whenever the workflow's trigger has its **Response Mode** set to **Streaming**.
+
+## Important information
+
+Keep in mind the following details when configuring streaming responses:
+
+- **Trigger**: Your trigger node must support streaming and be configured for it. Without this, the workflow behaves according to your response mode settings.
+- **Node configuration**: Even with streaming enabled on the trigger, you need at least one node configured to stream data. Otherwise, your workflow sends no data back.
diff --git a/nav.yml b/nav.yml
index dbfba063c..bc55b239c 100644
--- a/nav.yml
+++ b/nav.yml
@@ -62,6 +62,7 @@ nav:
         - Templates: workflows/templates.md
         - Sharing: workflows/sharing.md
         - Settings: workflows/settings.md
+        - Streaming responses: workflows/streaming.md
         - Workflow history: workflows/history.md
         - Workflow ID: workflows/workflow-id.md
         - Sub-workflow conversion: workflows/subworkflow-conversion.md
diff --git a/styles/config/vocabularies/default/accept.txt b/styles/config/vocabularies/default/accept.txt
index 49a3ab74c..8d4c7f8e5 100644
--- a/styles/config/vocabularies/default/accept.txt
+++ b/styles/config/vocabularies/default/accept.txt
@@ -43,7 +43,7 @@ Calendly
 Capterra
 cartesian
 Chargebee
-[Cc]hatbot
+[Cc]hatbots?
 ChromeOS
 Citrix
 Clearbit
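Testing note for reviewers: to exercise the streaming behavior this PR documents, a client must read the HTTP response body incrementally instead of waiting for the full response. A minimal Python sketch — the webhook path, the `chatInput` payload field, and plain-text chunk framing are assumptions for illustration; adjust them to your workflow:

```python
import io
import json
import urllib.request

def stream_chunks(body, size=1024):
    """Yield decoded text chunks from a file-like HTTP response body
    as they arrive, instead of buffering the whole response."""
    while True:
        chunk = body.read(size)
        if not chunk:
            break
        yield chunk.decode("utf-8", errors="replace")

def chat(url, message):
    """POST a chat message to an n8n streaming webhook and print the
    reply incrementally. The 'chatInput' field name is an assumption;
    match it to whatever your workflow expects."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"chatInput": message}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for text in stream_chunks(resp):
            print(text, end="", flush=True)

# Example against a hypothetical local n8n instance:
# chat("http://localhost:5678/webhook/my-chat", "Hello")
```

curl users can get the same effect with `curl -N` (`--no-buffer`), which disables output buffering so chunks print as they arrive.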