Overview
The Direct Reply Node sends a response to the user without invoking a language model. It is designed for cases where the response content is known in advance — greetings, error messages, confirmation notices, static instructions, or templated messages that incorporate workflow variables. Because it bypasses the LLM entirely, the Direct Reply Node executes instantly with zero token cost, making it ideal for high-frequency, low-complexity responses.

Configuration
| Parameter | Type | Default | Description |
|---|---|---|---|
| reply_mode | string | "text" | Response format: "text" or "markdown" |
| reply_type | string | "static" | Content source: "static" (literal text) or "variable" (from context) |
| reply_content | string | "" | The response content or variable reference. Allows empty string. |
| append_to_history | boolean | true | Whether to add this reply to the conversation history |
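Taken together, the parameters above might be applied as in the sketch below. The wrapper function and dict shape are assumptions for illustration; only the four parameter names and their defaults come from the table.

```python
# Documented defaults for the Direct Reply Node's parameters.
DEFAULTS = {
    "reply_mode": "text",
    "reply_type": "static",
    "reply_content": "",
    "append_to_history": True,
}

def with_defaults(config: dict) -> dict:
    """Fill in the documented default for any omitted parameter."""
    return {**DEFAULTS, **config}

node = with_defaults({"reply_content": "Your file has been uploaded successfully."})
print(node["reply_mode"])         # omitted, so falls back to "text"
print(node["append_to_history"])  # omitted, so defaults to True
```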
Reply Types
- Static
- Variable
Static Reply
Returns a fixed string. Supports variable interpolation with {{variable_name}} syntax.

Variable Reply
Returns the value of a workflow variable referenced in reply_content.

Reply Modes
| Mode | Description | Rendering |
|---|---|---|
| text | Plain text response | Displayed as-is, no formatting applied |
| markdown | Markdown-formatted response | Rendered with headings, lists, code blocks, links, etc. |
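Static replies interpolate {{variable_name}} placeholders from the workflow context. A minimal sketch of that substitution (a hypothetical helper; the node's real rules, e.g. how unknown variables are handled, may differ):

```python
import re

def interpolate(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder with its value from `variables`.

    Unknown variables are left untouched here; the real node may instead
    raise an error or substitute an empty string.
    """
    def substitute(match: re.Match) -> str:
        name = match.group(1)
        return str(variables.get(name, match.group(0)))

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

print(interpolate("Hello, {{user_name}}!", {"user_name": "Ada"}))
# Hello, Ada!
```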
Empty Reply Content
The reply_content field explicitly allows empty strings. This is useful in two scenarios:
- Silent acknowledgment — The node executes and appends to history without displaying visible output, useful for internal workflow tracking.
- Conditional paths — An empty reply on one branch paired with a substantive reply on another branch.
Conversation History
When append_to_history is true, the reply is added to the conversation’s message history as an assistant message. This means downstream AI Agent nodes will see this reply as part of the conversation context.
Set append_to_history to false when:
- The reply is a transient status message (e.g., “Processing your request…”)
- The reply contains internal metadata not relevant to the LLM
- You want to avoid polluting the conversation context
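The history behaviour described above can be sketched as follows. The message structure and the execute function are assumptions for illustration, not the product's actual API:

```python
def execute_direct_reply(config: dict, history: list) -> str:
    """Emit the reply and, if configured, record it as an assistant turn."""
    reply = config.get("reply_content", "")
    if config.get("append_to_history", True):
        # Downstream AI Agent nodes will see this as assistant context.
        history.append({"role": "assistant", "content": reply})
    return reply

history = [{"role": "user", "content": "Upload my report"}]

# A transient status message: shown to the user, kept out of context.
execute_direct_reply(
    {"reply_content": "Processing your request…", "append_to_history": False},
    history,
)

# A confirmation worth keeping as context (append_to_history defaults to true).
execute_direct_reply(
    {"reply_content": "Your file has been uploaded successfully."},
    history,
)
print(len(history))  # the transient status message was not appended
```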
Use Cases
Greeting Messages
Welcome users with a consistent, branded message before routing to an AI agent.
Error Handling
Return friendly error messages on failed branches without consuming LLM tokens.
Confirmation Notices
Acknowledge user actions (“Your file has been uploaded successfully”) instantly.
Fallback Responses
Provide a default response when no conditions match or the knowledge base returns no results.
Example: Conditional Greeting
A workflow that greets the user differently based on the time of day: a Condition Node branches on the current hour, and each branch ends in a Direct Reply Node with a static greeting.

Example: Error Fallback
Use a Direct Reply Node as the default output of a Condition Node to handle missing search results.

Best Practices
Prefer Direct Reply over AI Agent for static content
If the response text is known at design time, always use a Direct Reply Node instead of an AI Agent Node. It is faster, cheaper, and deterministic.
Use markdown mode for rich responses
When the reply includes links, lists, code snippets, or formatted text, set reply_mode to "markdown" for proper rendering in the chat interface.

Be intentional with conversation history
Only append to history when the reply adds meaningful context for future LLM calls. Transient messages like “Please wait…” should not be appended.