Step 1: Define the ticket resolution process

Before building a workflow, it’s important to define the process you want your AI agent to follow. Think of this as writing down your runbook or SOP for a specific type of ticket.

  1. Define the ticket type you want to handle
    Example: subscription cancellation, payment issue, product how-to.

  2. Outline the step-by-step process

    • What actions should the agent take? (e.g. knowledge search, API call, send response)
    • If actions depend on conditions, define those conditions clearly (e.g. “If subscription is active, proceed with cancellation”)
    • Specify when to escalate to a human
  3. Map out decision points
    Identify where the workflow needs to branch or make decisions, and what should happen in each case.

Once your process is clear, you can translate it into a Duckie workflow.
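
As a rough illustration (not Duckie syntax), the subscription-cancellation example from step 1 could be sketched in pseudocode before being translated into nodes. Every function and field name below is a hypothetical stand-in, not a Duckie API:

```python
from dataclasses import dataclass

# Illustrative pseudocode only: a subscription-cancellation runbook sketched in
# Python. All names here are hypothetical stand-ins, not Duckie APIs.

@dataclass
class Ticket:
    customer_id: str
    subscription_active: bool
    has_billing_dispute: bool

def knowledge_search(query: str) -> str:
    return f"(relevant article for '{query}')"      # stand-in for a knowledge search

def respond(message: str) -> None:
    print(f"Reply to customer: {message}")          # stand-in for sending a response

def escalate(reason: str) -> None:
    print(f"Escalated to support team: {reason}")   # stand-in for escalating to a human

def handle_cancellation(ticket: Ticket) -> None:
    if ticket.has_billing_dispute:                  # escalation rule
        escalate("Cancellation blocked by an open billing dispute")
    elif ticket.subscription_active:                # "If subscription is active, proceed with cancellation"
        respond("Your subscription has been cancelled.")
    else:
        article = knowledge_search("no active subscription found")
        respond(article)

handle_cancellation(Ticket("c_123", subscription_active=True, has_billing_dispute=False))
```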

Step 2: Build a workflow

1. Initialization

To create a new workflow, navigate to the Workflows tab in the Duckie web app and press ADD.

2. Creating a new node

To create a new node, drag an edge from a node and select from the available node types.

Conditional Node

Conditional nodes allow you to branch the decision tree based on whether a condition is met.

There are two condition types:

  1. AI-based: uses an LLM to determine whether a condition is met
  2. Rule-based: uses deterministic comparisons

AI-based condition

To create an LLM Condition, select the LLM Condition type, complete the Prompt field, and add any necessary Inputs. To learn more about inputs, see the Node Inputs section.

Parameters:

  1. Prompt: The prompt the LLM uses to decide whether the condition is met
  2. Inputs: The context provided to the LLM when evaluating the prompt

Rule-based condition

To create a Comparator Condition, select the Comparator Condition type and fill in the following fields:

Parameters:

  1. Input: The context that the comparator uses for the comparison
  2. Operator: The operator used for the comparison (e.g. Equals, Contains)
  3. Value: The value to compare the input against
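
Conceptually, a rule-based condition combines these three parameters into a true/false result along the lines of the sketch below. This is illustrative only, not Duckie's implementation; the operator names mirror the examples above:

```python
# Conceptual sketch: how a rule-based condition turns its Input, Operator,
# and Value parameters into a true/false branching decision.

def evaluate_comparator(input_value: str, operator: str, value: str) -> bool:
    if operator == "Equals":
        return input_value == value
    if operator == "Contains":
        return value in input_value
    raise ValueError(f"Unsupported operator: {operator}")

# Example: branch only when the ticket category equals "cancellation".
print(evaluate_comparator("cancellation", "Equals", "cancellation"))       # True
print(evaluate_comparator("My card was declined", "Contains", "refund"))   # False
```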

Action Node

Action nodes perform specific actions or tasks within your workflow.

A number of “out-of-the-box” actions are available, including the core Duckie actions that serve as the base building blocks for workflows. These include:

  1. Respond: Sends a message to the end user. The message can be LLM-generated or pre-defined.

  2. Knowledge Search: Searches your agent’s knowledge base to find relevant information.

  3. Escalate: Sends an escalation message to your support team, as well as an informational message to the end user. Both messages can be LLM-generated or pre-defined.

  4. LLM Call: Performs an LLM call using the prompt and context you provide. The output can then be used as the input to subsequent actions.

  5. App-specific actions: Actions provided by your connected app integrations.

  6. Custom actions: Actions that you define yourself.

The parameters available for each action are described in the Action Node Parameters section below.

Action Node Parameters

Each action node has a set of parameters that are used to perform the action. When you select an action, you will see the list of available parameters. Each parameter is filled by one or more inputs; to learn more, see the Node Inputs section below.
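
As a rough mental model (not Duckie's actual schema), you can think of a configured action node as an action name plus named parameters, each filled by one of the input types described in the Node Inputs section. Every field name in this sketch is hypothetical:

```python
# Hypothetical illustration only, not Duckie's actual schema: an action node is
# an action name plus named parameters, and each parameter is filled by one of
# the input types (Value, Node Output, Memory, or LLM Extract).

respond_node = {
    "action": "Respond",
    "parameters": {
        "message": {                      # one parameter of the Respond action
            "input_type": "Node Output",  # filled from a previous LLM Call node
            "source_node": "draft_reply",
        },
    },
}

knowledge_search_node = {
    "action": "Knowledge Search",
    "parameters": {
        "query": {
            "input_type": "LLM Extract",         # derived from the conversation
            "context": "Memory: Conversation",
        },
    },
}
```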

Workflow Node

Workflow nodes call child workflows. They are used to separate workflow logic into modular components that can be referenced in multiple places.

To create a workflow node, select the Workflow Node type, and select the child workflow from the dropdown.

Node Inputs

Node inputs are used to reference information within a node.

Memory

Memory inputs are used to access details of a specific agent run. Available options are:

  1. Conversation: Contains the current Conversation
  2. Guidance: Contains any Guidance items specific to the run
  3. Run history: Contains execution information about the run, such as previous nodes

Node Output

The Node Output input references the outputs of previous nodes. For example, you may have an LLM Call action node whose output you want to use as an input to a subsequent node.

Value

Value lets you provide a fixed value that will be used deterministically.

LLM Extract

LLM Extract uses an LLM to generate an input value from the context provided to it. For example, you might use an LLM Extract with the conversation as context to populate a Subject field.
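
Putting the four input types together, the sketch below shows conceptually how each one resolves to a concrete value at run time. It is illustrative only; extract_with_llm is a hypothetical stand-in, and in Duckie the LLM Extract result would be generated by a model rather than hard-coded:

```python
# Conceptual sketch of the four node input types resolving to concrete values.
# extract_with_llm is a hypothetical stand-in, not a Duckie API.

conversation = [
    {"role": "customer", "text": "I was charged twice for my May invoice."},
]

def extract_with_llm(instruction: str, context) -> str:
    # In Duckie an LLM would generate this from the provided context;
    # hard-coded here so the sketch runs on its own.
    return "Duplicate charge on May invoice"

inputs = {
    "Memory":      conversation,                             # the current conversation
    "Node Output": {"llm_call_1": "Drafted reply text..."},  # output of a previous node
    "Value":       "billing",                                 # a fixed, deterministic value
    "LLM Extract": extract_with_llm("Write a short ticket subject", conversation),
}

print(inputs["LLM Extract"])  # -> used as the input to a Subject field
```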