1. How to Write a Good Prompt

Rule of thumb:
When creating prompts for LLM conditions or actions, think of how you’d explain the task to a new hire. Give clear, step-by-step instructions. Include real-world examples when possible to guide the AI.

Example scenarios

  • Detect refund request
    Prompt:

    “Check if the user message suggests that the user wants a refund. The message may not say ‘refund’ directly, but it could contain keywords like ‘cancel subscription’ or ‘unknown charge’, or it could imply a desire to reverse a payment.

    Examples:

    ‘I got charged twice in April. Can you check and cancel one of the charges?’

    ‘I canceled my subscription in March but still see a Stripe payment this month. Can I get some help?’

    ‘Why was I billed twice on my credit card?’”

  • Detect product how-to question
    Prompt:

    “Determine if the user is asking how to use a product feature. Examples: ‘How do I reset my password?’ ‘Where can I change my email address?’”
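
A minimal sketch of how a condition prompt like the refund example above could be evaluated in code. This assumes the OpenAI Python SDK and a gpt-4o-mini model purely for illustration; in practice your agent platform's LLM condition node does the equivalent for you, so treat the client, model name, and yes/no convention as assumptions, not the platform's internals.

```python
# Sketch only: a refund-detection prompt wired as a yes/no LLM condition.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

CONDITION_PROMPT = (
    "Check if the user message suggests that the user wants a refund. "
    "The message may not say 'refund' directly, but it could contain keywords "
    "like 'cancel subscription' or 'unknown charge', or imply a desire to "
    "reverse a payment. Answer only 'yes' or 'no'.\n\n"
    "Examples of refund requests:\n"
    "- 'I got charged twice in April. Can you check and cancel one of the charges?'\n"
    "- 'Why was I billed twice on my credit card?'"
)

def wants_refund(user_message: str) -> bool:
    """Evaluate the LLM condition for a single user message."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[
            {"role": "system", "content": CONDITION_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    answer = (response.choices[0].message.content or "").strip().lower()
    return answer.startswith("yes")

if __name__ == "__main__":
    print(wants_refund("I canceled my subscription in March but still see a Stripe payment this month."))
```

Asking the model for a constrained "yes"/"no" answer keeps the condition cheap to parse and makes it easy to test the prompt against the example messages before deploying it.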


2. Gather User Input for API Calls

When building workflows that call APIs, make sure your agent collects the required input before the call is made (a sketch of the full flow follows the steps below).

  1. Conditional Node
    Check if the conversation already contains the required input (e.g. invoice ID, customer email).

  2. API Call Node
    If the input is already present, pass it to your API tool to perform the action. Toggle Extract User Input on, with a prompt such as: “Extract the user’s email from the conversation. Examples: abc@gmail.com, xyz@hotmail.com.” If the input is missing, route to the Gather User Input Node instead.

  3. Gather User Input Node
    If the input is missing, ask the user for it (e.g. “Can you provide your email?”).
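
Putting the three nodes together, here is a minimal sketch of the flow in plain Python, offered only as an illustration of the routing logic (your platform wires these nodes visually). The regex stands in for the Extract User Input toggle, and send_to_api() is a hypothetical placeholder for your actual API tool.

```python
# Sketch of the Conditional -> API Call / Gather User Input routing.
import re
from typing import Optional

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_email(conversation: list[str]) -> Optional[str]:
    """Conditional Node: look for an email anywhere in the conversation so far."""
    for message in reversed(conversation):  # prefer the most recent mention
        match = EMAIL_RE.search(message)
        if match:
            return match.group(0)
    return None

def send_to_api(email: str) -> str:
    """API Call Node: hypothetical placeholder for the real API tool."""
    return f"Looked up account for {email}"

def next_step(conversation: list[str]) -> str:
    """Route between the API Call Node and the Gather User Input Node."""
    email = extract_email(conversation)
    if email:
        return send_to_api(email)           # input present -> call the API
    return "Can you provide your email?"    # input missing -> ask the user

if __name__ == "__main__":
    print(next_step(["I canceled my subscription but was still charged."]))
    print(next_step(["My email is abc@gmail.com, please check my invoices."]))
```

The same routing applies to any required field: swap the email pattern for an invoice-ID pattern, or rely on an extraction prompt like the one in step 2 instead of a regex.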