Preventing AI Hallucinations in Customer Support: A Practical Approach

Written by
Nouran Smogluk
January 28, 2025
17 min read

If you’re a customer support leader feeling nervous about AI hallucinations, you’re not alone.

We’ve all seen those terrible headlines about AI support gone wrong: bots making up policies out of thin air or quoting prices that don't exist. Not exactly the kind of press coverage you want for your support team, right?

While these concerns are totally valid, AI hallucinations aren’t the deal-breaker they might seem to be.

Think of them more like any other business risk you already handle. Just like you have systems in place to prevent human errors in customer service (training, quality checks, approval processes), you can build similar guardrails for AI support.

Here are some practical, proven strategies for keeping AI hallucinations in check.

What is an AI hallucination?

A hallucination in AI refers to the AI system fabricating information.

That might not sound very different from a person guessing at an answer when they don’t actually know it. The difference is that an AI system states its guess confidently, and the answer will sound plausible even though it’s entirely made up.

Here are a few examples of AI hallucinations in customer support:

  • Inventing specific details: For example, an AI might inform a customer that they can return any item within 90 days for a refund when your policy is only 30 days.
  • Mixing up key contextual information: For instance, it could tell a customer they can pick up an order at your Boston store when your business doesn’t have a physical location.
  • Connecting concepts incorrectly: An AI might incorrectly claim that your premium subscription includes features that belong to a different plan.

Why do AI hallucinations happen?

The core issue is that AI systems are designed to provide coherent, natural-sounding responses.

When they’re asked questions they don’t have answers to, instead of saying "I don't know" or "I'm not sure," they tend to generate responses that seem logical based on patterns in their training data.

This behavior stems from their architecture. LLMs (Large Language Models) are pattern-matching systems, not knowledge databases with clear boundaries of what they know and don’t know.

Hallucinations are especially tricky in customer support because the stakes are high. When an AI hallucinates about a return policy or service feature, it isn’t just providing incorrect information. It’s potentially:

  • Creating specific commitments your company will have to handle.
  • Setting incorrect customer expectations, leading to frustration.
  • Violating regulatory requirements or company policies.
  • Damaging customer trust when incorrect information is discovered.

Can AI hallucinations be fixed?

The bad news is that you can never prevent hallucinations 100% of the time, just as you can’t guarantee a human agent will never make a mistake.

But there’s good news: You can significantly reduce the risk of hallucinations. Here’s how.

How to reduce AI hallucinations

1. Provide contextual information

Just as you wouldn’t put a human agent on the support floor without proper training, an AI agent needs clear contextual information about your business, such as:

  • Your mission.
  • Products and services.
  • Customer demographics.
  • Tone of voice.
  • Locations and operating policies.

In Siena’s Persona Studio, you can provide AI agents with this foundational information, allowing them to make decisions based on context.
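
To make that concrete, here’s a rough sketch of how that foundational context can be assembled into a system prompt that grounds every reply. The structure and example values below are hypothetical and are not Siena’s actual configuration format:

```python
# Hypothetical sketch: business context assembled into a system prompt.
# The fields and example values are illustrative, not Siena's configuration format.
business_context = {
    "mission": "Help busy households cook good food with less effort",
    "products_and_services": "Meal kits and pantry-staple subscriptions",
    "customer_demographics": "Home cooks in the US and Canada",
    "tone_of_voice": "Warm, concise, no jargon",
    "locations_and_policies": "Online only; no physical stores; 30-day return window",
}

def build_system_prompt(context: dict) -> str:
    """Turn the context into grounding text that accompanies every customer question."""
    facts = "\n".join(f"- {key.replace('_', ' ')}: {value}" for key, value in context.items())
    return (
        "You are a customer support agent. Answer only from the business facts below.\n"
        f"{facts}\n"
        "If a question isn't covered by these facts, say you're not sure and offer to escalate."
    )

print(build_system_prompt(business_context))
```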

2. Integrate AI with company data

AI needs real-time access to your company data. By integrating with tools like your knowledge base, internal documents, and previous customer interactions, you can reduce hallucinations significantly.

Siena allows seamless integration with knowledge sources, ensuring accurate responses. Plus, it includes features like automations to override conflicting information, further minimizing risks.
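
In practice, this kind of grounding usually means a retrieval step: fetch the most relevant passages from your knowledge base and place them in front of the model before it answers, with an explicit instruction to stick to those sources. Here is a minimal sketch of the pattern, where `search_knowledge_base` is a placeholder for whatever knowledge-base or vector search your stack actually provides:

```python
# Minimal retrieval-grounding sketch. `search_knowledge_base` is a placeholder
# for your real knowledge-base or vector search, not a specific product's API.
def search_knowledge_base(question: str, top_k: int = 3) -> list[str]:
    """Return the top_k help-center passages most relevant to the question."""
    raise NotImplementedError("Wire this up to your knowledge base or vector store.")

def grounded_prompt(question: str) -> str:
    """Build a prompt that restricts the model to retrieved sources."""
    sources = "\n\n".join(search_knowledge_base(question))
    return (
        "Answer the customer using ONLY the sources below. "
        "If the sources don't contain the answer, say so and offer to escalate.\n\n"
        f"Sources:\n{sources}\n\nCustomer question: {question}"
    )
```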

3. Turn negatives into positives

Instead of instructing your AI on what it shouldn’t do, teach it what it should do. For example:

  • Don’t say: “Never tell a customer you can’t find their order.”
  • Do say: “If an order cannot be found, ask for alternative verification methods like email address or phone number.”

4. Avoid contradictory instructions

Instructions like “Always process refunds under $40” and “Never refund items marked as final sale” can conflict (what should the AI do with a $25 final-sale item?) and confuse it. Use a clear hierarchy instead:

  1. Check if the item is marked final sale.
  2. If not, check if the order value is under $40.
  3. If both conditions are met, process the refund.

Think of it as an “if this, then that” flow.
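
Written out as code, that hierarchy might look like the sketch below. The field names (`final_sale`, `order_value`) and the fallback for orders over $40 are illustrative, not a specific platform’s schema or policy:

```python
# The refund hierarchy above as an explicit "if this, then that" flow.
# Field names and the over-$40 fallback are illustrative only.
def refund_decision(order: dict) -> str:
    if order.get("final_sale"):
        return "deny: item is marked final sale"        # rule 1 is checked first and always wins
    if order.get("order_value", 0) < 40:
        return "approve: under $40 and not final sale"  # rule 2 applies only if rule 1 passed
    return "escalate: over $40, needs human review"     # anything else goes to a person

print(refund_decision({"final_sale": True, "order_value": 25}))   # deny
print(refund_decision({"final_sale": False, "order_value": 25}))  # approve
print(refund_decision({"final_sale": False, "order_value": 80}))  # escalate
```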

5. Have clear escalation paths

For complex cases, design escalation processes that include:

  • Clear triggers for when to escalate to a human agent.
  • Instructions on what information to collect before escalating.
  • Transparent communication with customers about the transfer.
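
As a sketch, an escalation check can be as simple as a short list of triggers plus a handoff step that collects the context a human agent will need. The triggers, thresholds, and field names below are examples, not recommendations:

```python
# Illustrative escalation logic: the triggers, thresholds, and fields are examples.
def should_escalate(message: str, draft_confidence: float, refund_amount: float = 0.0) -> tuple[bool, str]:
    """Return (escalate?, reason) for a drafted AI reply."""
    if "human" in message.lower() or "real person" in message.lower():
        return True, "customer asked for a human"
    if refund_amount > 100:
        return True, "refund above the auto-approval limit"
    if draft_confidence < 0.6:
        return True, "low confidence in the drafted answer"
    return False, ""

def hand_off(message: str, reason: str) -> dict:
    """Collect what the human agent needs and be transparent with the customer."""
    print("I'm passing this to a teammate who can help. They'll have all your details.")
    return {"customer_message": message, "escalation_reason": reason}
```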

6. Thoroughly test the AI

AI improves with interaction. Use tools like Siena’s Playground and Test Runs to simulate realistic customer scenarios and identify weak spots in your AI’s responses before it ever talks to a real customer.
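
Outside of a vendor tool, the same idea is a table of realistic customer questions paired with facts the reply must contain and facts it must not invent. Here’s a rough harness sketch, where `ask_support_ai` is a placeholder for however you actually call your agent, and the scenarios reuse the examples from earlier in this article:

```python
# Rough scenario-testing sketch. `ask_support_ai` is a placeholder for the call
# that actually produces your AI agent's reply.
def ask_support_ai(question: str) -> str:
    raise NotImplementedError("Call your AI agent here.")

SCENARIOS = [
    # (customer question, phrases the reply must include, phrases it must NOT include)
    ("Can I return this after 60 days?", ["30 days"], ["90 days"]),
    ("Can I pick my order up in your Boston store?", ["online only"], ["pick it up in Boston"]),
]

def run_scenarios() -> None:
    for question, must_have, must_not_have in SCENARIOS:
        reply = ask_support_ai(question).lower()
        missing = [p for p in must_have if p.lower() not in reply]
        invented = [p for p in must_not_have if p.lower() in reply]
        status = "PASS" if not missing and not invented else "FAIL"
        print(f"{status}: {question!r} missing={missing} hallucinated={invented}")
```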

7. Monitor and iterate on AI responses

Once your AI is live, regularly review its performance:

  • Conduct quality assurance reviews of AI conversations.
  • Use metrics like CXP (Customer Experience Per Automation) to track success.
  • Create feedback loops with your team to identify opportunities for improvement.
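
Even a lightweight feedback loop helps: a shared log of sampled conversations, each marked by a reviewer for whether the AI invented any details. A minimal sketch with illustrative fields and sample data:

```python
# Minimal QA-review log sketch; the fields and sample data are illustrative.
reviews = [
    {"conversation_id": "c-101", "resolved": True, "hallucination": False},
    {"conversation_id": "c-102", "resolved": True, "hallucination": True},   # invented a return window
    {"conversation_id": "c-103", "resolved": False, "hallucination": False},
]

reviewed = len(reviews)
hallucinated = sum(r["hallucination"] for r in reviews)
print(f"Hallucinations in sampled conversations: {hallucinated}/{reviewed} ({hallucinated / reviewed:.0%})")
```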

Implement AI agents thoughtfully

Hallucinations don’t have to derail your AI strategy. With proper implementation, robust data integration, and ongoing monitoring, AI can become a powerful tool for your customer experience team.

Ready to see it in action? Book a free demo with Siena today and discover how thoughtful AI implementation can elevate your customer support.
