Discussion · Customer Service · Support Strategy

Customer service teams: Are customers using AI before contacting you? We're seeing a major shift in support queries

SupportLead_Jennifer · Head of Customer Support
65 upvotes · 10 comments

SupportLead_Jennifer
Head of Customer Support · December 31, 2025

Something’s changed in our support queue over the last 6 months.

The shifts we’re seeing:

  • Fewer simple “how do I” questions
  • More complex, edge-case queries
  • Customers arriving with information they got from AI
  • Sometimes customers have WRONG information from AI

Examples:

  • “ChatGPT told me your product can do X” (it can’t)
  • “I already tried the steps AI suggested, didn’t work” (we can verify they did try)
  • Questions about features that would be helpful but don’t exist

My questions:

  • Are other support teams seeing this?
  • How do you handle customers with AI-provided misinformation?
  • Should we be monitoring what AI tells customers about us?
  • How do we adapt our support strategy?

10 Comments

CXDirector_Mark · Director of Customer Experience · December 31, 2025

Jennifer, this is happening across the industry. We’ve been studying it.

The new customer journey:

Old: Problem → Google → Company Help Center → Contact Support
New: Problem → ChatGPT → (Maybe) Company Help Center → Contact Support

What this changes:

  1. Simple queries deflected - AI answers the easy stuff
  2. Complex queries remain - AI can’t handle edge cases
  3. Pre-researched customers - They’ve already tried things
  4. Misinformed customers - AI gave wrong info

The data from our support:

Metric                      2024      2025      Change
Total tickets               10,000    8,500     -15%
Complex tickets             3,000     4,500     +50%
Avg handle time             8 min     12 min    +50%
First contact resolution    75%       65%       -10 pts

Fewer tickets, but each one takes longer because the easy ones are gone.

SupportLead_Jennifer OP · December 31, 2025
Replying to CXDirector_Mark

That data matches our experience. The +50% on complex tickets is real.

How are you handling the misinformation cases? When customers say “ChatGPT told me…” and it’s wrong?

CXDirector_Mark · December 31, 2025
Replying to SupportLead_Jennifer

Handling AI misinformation:

  1. Don’t blame the customer - They trusted a tool, that’s reasonable
  2. Acknowledge the source - “I understand ChatGPT suggested that…”
  3. Correct gently - “Actually, our product works differently…”
  4. Provide documentation - Link to official resources
  5. Report patterns - Track common misconceptions for content team

Our process:

We created an “AI misconception log” that agents add to when they see patterns. Common ones get escalated to marketing/content to address.

Examples we’ve addressed:

  • “AI says we have unlimited storage” → Updated our FAQ
  • “AI says we integrate with X” → We added explicit content about what we DON’T integrate with
  • “AI says our pricing is $X” → Updated structured data with current pricing
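A rough sketch of how a log like this could work in code. Everything here is illustrative: the class names, the record/escalate API, and the three-sighting threshold are assumptions, not a description of an actual tool.

```python
# Hypothetical sketch of an "AI misconception log": agents record each
# AI-sourced claim they see, and repeated claims get flagged for the
# content team. Threshold and names are made-up illustration.
from collections import Counter
from dataclasses import dataclass, field

ESCALATION_THRESHOLD = 3  # flag for the content team after N sightings


@dataclass
class MisconceptionLog:
    counts: Counter = field(default_factory=Counter)
    escalated: set = field(default_factory=set)

    def record(self, claim: str) -> bool:
        """Log one reported misconception; return True the first time it
        crosses the threshold and should go to marketing/content."""
        key = claim.strip().lower()
        self.counts[key] += 1
        if self.counts[key] >= ESCALATION_THRESHOLD and key not in self.escalated:
            self.escalated.add(key)
            return True
        return False


log = MisconceptionLog()
log.record("AI says we have unlimited storage")
log.record("AI says we have unlimited storage")
needs_escalation = log.record("AI says we have unlimited storage")
# needs_escalation is True on the third sighting, False afterwards
```

In practice this would live in your ticketing system as a tag plus a saved report, but the logic is the same: count patterns, escalate once per pattern.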
KnowledgeManager_Rachel · Knowledge Base Manager · December 30, 2025

Knowledge management perspective on the AI customer service shift:

Your help content is now training AI.

What’s in your help center, documentation, and FAQs is what AI learns about your product. If your content is:

  • Incomplete → AI fills gaps with guesses
  • Outdated → AI provides old information
  • Unclear → AI misinterprets

The solution:

Treat your help content as AI training data. It needs to be:

  1. Comprehensive (cover all features)
  2. Current (update regularly)
  3. Clear (unambiguous language)
  4. Correct (factually accurate)
  5. Explicit about limitations (what you DON’T do)

What we changed:

We added sections like:

  • “What [Product] does NOT do”
  • “Common misconceptions about [Product]”
  • “Differences between [Product] and [Competitor]”

This helps AI provide accurate information BEFORE customers contact support.

SupportOps_Tom · December 30, 2025

Operations perspective on the shift:

Staffing implications:

If simple tickets decrease and complex tickets increase, you need:

  • Fewer tier 1 agents
  • More tier 2/3 specialists
  • Different training (complex problem solving vs. process following)
  • Longer handle time expectations

How we adapted:

  1. Reduced tier 1 team by 20%
  2. Promoted best performers to tier 2
  3. Changed success metrics (handle time → resolution quality)
  4. Created “AI escalation” workflow for misinformation cases

The cost reality:

Lower volume but higher complexity nets out to roughly the same total cost. But customer satisfaction increased: with fewer simple queries clogging the queue, complex ones wait less.

ContentStrategist_Linda · December 30, 2025

Content strategy to reduce AI misinformation:

The problem: AI is a black box - you can’t directly correct it. But you CAN influence what it learns.

What we do:

  1. Comprehensive FAQ - Every common question answered clearly
  2. Explicit limitations - What we DON’T do, clearly stated
  3. Pricing structured data - Current pricing in schema markup
  4. Feature descriptions - Clear, unambiguous language
  5. Comparison content - How we differ from competitors
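On the structured-data point, here is roughly what that markup looks like: a small Python sketch that renders schema.org Product/Offer data as JSON-LD. The product name and price are placeholder values, and this is one possible shape, not our exact implementation.

```python
# Sketch: generate schema.org Product/Offer pricing markup as JSON-LD,
# the kind of structured data AI systems and search engines can parse.
# Product name, price, and availability below are placeholder values.
import json


def pricing_jsonld(name: str, price: str, currency: str = "USD") -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
            "availability": "https://schema.org/InStock",
        },
    }
    # Embed the result in a <script type="application/ld+json"> tag
    # on the pricing page so crawlers pick up current pricing.
    return json.dumps(data, indent=2)


markup = pricing_jsonld("ExampleApp Pro", "29.00")
```

The point is less the code than the habit: whenever pricing changes, the structured data changes with it, so the machine-readable version never drifts from the page copy.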

Monitoring:

We use Am I Cited to track what AI tells users about us. When we spot misinformation:

  1. Create/update content addressing it
  2. Add to FAQ if it’s a common question
  3. Wait 4-8 weeks for AI to learn the correction
  4. Monitor for improvement

It’s not instant, but you can systematically correct AI’s understanding of your product.

AIImplementer_Kevin · December 29, 2025

We actually built AI into our support workflow. Here’s the impact:

AI-assisted support model:

  1. Customer starts chat
  2. AI bot handles first contact
  3. If AI can’t resolve, escalates to human
  4. Human sees AI’s attempted solutions
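The four steps above, sketched in Python. This is a simplification of our setup: `resolve_with_bot` is a stand-in for the real bot integration (in production it calls an LLM plus knowledge-base search), and the ticket fields are illustrative.

```python
# Minimal sketch of the AI-assisted escalation flow: the bot attempts a
# resolution first, and on failure the human agent receives the ticket
# with the bot's full transcript attached. Names here are illustrative.
from dataclasses import dataclass, field


@dataclass
class Ticket:
    customer_msg: str
    bot_transcript: list = field(default_factory=list)
    assignee: str = "ai_bot"


def resolve_with_bot(ticket: Ticket) -> bool:
    """Placeholder: a real implementation would query an LLM / KB here."""
    ticket.bot_transcript.append(f"bot attempted: {ticket.customer_msg}")
    return False  # assume unresolved, to show the escalation path


def handle(ticket: Ticket) -> Ticket:
    if not resolve_with_bot(ticket):
        # Steps 3-4: escalate to a human, who inherits the transcript
        ticket.assignee = "tier2_human"
    return ticket


ticket = handle(Ticket("Export to CSV fails with error 500"))
# ticket.assignee is now "tier2_human"; bot_transcript shows what was tried
```

The transcript handoff is what makes the longer human handle time worthwhile: the agent starts at step two instead of step zero.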

Results:

Metric                   Before AI Bot    After AI Bot
Human ticket volume      100%             40%
Customer satisfaction    78%              82%
First response time      4 hours          Instant
Human handle time        8 min            15 min

The key insight:

By the time a customer reaches a human, they’ve already:

  • Described their problem to AI
  • Had AI attempt solutions
  • Confirmed what doesn’t work

Human agents start with full context. More complex, but more efficient.

CustomerVoice_Sarah · December 29, 2025

Customer research perspective:

We surveyed 500 customers about their AI usage before contacting support:

Behavior                         Percentage
Used AI first                    62%
Tried AI-suggested solutions     48%
AI answered their question       35%
AI gave wrong information        18%
Mentioned AI to support agent    41%

The “AI-first” customer segment:

They’re typically:

  • Tech-comfortable
  • Prefer self-service
  • More frustrated when they DO contact support (because “simple” solutions failed)
  • More specific in their problem descriptions

Implication:

When they reach you, they’re often further along in frustration but also better at describing the problem.

SupportTrainer_Mike · December 28, 2025

Training perspective on handling AI-influenced customers:

New skills our agents need:

  1. AI-awareness - Understanding what AI can/can’t do
  2. Misconception handling - Correcting without shaming
  3. Context gathering - “What have you already tried?”
  4. Documentation skills - Logging AI-related issues
  5. Escalation judgment - Knowing when AI misinformation needs content update

Training modules we added:

  • “Understanding the AI-first customer”
  • “Handling AI misinformation gracefully”
  • “What AI tells customers about our product” (based on Am I Cited monitoring)
  • “Logging patterns for content improvement”

The cultural shift:

Agents now see themselves as part of a feedback loop. Their observations about AI misinformation flow to content team, which updates docs, which improves AI accuracy.

SupportLead_Jennifer OP · Head of Customer Support · December 28, 2025

This thread validated what I suspected and gave me actionable strategies. Key takeaways:

The reality:

  • AI is deflecting simple queries (15% fewer tickets)
  • Complex queries are increasing (+50%)
  • Handle time is increasing (simpler stuff is gone)
  • Misinformation creates new challenges

Strategies to implement:

Short-term:

  1. Create “AI misconception log” for agents
  2. Train team on handling AI-influenced customers
  3. Adjust success metrics away from pure handle time
  4. Start monitoring what AI says about us

Medium-term:

  1. Update help content to be “AI training friendly”
  2. Add explicit content about what we DON’T do
  3. Create feedback loop from support to content team
  4. Consider AI-assisted support model

Long-term:

  1. Restructure team for complex query handling
  2. Shift hiring toward problem-solving skills
  3. Build systematic AI information monitoring

The survey data showing 62% use AI first is significant. This isn’t a trend - it’s the new normal.

Thanks everyone for the operational and strategic insights.


Frequently Asked Questions

How is AI affecting customer service queries?
AI is changing customer service in several ways: customers arrive pre-researched with information from ChatGPT, simple queries are resolved before contacting support, complex queries become the norm, customers sometimes have incorrect AI-provided information that needs correction, and overall ticket volume patterns are shifting toward more complex issues.
Are customers using AI before contacting support?
Yes, increasingly customers research via AI before contacting support. Many customers now arrive with specific information, solutions they’ve already tried, or questions that AI couldn’t answer. This changes support dynamics - agents handle more complex queries while simple ones are deflected to AI self-service.
Should companies monitor what AI tells customers about them?
Yes, monitoring AI responses about your company is important for customer service. If AI provides incorrect information, customers arrive confused or with wrong expectations. Understanding what AI tells customers helps support teams prepare for common misconceptions and ensures the AI information ecosystem about your brand is accurate.
