Discussion · Technical SEO · JavaScript

Our React SPA is completely invisible to AI crawlers - how do we fix this?

ReactDev_Marcus · Frontend Lead · January 9, 2026
82 upvotes · 10 comments

We built our site as a React SPA three years ago. Great user experience, fast interactions.

But now we’re realizing AI crawlers can’t see anything. When I check server logs:

  • GPTBot gets our empty HTML shell
  • PerplexityBot gets the same thing
  • All of our actual content loads via JavaScript

The problem:

  • Zero AI visibility
  • Competitors with traditional sites are getting cited
  • We have great content but AI can’t access it

Our current stack:

  • React 18 with React Router
  • Client-side rendering only
  • API-driven content loading
  • Hash-based routing (/#/page)

Questions:

  1. How bad is this problem really?
  2. What’s the fastest fix?
  3. Do we need to rebuild in Next.js?
  4. Are there solutions that don’t require major refactoring?

We can’t afford a full rebuild but need AI visibility.

10 Comments

TechSEO_Expert_Sarah · Expert · Technical SEO Consultant · January 9, 2026

This is a common and serious problem. Let me break it down:

How AI crawlers work:

Unlike Google (which can render JavaScript), most AI crawlers CANNOT:

  • GPTBot (OpenAI): No JavaScript execution
  • PerplexityBot: No JavaScript execution
  • ClaudeBot (Anthropic): No JavaScript execution

They see ONLY your initial HTML. For an SPA, that’s usually:

<div id="root"></div>
<script src="bundle.js"></script>

Zero content = zero AI visibility.

The good news: You don’t necessarily need a full rebuild. There are solutions that work with existing SPAs.

RenderingOptions_Mike · January 9, 2026
Replying to TechSEO_Expert_Sarah

Your options from fastest to most comprehensive:

Option 1: Prerendering Service (Fastest)

  • Services like Prerender.io, Rendertron
  • Detects crawler user-agents
  • Serves cached HTML to crawlers
  • No code changes required
  • Implementation: Hours

Option 2: Dynamic Rendering

  • Serve different content based on user-agent
  • SPA for users, static HTML for crawlers
  • Middleware implementation (see the sketch after the comparison table)
  • Implementation: Days

Option 3: SSR Migration (Best long-term)

  • Migrate to Next.js/Remix
  • Full server-side rendering
  • Best for both AI and traditional SEO
  • Implementation: Weeks to months

Recommendation:

Start with prerendering NOW for immediate visibility. Plan SSR migration for long-term if resources allow.

Solution            Time to Implement   Complexity   AI Visibility
Prerendering        Hours               Low          Good
Dynamic Rendering   Days                Medium       Good
SSR (Next.js)       Weeks-Months        High         Excellent
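
To make Option 2 concrete, here is a minimal sketch of user-agent-based dynamic rendering in Express. The bot list and the snapshotFor() helper are illustrative assumptions, not a real library API; in practice the snapshot would come from a headless-browser render cached to disk.

// Minimal dynamic-rendering sketch (Option 2), assuming Express.
// snapshotFor() is a hypothetical stand-in for however you produce
// cached HTML snapshots (e.g. a headless-browser render saved to disk).
const express = require('express');
const path = require('path');

const app = express();
const BOTS = /GPTBot|PerplexityBot|ClaudeBot|ChatGPT-User|Googlebot/i;

// Hypothetical: look up a pre-generated snapshot for this route.
async function snapshotFor(routePath) {
  return `<html><body><!-- cached render of ${routePath} --></body></html>`;
}

app.use(async (req, res, next) => {
  const ua = req.headers['user-agent'] || '';
  if (BOTS.test(ua)) {
    // Crawler detected: serve static HTML instead of the empty JS shell.
    res.send(await snapshotFor(req.path));
    return;
  }
  next(); // Regular users fall through to the normal SPA.
});

// Users get the SPA: static assets, then index.html for any route.
app.use(express.static('build'));
app.use((req, res) => res.sendFile(path.resolve('build', 'index.html')));

app.listen(3000);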
PrerenderPro_Emma · DevOps Engineer · January 9, 2026

Prerendering implementation details:

How it works:

  1. User-agent detection at edge/server
  2. If AI crawler detected, redirect to cached HTML
  3. If regular user, serve normal SPA

Quick setup with Prerender.io:

// Express middleware: register prerender-node before your static and
// route handlers so crawler requests are intercepted first
const prerender = require('prerender-node');
app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));

Bot user-agents to handle:

  • GPTBot (OpenAI)
  • PerplexityBot (Perplexity)
  • ClaudeBot (Anthropic)
  • ChatGPT-User (ChatGPT browsing)
  • Googlebot (Google)
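
If your middleware's default bot list predates the newer AI crawlers, you can usually extend it. A sketch for prerender-node, which exposes a crawlerUserAgents array; verify the property and its defaults against the version you install:

// Sketch: extend prerender-node's default crawler list with AI bots.
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// crawlerUserAgents is an array the package exposes; check it against
// your installed version before relying on this.
['GPTBot', 'PerplexityBot', 'ClaudeBot', 'ChatGPT-User'].forEach((ua) => {
  if (!prerender.crawlerUserAgents.includes(ua)) {
    prerender.crawlerUserAgents.push(ua);
  }
});

app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));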

Results we’ve seen:

  • Indexing went from <25% to ~80% of pages
  • AI visibility within 2-3 weeks of implementation
  • No impact on user experience

Cost: Most prerendering services are $15-100/month depending on traffic.

This is your fastest path to AI visibility.

ReactDev_Marcus (OP) · Frontend Lead · January 9, 2026

Prerendering sounds like the quick win we need.

Question: You mentioned our hash-based URLs are a problem. How critical is fixing that?

URLStructure_Tom · Expert · January 8, 2026

Hash URLs are a SIGNIFICANT problem:

How crawlers see hash URLs:

  • Your URL: example.com/#/products/shoes
  • What crawler sees: example.com/
  • All hash routes = same page to crawlers

The fix - use History API:

// Before (hash routing): URLs look like example.com/#/products/123
import { HashRouter, Routes, Route } from 'react-router-dom';

<HashRouter>
  <Routes>
    <Route path="/products/:id" element={<Product />} />
  </Routes>
</HashRouter>

// After (browser history): URLs look like example.com/products/123
import { BrowserRouter, Routes, Route } from 'react-router-dom';

<BrowserRouter>
  <Routes>
    <Route path="/products/:id" element={<Product />} />
  </Routes>
</BrowserRouter>

Server configuration needed:

# nginx - serve index.html for all routes
location / {
    try_files $uri $uri/ /index.html;
}

Priority: This is actually more important than prerendering. Hash URLs mean crawlers literally can’t distinguish your pages.

Fix URLs first, then implement prerendering.
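
One migration detail worth handling: the #/ fragment never reaches the server, so old hash links can only be redirected on the client. A minimal sketch, assuming your hash paths map one-to-one onto the new history paths:

// Run this early in the SPA bootstrap (e.g. at the top of index.js).
// example.com/#/products/shoes carries its route only in the fragment,
// which the server never sees, so redirect to the clean URL client-side.
const { pathname, hash } = window.location;
if (pathname === '/' && hash.startsWith('#/')) {
  // "#/products/shoes" -> "/products/shoes"
  window.location.replace(hash.slice(1));
}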

SSRMigration_Chris · January 8, 2026

If you’re considering Next.js migration eventually:

Benefits of SSR over prerendering:

  • Better for dynamic content
  • No stale cache issues
  • Improved initial load performance
  • Better Core Web Vitals
  • Future-proof for AI

Migration path from React to Next.js:

  1. Start with key pages only

    • Migrate highest-traffic pages first
    • Keep SPA for rest of site
    • Incremental adoption
  2. Use Next.js App Router

    • React Server Components
    • Built-in rendering options
    • Good React compatibility
  3. Maintain URL structure

    • Same routes, new rendering
    • No SEO disruption

Timeline estimate:

  • Simple site: 2-4 weeks
  • Medium complexity: 4-8 weeks
  • Large/complex: 2-4 months

Don’t wait to decide: Start with prerendering now, plan migration in parallel.
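
For a feel of what a migrated page looks like, here is a minimal App Router sketch; the route path, API URL, and field names are hypothetical stand-ins for your own:

// app/products/[id]/page.js (hypothetical route in a Next.js App Router app)
// A Server Component renders on the server, so crawlers receive complete HTML.
export default async function ProductPage({ params }) {
  const { id } = await params; // awaiting also works on plain objects

  // Stand-in for your existing content API call; revalidate caches the
  // rendered result for up to an hour.
  const res = await fetch(`https://api.example.com/products/${id}`, {
    next: { revalidate: 3600 },
  });
  const product = await res.json();

  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}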

StructuredDataSPA_Lisa · January 8, 2026

Structured data considerations for SPAs:

Current problem: Your JSON-LD probably loads via JavaScript too.

The fix: Include critical schema in initial HTML:

<head>
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Your Company",
    ...
  }
  </script>
</head>

For dynamic pages: Prerendering should capture schema if implemented correctly.
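
For per-page schema in a React SPA, rendering the JSON-LD from the component means a prerendered snapshot will include it. A minimal sketch; the product prop and schema fields are illustrative:

// ProductSchema.jsx: render JSON-LD from component data so that a
// prerendered snapshot of the page includes it.
export default function ProductSchema({ product }) {
  const schema = {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
  };
  // React escapes text children, so inject the serialized JSON directly.
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
    />
  );
}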

Test your schema:

  1. View page source (not DevTools inspect, which shows the JS-rendered DOM)
  2. Check whether the schema is in the initial HTML
  3. If it isn't there, crawlers can't see it

Structured data helps AI understand your content even with prerendering.

ReactDev_Marcus (OP) · Frontend Lead · January 8, 2026

Here’s our implementation plan:

Week 1: Quick fixes

  1. Migrate from hash to browser history routing
  2. Configure server for SPA routing
  3. Move critical meta tags to initial HTML (sketch at the end of this comment)

Week 2-3: Prerendering

  1. Implement Prerender.io
  2. Configure for AI crawler user-agents
  3. Verify cached pages are correct
  4. Monitor crawl logs

Month 2+: Evaluate SSR

  1. Assess Next.js migration complexity
  2. Pilot with 1-2 key pages
  3. Decide on full migration timeline

Monitoring:

  1. Check server logs for AI crawler access
  2. Use Am I Cited to track AI visibility
  3. Test AI queries for our content

This gets us visible quickly while planning long-term.
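
For week 1, step 3, the point is that title, description, and canonical ship in the static index.html (or get injected server-side) rather than being set by JavaScript after load. A minimal sketch with placeholder values:

<!-- index.html: present in the initial response, visible without JS -->
<head>
  <title>Acme Widgets: Product Catalog</title>
  <meta name="description" content="Placeholder page description" />
  <link rel="canonical" href="https://www.example.com/" />
</head>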

MonitorCrawlers_Dan · January 7, 2026

How to verify AI crawlers can see your content:

Check server logs for:

  • GPTBot (OpenAI)
  • PerplexityBot (Perplexity)
  • ClaudeBot (Anthropic)
  • ChatGPT-User (ChatGPT browsing)
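
To quantify this rather than eyeball it, here is a minimal Node sketch for tallying crawler hits, assuming a combined-format access log where the user-agent string appears on each line:

// count-ai-crawlers.js: tally AI crawler hits in an access log.
// Assumes the user-agent appears somewhere on each log line; adjust
// the parsing for your own log format.
const fs = require('fs');
const readline = require('readline');

const BOTS = ['GPTBot', 'PerplexityBot', 'ClaudeBot', 'ChatGPT-User'];
const counts = Object.fromEntries(BOTS.map((b) => [b, 0]));

const rl = readline.createInterface({
  input: fs.createReadStream(process.argv[2] || 'access.log'),
});

rl.on('line', (line) => {
  for (const bot of BOTS) {
    if (line.includes(bot)) counts[bot] += 1;
  }
});

rl.on('close', () => console.table(counts));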

Simulate what crawlers see:

  1. Disable JavaScript in browser
  2. View your pages
  3. That’s what crawlers see

After prerendering:

  • Check that crawlers get 200 responses
  • Verify HTML contains actual content
  • Test with curl:
curl -H "User-Agent: GPTBot" https://yoursite.com/page

Track crawl frequency:

  • Monitor how often AI bots visit
  • Good prerendering = more frequent crawls
  • Verify all important pages are cached

Crawl verification is how you know the fix worked.

ContentStrategy_Rachel · January 7, 2026

Beyond rendering - content still matters:

Once crawlers can see your content:

  • Still need AI-optimized content structure
  • Clear headings and answers
  • Structured data implementation
  • Fresh, updated content

Don’t forget:

  • Rendering fixes only solve ACCESS
  • AI citation still requires quality content
  • Same optimization principles apply

The rendering fix gets you in the game. Content optimization wins the game.


Frequently Asked Questions

Why can't AI crawlers see SPA content?

Most AI crawlers cannot execute JavaScript. When they visit an SPA, they only see the initial HTML shell without the dynamically loaded content. Since AI systems lack full browser environments, they cannot run the JavaScript that renders the actual page content.

What's the best solution for making SPAs visible to AI?

Server-Side Rendering (SSR) is the gold standard. Frameworks like Next.js, Nuxt.js, and Remix render complete HTML on the server. For existing SPAs, prerendering services like Prerender.io can serve static HTML to crawlers without changing your architecture.

Does prerendering work for AI crawlers?

Yes, prerendering is highly effective. It generates static HTML snapshots that AI crawlers can access. Services detect crawler user-agents (GPTBot, PerplexityBot, ClaudeBot) and serve pre-rendered versions while regular users get the SPA experience.

How do URL structures affect SPA AI visibility?

Hash fragments (#) in URLs are problematic: AI crawlers treat them all as a single page. Use the History API and pushState to create clean URLs like /products/item-name instead of /products#123. Each view needs a unique, descriptive URL.
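
For reference, the History API call that clean-URL routing relies on (React Router's BrowserRouter manages this internally):

// Update the address bar to a clean path without a full page load.
window.history.pushState({}, '', '/products/item-name');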
