Discussion · Technical SEO · JavaScript

Is JavaScript killing our AI visibility? AI crawlers seem to miss our dynamic content

FrontendDev_Alex · Lead Developer at SaaS Company · January 6, 2026
142 upvotes · 10 comments

Our site is built on React with client-side rendering. We have great content but terrible AI visibility.

What’s happening:

  • Content loads dynamically via JavaScript
  • Traditional Google rankings are fine (Googlebot renders JS)
  • AI visibility is near zero
  • Checked server logs - AI bots visit but content isn’t being cited

My suspicion: AI crawlers aren’t executing JavaScript, so they’re seeing empty shells.

Questions:

  • Do AI crawlers actually execute JavaScript?
  • What’s the technical fix?
  • How do we maintain our modern stack but become AI-visible?

Looking for dev-focused solutions here.

10 Comments

TechSEO_Marcus Expert Technical SEO Engineer · January 6, 2026

Your suspicion is correct. Most AI crawlers DO NOT execute JavaScript.

How different crawlers handle JS:

Crawler             JavaScript Execution   What They See
GPTBot (ChatGPT)    No                     Raw HTML only
PerplexityBot       No                     Raw HTML only
ClaudeBot           No                     Raw HTML only
Google-Extended     No                     Raw HTML only
Googlebot           Yes                    Rendered page

Why this matters: If your content is rendered by client-side JS, AI crawlers see:

<div id="app"></div>

Not your actual content.

The solution hierarchy:

  1. Server-Side Rendering (SSR) - Content in initial HTML response
  2. Static Site Generation (SSG) - Pre-built HTML pages
  3. Prerendering service - Service renders JS for bots
  4. Hybrid rendering - SSR for key content, client for interactions

Your React app can implement any of these. Next.js makes SSR/SSG straightforward.

FrontendDev_Alex OP · January 6, 2026
Replying to TechSEO_Marcus
We’re considering Next.js migration. Is SSR enough, or do we need specific optimizations for AI crawlers?
TechSEO_Marcus Expert · January 6, 2026
Replying to FrontendDev_Alex

SSR/Next.js implementation for AI visibility:

Basic requirement: Content must be in the initial HTML response. getServerSideProps or getStaticProps in Next.js achieves this.
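
For example, a minimal getStaticProps sketch (Pages Router; the route, API URL, and data shape are illustrative, not from this thread):

// pages/guide.js - content is fetched at build time and shipped
// in the initial HTML, so crawlers that skip JavaScript still see it
export async function getStaticProps() {
  const res = await fetch('https://api.example.com/guide'); // placeholder API
  const article = await res.json();
  return { props: { article } };
}

export default function GuidePage({ article }) {
  return (
    <article>
      <h1>{article.title}</h1>
      <p>{article.summary}</p>
    </article>
  );
}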

Additional optimizations:

  1. Schema in server-rendered HTML

    // In the page component; dangerouslySetInnerHTML keeps React
    // from escaping characters inside the JSON-LD
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(schemaData) }}
    />
    
  2. Critical content early in DOM

    • Main content in first 50KB
    • Answer-first structure
    • Key information before interactive elements
  3. robots.txt allowing AI bots

    User-agent: GPTBot
    Allow: /
    
    User-agent: PerplexityBot
    Allow: /
    
  4. Fast initial response

    • AI bots don’t wait for slow servers
    • Target <500ms TTFB

Testing:

curl -A "GPTBot" https://yoursite.com/page

If content is in the response, you’re good. If not, SSR isn’t working correctly.
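
The same curl call can also report TTFB, which matters for the <500ms target above (one-liner; swap in your own URL):

curl -A "GPTBot" -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://yoursite.com/page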

Migration is worth it. We’ve seen clients go from 0 to significant AI visibility after implementing SSR.

NextJSDev_Tom Full-Stack Developer · January 5, 2026

We did this exact migration. Here’s the practical experience:

Before (React SPA):

  • Client-side rendering
  • Content via API calls
  • AI visibility: Zero

After (Next.js SSR):

  • Server-side rendering for all content pages
  • Static generation for documentation
  • AI visibility: Growing weekly

Implementation tips:

  1. Use App Router with Server Components. The default is server rendering, so content just works.

  2. Data fetching server-side

    // Runs on the server, so the fetched content is in the initial HTML
    export default async function Page() {
      const res = await fetch('...');
      const data = await res.json();
      return <Article data={data} />;
    }
    
  3. Avoid ‘use client’ for content components. Use client components only for interactivity.

  4. Metadata API for SEO/AI

    export const metadata = {
      title: '...',
      description: '...',
    };
    

Migration effort: About 3 weeks for a medium-sized site. Worth every hour.

Results: First AI citations appeared within 6 weeks of launching SSR site.

PreRenderPro_Elena · January 5, 2026

If migration isn’t feasible, prerendering is an option:

What prerendering does:

  • Service renders your JS for bot requests
  • Returns full HTML to crawlers
  • Real users still get your SPA

Popular services:

  • Prerender.io
  • Rendertron
  • Puppeteer-based solutions

Implementation: Middleware detects bot user agents and routes to prerender service.
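
A rough sketch of that middleware, assuming an Express server, Node 18+ (for global fetch), and a generic prerender endpoint (the service URL and token handling are placeholders, not any specific vendor's API):

// Bots get prerendered HTML; human visitors fall through to the SPA
const express = require('express');
const app = express();

const BOT_UA = /GPTBot|PerplexityBot|ClaudeBot|Google-Extended|bingbot/i;

app.use(async (req, res, next) => {
  if (!BOT_UA.test(req.get('user-agent') || '')) return next();

  const pageUrl = `${req.protocol}://${req.get('host')}${req.originalUrl}`;
  const rendered = await fetch(`${process.env.RENDER_ENDPOINT}?url=${encodeURIComponent(pageUrl)}`, {
    headers: { Authorization: `Bearer ${process.env.RENDER_TOKEN}` }, // placeholder auth
  });
  res.status(rendered.status).send(await rendered.text());
});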

Pros:

  • No codebase changes
  • Works with any framework
  • Quick implementation

Cons:

  • Additional cost
  • Latency for bot requests
  • Caching complexity
  • Third-party dependency

When to use:

  • Large legacy codebase
  • Migration not feasible short-term
  • Need quick AI visibility fix

When NOT to use:

  • New projects (just use SSR)
  • Small sites (migration is easier)
  • Budget-constrained (prerendering has costs)

Prerendering is a bridge solution, not an ideal long-term strategy.

FrameworkComparison_James · January 5, 2026

Framework options for AI-friendly sites:

Framework     Default Rendering   AI Visibility   Effort
Next.js       SSR/SSG             Excellent       Medium
Nuxt.js       SSR/SSG             Excellent       Medium
Gatsby        SSG                 Excellent       Low
Remix         SSR                 Excellent       Medium
SvelteKit     SSR/SSG             Excellent       Low
Pure React    CSR                 Poor            -
Pure Vue      CSR                 Poor            -
Angular       CSR (default)       Poor            -

Recommendation by situation:

  • New project: Next.js, Nuxt, or SvelteKit
  • React migration: Next.js
  • Vue migration: Nuxt
  • Content-heavy site: Gatsby or Astro
  • Blog/docs: Hugo, Eleventy, or Astro

For AI visibility, anything with SSR/SSG works. Pure client-side rendering doesn’t.

HybridApproach_Rachel Frontend Architect · January 4, 2026

Hybrid rendering for complex apps:

The challenge: Some parts of your app NEED client-side rendering (dashboards, interactive tools). But content needs SSR.

Solution: Hybrid rendering

  1. Content pages: Full SSR

    • Blog posts, documentation
    • Marketing pages
    • FAQs and knowledge base
  2. Interactive features: Client-side

    • Dashboards
    • Forms and tools
    • User-specific content

Next.js App Router makes this easy:

  • Server Components for content
  • Client Components for interactivity
  • Mix freely on same page

Example structure:

// Page is server-rendered
export default function Page() {
  return (
    <>
      <ServerRenderedContent /> {/* AI sees this */}
      <ClientInteractiveWidget /> {/* AI doesn't need this */}
    </>
  );
}
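
For completeness, the interactive widget referenced above would carry the ‘use client’ directive; a minimal sketch (the component and its state are purely illustrative):

'use client';
// Hydrated in the browser only - crawlers don't need it to understand the page
import { useState } from 'react';

export default function ClientInteractiveWidget() {
  const [open, setOpen] = useState(false);
  return (
    <button onClick={() => setOpen(!open)}>
      {open ? 'Hide calculator' : 'Show calculator'}
    </button>
  );
}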

The principle: anything you want AI to see should be server-rendered; everything else can stay client-side.

TestingBot_Kevin · January 4, 2026

Testing if your content is AI-visible:

Method 1: View Source

  • Right-click → View Page Source
  • If content is there = AI can see it
  • If only <div id="root"></div> = AI can’t see it

Method 2: Disable JavaScript

  • Browser DevTools → Settings → Disable JavaScript
  • Reload page
  • If content disappears = AI can’t see it

Method 3: curl test

curl -A "GPTBot" https://yoursite.com/page | grep "your content"

If content returns, you’re good.

Method 4: Google Rich Results Test

  • Tests rendered content
  • Shows what Googlebot sees
  • Note: it executes JavaScript, so it can show more than most AI crawlers actually see

After implementing SSR: Run these tests again. Content should be visible in all methods.

Pro tip: Set up monitoring to catch regressions. SSR can break without obvious symptoms.
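
One lightweight way to do that monitoring is a scheduled script that fetches key pages with a bot user agent and fails if a known content string is missing; a sketch assuming Node 18+ (the URL and marker are placeholders):

// check-ssr.js - run from cron or CI; exits non-zero if SSR regresses
const pages = [
  { url: 'https://yoursite.com/page', marker: 'a phrase from that page' },
];

(async () => {
  let failed = false;
  for (const { url, marker } of pages) {
    const res = await fetch(url, { headers: { 'User-Agent': 'GPTBot' } });
    const html = await res.text();
    if (!html.includes(marker)) {
      console.error(`SSR regression: "${marker}" missing from ${url}`);
      failed = true;
    }
  }
  process.exit(failed ? 1 : 0);
})();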

PerformanceImpact_Lisa · January 4, 2026

Performance considerations with SSR:

SSR adds server load:

  • Each request needs server rendering
  • More compute than serving static files
  • Caching becomes critical

Mitigation strategies:

  1. Static generation where possible

    • Blog posts, docs = Static
    • Dynamic content = SSR
  2. Incremental Static Regeneration (ISR)

    • Rebuild static pages on a schedule
    • Best of both worlds (see the sketch after this list)
  3. Edge rendering

    • Render at CDN edge
    • Faster TTFB globally
  4. Caching layers

    • Full-page caching
    • Component-level caching
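
For the ISR option above, the Next.js App Router version is a one-line route segment export; a minimal sketch (the route, API URL, and interval are illustrative placeholders):

// app/blog/[slug]/page.js
export const revalidate = 3600; // re-generate this static page at most once per hour

export default async function Post({ params }) {
  const res = await fetch(`https://api.example.com/posts/${params.slug}`); // placeholder API
  const post = await res.json();
  return <article><h1>{post.title}</h1></article>;
}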

The trade-off: SSR costs more in compute but gains AI visibility. For most businesses, the visibility is worth the infrastructure investment.

Monitoring: Track TTFB after implementing SSR. If it’s slow, bots may time out before getting content.

FrontendDev_Alex OP Lead Developer at SaaS Company · January 3, 2026

This confirmed the problem and gave clear solutions. Our action plan:

Immediate (This week):

  1. Audit current rendering with curl tests
  2. Identify content pages most important for AI visibility
  3. Review robots.txt for AI bot access

Short-term (Next quarter):

  1. Begin Next.js migration for content pages
  2. Implement SSR/SSG for blog, docs, and marketing pages
  3. Keep dashboard/app as client-rendered

Implementation approach:

  1. Start with highest-value content pages
  2. Test AI visibility after each batch
  3. Use ISR for frequently-updated content
  4. Monitor TTFB throughout

Technical decisions:

  • Next.js App Router with Server Components
  • Static generation for documentation
  • SSR for blog and marketing
  • Client components only where needed

Testing plan:

  1. curl tests after each deployment
  2. View source verification
  3. Monitor AI citations over time
  4. Track which pages get cited

Key insight: Client-side rendering = invisible to AI. SSR/SSG = visible. The migration is non-optional for AI visibility.

Thanks everyone - clear path forward now!


Frequently Asked Questions

Does JavaScript affect AI crawling?
Yes, significantly. Most AI crawlers do not execute JavaScript. Content rendered only by client-side JavaScript is invisible to GPTBot, PerplexityBot, and other AI crawlers. They see only the initial HTML response.
What's the solution for JavaScript-heavy sites?
Server-Side Rendering (SSR), Static Site Generation (SSG), or prerendering services ensure content is in the initial HTML response. This makes content visible to AI crawlers that don’t execute JavaScript.
Do all AI crawlers have the same JavaScript limitations?
Most AI crawlers don’t execute JavaScript. GPTBot, PerplexityBot, and ClaudeBot request HTML and parse it directly. Googlebot does execute JavaScript (for traditional search), but Google AI Overviews may still prefer static content.
How can I test if AI crawlers can see my content?
View your page source (not DevTools) and check if content is present. Disable JavaScript and reload - if content disappears, AI crawlers can’t see it. Use curl to fetch your page and check the response.
