Why Gemini Sees Your JavaScript Content and ChatGPT Doesn't
TL;DR: Google's crawlers execute JavaScript, so Gemini can see client-rendered content. ChatGPT's OAI-SearchBot, ClaudeBot, and PerplexityBot mostly rely on static HTML. If your content loads via JavaScript after page load, it's likely invisible to most AI search engines. The fix: server-side rendering or static generation for any content you want AI systems to cite.
There's a visibility gap that most people don't realize exists.
Your React app looks great. Content renders beautifully in the browser. Google indexes it fine. But when someone asks ChatGPT about your product, you get nothing. No citation. No mention.
The culprit isn't your content quality or your AEO strategy. It's how your content gets delivered to the page.
The Rendering Gap
When a web crawler visits your page, it receives HTML. What happens next depends on the crawler.
JavaScript-executing crawlers (like Googlebot) download the HTML, then run your JavaScript in a headless browser environment. They wait for your React components to mount, your API calls to complete, and your content to populate the DOM. Then they index what they see.
Static crawlers (like most AI bots) download the HTML and index it directly. They don't run JavaScript. If your content isn't in the initial HTML response, it doesn't exist to them.
This is why the same page can be fully indexed by Google but completely invisible to ChatGPT.
Which AI Crawlers Execute JavaScript?
Here's the current landscape:
| Crawler | JavaScript Execution | Notes |
|---|---|---|
| Googlebot | Yes | Full rendering queue, executes JS |
| Google-Extended | Yes | Same rendering as Googlebot |
| OAI-SearchBot | Limited | Primarily static HTML |
| GPTBot | Limited | Training crawler, static-focused |
| ClaudeBot | No | Relies on Brave Search index |
| PerplexityBot | Limited | Some JS support, not comprehensive |
| Bingbot | Yes | Full rendering, powers Copilot |
| Applebot | Partial | Basic JS support |
Google spent years building the Web Rendering Service, the infrastructure that lets Googlebot execute JavaScript at scale. Most AI companies haven't made that investment. They're optimizing for speed and scale, not rendering complexity.
Why This Matters for AI Visibility
Consider a typical single-page application:
<!-- What the crawler receives -->
<html>
  <body>
    <div id="root"></div>
    <script src="/bundle.js"></script>
  </body>
</html>
To Googlebot, this page eventually becomes your full product description, feature list, and pricing table after JavaScript executes.
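Here's a sketch of what that rendered DOM might look like once the JavaScript has run (the product copy is invented for illustration):

<!-- The DOM after JavaScript executes -->
<html>
  <body>
    <div id="root">
      <h1>Acme Widget Pro</h1>
      <p>The fastest way to ship widgets. Plans start at $49/month.</p>
    </div>
    <script src="/bundle.js"></script>
  </body>
</html>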
To OAI-SearchBot, this page is empty. There's nothing to index. Nothing to cite.
This isn't hypothetical. We've seen companies with solid Google rankings get zero AI citations because their content lives entirely in client-rendered components.
Gemini's Advantage
Google's Gemini has a significant advantage here. Because it shares infrastructure with Google Search, it has access to the rendered content that Googlebot already indexed.
When Gemini answers a question, it can pull from pages that required JavaScript execution to render. ChatGPT, Claude, and Perplexity are working from a more limited index—pages where the content was available in the initial HTML.
This creates an uneven playing field. A question asked to Gemini might surface your content. The same question to ChatGPT might not.
The Framework Problem
Modern JavaScript frameworks ship with different rendering strategies, and the defaults often hurt AI visibility:
React (Create React App): Client-side only. Empty HTML shell. Invisible to most AI crawlers.
Next.js: Supports SSR, SSG, and ISR. Good defaults if you use them. The Pages Router pre-renders by default (static unless you opt into SSR with getServerSideProps). The App Router uses Server Components by default, which is better for AI visibility.
Nuxt/Vue: Nuxt renders on the server by default (universal rendering); a plain Vue SPA is client-only and has the same problem as CRA.
Gatsby: Static generation by default. Good for AI crawlers if content is in the build.
SvelteKit: SSR by default. Generally AI-crawler friendly.
The pattern: frameworks that render on the server (or at build time) work well with AI crawlers. Frameworks that rely on client-side rendering don't.
How To Check Your AI Visibility
Before fixing anything, verify the problem exists for your site:
1. View Source vs. Inspect Element
Open your page in Chrome. Right-click and "View Page Source." This is what crawlers see (roughly).
Now right-click and "Inspect Element." This is the rendered DOM after JavaScript.
If key content appears in Inspect but not View Source, you have a rendering problem.
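You can run the same comparison from the browser's DevTools console. This is a rough sketch: it re-fetches the page's raw HTML and checks whether a phrase exists there versus in the live DOM (the phrase is a placeholder; use real on-page copy):

// Paste into the DevTools console on the page you're testing
const phrase = 'your key phrase'  // placeholder

const raw = await fetch(location.href).then(r => r.text())
console.log('In raw HTML (what static crawlers see):', raw.includes(phrase))
console.log('In rendered DOM (what users see):', document.body.innerText.includes(phrase))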
2. Test With curl
curl -s https://yoursite.com/important-page | grep "your key phrase"
If your content doesn't appear, static crawlers won't see it.
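To check more than one page at a time, a small Node script (18+, for built-in fetch) can run the same grep across your key URLs. This is a sketch: the URLs and phrases are placeholders, and the user-agent string is illustrative, so check each vendor's docs for the current value:

// check-visibility.mjs - run with: node check-visibility.mjs
// Each entry pairs a URL with a phrase that should appear in the raw HTML.
const checks = [
  { url: 'https://yoursite.com/important-page', phrase: 'your key phrase' },
  { url: 'https://yoursite.com/pricing', phrase: 'per month' },
]

for (const { url, phrase } of checks) {
  const res = await fetch(url, {
    // Illustrative UA; the exact string is documented by each vendor
    headers: { 'User-Agent': 'OAI-SearchBot/1.0' },
  })
  const html = await res.text()
  console.log(html.includes(phrase) ? 'OK  ' : 'MISS', url)
}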
3. Use Google's Rich Results Test
Google's Rich Results Test shows both the raw HTML and the rendered HTML. Compare them to see what content requires JavaScript.
4. Check AI Responses Directly
Ask ChatGPT with web search enabled about your product. Ask Gemini the same question. If Gemini cites you but ChatGPT doesn't, rendering is likely the issue.
The Fixes
Server-Side Rendering (SSR)
Render your pages on the server before sending HTML to the client. The content is in the initial response. All crawlers see it.
In Next.js App Router, Server Components do this by default:
// This content is in the HTML response
export default async function ProductPage() {
  const product = await getProduct()
  return <ProductDetails product={product} />
}
Static Site Generation (SSG)
Pre-render pages at build time. Even better for crawlers since there's no server computation on each request.
// Next.js - generates static HTML at build
export async function generateStaticParams() {
  const products = await getAllProducts()
  return products.map(p => ({ slug: p.slug }))
}
Incremental Static Regeneration (ISR)
For content that changes frequently, ISR gives you static benefits with periodic updates:
export const revalidate = 3600 // Regenerate every hour
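In the App Router, that single export on a page is enough. A fuller sketch, where getAllProducts and ProductGrid are hypothetical helpers:

// app/products/page.js - static HTML, regenerated at most once an hour
export const revalidate = 3600

export default async function ProductsPage() {
  const products = await getAllProducts()
  return <ProductGrid products={products} />
}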
Hybrid Approach
Not everything needs SSR. Static crawlers don't interact with your app—they just need the content. Render critical content server-side, keep interactive features client-side:
// Server Component - content in HTML
async function ProductInfo({ id }) {
  const product = await getProduct(id)
  return <div>{product.description}</div>
}

// Client Component - interactive, not needed for indexing
'use client'
function AddToCartButton({ productId }) {
  return <button onClick={() => addToCart(productId)}>Add to Cart</button>
}
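The two compose naturally: a Server Component page renders the content and mounts the interactive island. A sketch, with a hypothetical file layout:

// app/products/[id]/page.js - a Server Component page
// ProductInfo is the Server Component above; AddToCartButton lives in its
// own 'use client' file and is imported here.
import AddToCartButton from './AddToCartButton'

export default function ProductPage({ params }) {
  return (
    <>
      <ProductInfo id={params.id} />
      <AddToCartButton productId={params.id} />
    </>
  )
}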
What About Hydration?
Server-side rendering with hydration is fine. The content exists in the initial HTML (crawlers see it), then JavaScript "hydrates" it for interactivity (users get the full experience).
The key is that the content is present before JavaScript runs.
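Concretely, the server response for a hydrated page looks something like this sketch: the content is already in the markup, and the script only attaches interactivity afterward.

<!-- SSR output: content present before any JavaScript runs -->
<html>
  <body>
    <div id="root">
      <h1>Acme Widget Pro</h1>
      <button>Add to Cart</button> <!-- inert until hydration -->
    </div>
    <script src="/bundle.js"></script> <!-- hydrates the markup above -->
  </body>
</html>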
Dynamic Content and API Calls
If your page fetches content from an API after load, that content is invisible to static crawlers.
Bad for AI visibility:
'use client'
import { useState, useEffect } from 'react'

function ProductList() {
  const [products, setProducts] = useState([])
  useEffect(() => {
    fetch('/api/products').then(r => r.json()).then(setProducts)
  }, [])
  return products.map(p => <Product key={p.id} {...p} />)
}
Better:
// Server Component - data fetched before HTML sent
async function ProductList() {
  const products = await fetch('https://api.example.com/products').then(r => r.json())
  return products.map(p => <Product key={p.id} {...p} />)
}
The Meta Tags Problem
A related issue: meta tags generated by JavaScript.
If your title, description, or Open Graph tags are set client-side, crawlers won't see them. This affects not just AI visibility but social sharing and traditional SEO.
// Bad - client-side meta tags (static crawlers never see these)
'use client'
import { useEffect } from 'react'

function ProductHead({ product }) {
  useEffect(() => {
    document.title = product.name
  }, [product])
  return null
}

// Good - server-side meta tags (Next.js)
export async function generateMetadata({ params }) {
  const product = await getProduct(params.id)
  return { title: product.name, description: product.description }
}
Priority List
If you're fixing rendering issues, prioritize:
- Landing pages - These are what you want AI to cite
- Product/service pages - Direct answers to "what is X" queries
- Documentation - Technical content AI often references
- Blog posts - Long-tail query targets
Lower priority:
- Dashboards and app interfaces (users, not crawlers)
- Forms and checkout flows
- User-generated content feeds
The Longer Term
AI crawlers are likely to improve their JavaScript rendering over time. OpenAI, Anthropic, and Perplexity have the resources and incentive to build better indexing infrastructure.
But that's not today's reality. Today, if you want visibility across all AI search engines—not just Gemini—your content needs to be in the HTML.
The good news: fixing this for AI crawlers also improves traditional SEO, page load performance, and accessibility. Server-rendered content loads faster and works without JavaScript.
Frequently Asked Questions
Does Google share its index with Gemini?
Not directly shared, but they use similar infrastructure. Gemini has access to rendered content from Google's crawling and indexing systems, which include JavaScript-executed pages.
Will ChatGPT eventually render JavaScript?
Possibly. OpenAI could build rendering infrastructure for OAI-SearchBot, but it's expensive and complex at scale. For now, assume they don't.
Is this why my competitors rank in ChatGPT and I don't?
It could be one factor. If they're using server-side rendering and you're not, their content is visible to ChatGPT while yours isn't. Check both sites' HTML source to compare.
Does this affect Perplexity?
Perplexity has some JavaScript rendering capability but it's not comprehensive. Server-rendered content is still more reliably indexed.
Should I stop using React?
No. Use React with server-side rendering (Next.js, Remix) or static generation. The framework isn't the problem—it's client-only rendering that causes issues.
How do I know if my Next.js app is rendering server-side?
Check View Source on your deployed site. If your content is in the HTML, you're good. If you see an empty div with just script tags, you're client-rendering.
Want to test whether AI systems are citing your content? Try our Citation Analyzer to see how your pages perform in ChatGPT search results.
Related Articles
Is AIO, AEO, LLMO, GEO Different from SEO? Yes, It Really Is
Why optimizing for AI search is fundamentally different from traditional SEO, and what industry leaders are saying about the shift to Answer Engine Optimization.
How ChatGPT Citations Work: A Complete Guide
Learn how ChatGPT cites sources in its responses, why your content might not be showing up, and what you can do to improve your visibility in AI search results.
From SEO to AEO: Why Marketing Engineers Are Winning in 2026
Search is moving from Google links to ChatGPT answers. Here's why the new strategy isn't keywords. It's Answer Engine Optimization (AEO) and community authority.