
Rendering-strategy crawl-efficiency tradeoff

Definition

The rendering-strategy crawl-efficiency tradeoff is the observation that different rendering strategies (SSG / ISR / SSR / CSR) produce materially different Google-facing capabilities, and the right strategy for a page depends on which capabilities matter most for that page.

Vercel + MERJ's 2024-08-01 study closes with an explicit capability-matrix table that names six Google-facing axes and grades each strategy.

The capability matrix (canonical table)

| Capability | SSG | ISR | SSR | CSR |
| --- | --- | --- | --- | --- |
| Crawl efficiency (how quickly and effectively Google can access, render, and retrieve pages) | Excellent | Excellent | Very Good | Poor |
| Discovery (finding new URLs to crawl)* | Excellent | Excellent | Excellent | Average |
| Rendering completeness (no errors, no blocked resources, no render failures) | Robust | Robust | Robust | Might fail |
| Rendering time (how long Google takes to fully render and process the page) | Excellent | Excellent | Excellent | Poor |
| Link structure evaluation (how Google assesses links) | After rendering | After rendering | After rendering | After rendering — may be missing if render fails |
| Indexing (storing and organising content) | Robust | Robust | Robust | Might not be indexed if rendering fails |

* An updated sitemap.xml largely eliminates time-to-discovery differences across strategies. See concepts/sitemap.

(Source: sources/2024-08-01-vercel-how-google-handles-javascript-throughout-the-indexing-process.)

Why SSG / ISR / SSR cluster together

For three of the four strategies, the initial HTML response contains the indexable content. Googlebot's crawl stage therefore has full access to:

  • All text content (indexing — "Robust")
  • All link targets in <a href>s (discovery — "Excellent")
  • noindex / canonical / structured data signals (at the right stage — pre-render)
  • Enough asset URLs to avoid render-time surprises (rendering completeness — "Robust")

WRS still runs (universal rendering) but its output on these pages is effectively the initial HTML plus trivial client-side hydration. Rendering time — the part that costs the most in the queue — is cheap.
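The crawl-stage contrast can be sketched as a toy model. The regex-based extraction below is purely illustrative (Googlebot uses a real HTML parser), and the HTML samples are invented:

```typescript
// Hypothetical sketch of what the crawl stage can see in the initial
// HTML response, before any rendering. Names are illustrative.

interface CrawlStageView {
  textLength: number;    // indexable text available pre-render
  linkTargets: string[]; // <a href> URLs available for discovery
}

// Naive extraction for illustration only.
function crawlStageView(initialHtml: string): CrawlStageView {
  const linkTargets = [...initialHtml.matchAll(/<a\s[^>]*href="([^"]+)"/g)]
    .map((m) => m[1]);
  const text = initialHtml
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
  return { textLength: text.length, linkTargets };
}

// SSG/ISR/SSR: content and links are present in the first response.
const serverRendered = `<html><body><h1>Product</h1>
  <p>Full description here.</p><a href="/related">Related</a></body></html>`;

// CSR: empty shell; everything arrives only after client-side JS runs.
const csrShell = `<html><body><div id="root"></div>
  <script src="/app.js"></script></body></html>`;

console.log(crawlStageView(serverRendered)); // pre-render text + /related link
console.log(crawlStageView(csrShell));       // near-empty view, no links
```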

Why CSR is the outlier

CSR (an SPA with an empty HTML shell and all content rendered client-side):

  • Initial HTML is empty / near-empty. Content is absent from the crawl-stage view. Link discovery finds only what's in the shell.
  • Rendering completeness can fail. If a JS file is blocked by robots.txt, if an API call fails, if an error bubbles past the error boundary, the render produces an empty or broken DOM. Content doesn't get indexed.
  • Rendering time is worst. Full Chromium execution, full sub-resource fetch, full JS execution, full async-work settlement — all required for indexable content.
  • Discovery degrades to "Average" because new URLs only appear after render; without a sitemap they wait for the queue.
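That failure chain can be sketched as a hypothetical model (not how WRS is implemented; the step names are invented):

```typescript
// Illustrative model of the CSR failure surface: indexable content only
// exists if every client-side step succeeds.

type CsrRenderStep = "fetchJs" | "executeJs" | "fetchApi";

// Returns the DOM content Googlebot would see after rendering,
// given which steps failed.
function simulateCsrRender(failures: Set<CsrRenderStep>): string {
  if (failures.has("fetchJs")) return "";   // bundle blocked by robots.txt
  if (failures.has("executeJs")) return ""; // error bubbled past the boundary
  if (failures.has("fetchApi")) return "";  // data fetch failed: empty state
  // Happy path: the client render produces the content.
  return "<h1>Product</h1><p>Full description.</p>";
}
// By contrast, SSG/ISR/SSR content sits in the HTTP response itself and
// has no equivalent chain of client-side failure points.
```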

The study's experimental evidence: on nextjs.org (a well-operated site), 100% of pages, including CSR pages, rendered successfully. The tradeoff is not "CSR won't index" but "CSR has the highest failure surface area", and on large sites it also pays a crawl-budget tax.

Sitemap as the equaliser

An up-to-date sitemap.xml with <lastmod> tags largely eliminates the discovery gap across rendering strategies. All four strategies benefit from a sitemap; CSR benefits most because it's the one where link discovery is weakest.
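A minimal sketch of such a sitemap, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each URL is discoverable without waiting for render-time link extraction -->
  <url>
    <loc>https://example.com/products/widget</loc>
    <lastmod>2024-08-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/gadget</loc>
    <lastmod>2024-07-15</lastmod>
  </url>
</urlset>
```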

See concepts/sitemap for the canonical rendering-strategy-neutrality observation.

ISR / streaming SSR specifically

Neither ISR (concepts/incremental-static-regeneration) nor streaming SSR (concepts/streaming-ssr) is disadvantaged:

  • ISR: serves cached HTML on cache hit; on cache miss, renders on-demand server-side and caches. Googlebot sees server-rendered HTML either way. The source study confirms ISR pages on nextjs.org render fine.
  • Streaming SSR: flushes HTML chunks as <Suspense> boundaries resolve. Googlebot captures the full stream including the streamed boundaries — confirmed empirically for React Server Components.
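The ISR hit/miss behaviour above can be sketched as a simplified model (this ignores Next.js's stale-while-revalidate details; the class and names are illustrative, not the framework's API):

```typescript
// Simplified ISR model: a cache hit serves stored HTML; a miss or expiry
// triggers an on-demand server render whose result is cached.
// Either way, the response Googlebot receives is server-rendered HTML.

interface CacheEntry { html: string; renderedAt: number }

class IsrCache {
  private cache = new Map<string, CacheEntry>();
  constructor(
    private revalidateMs: number,
    private renderOnServer: (path: string) => string, // server-side render fn
  ) {}

  get(path: string, now: number): string {
    const entry = this.cache.get(path);
    if (entry && now - entry.renderedAt < this.revalidateMs) {
      return entry.html; // cache hit: cached server-rendered HTML
    }
    const html = this.renderOnServer(path); // cache miss: render on demand
    this.cache.set(path, { html, renderedAt: now });
    return html;
  }
}

const isr = new IsrCache(
  60_000,
  (p) => `<html><body><h1>${p}</h1></body></html>`,
);
const first = isr.get("/products/widget", 0);       // miss: rendered on demand
const second = isr.get("/products/widget", 30_000); // hit: same cached HTML
```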

What this doesn't argue

  • CSR is not universally bad. For highly interactive apps with little SEO incentive (dashboards, authenticated tools), CSR is fine; the tradeoff only bites when SEO matters.
  • SSR is not always better than CSR. SSR pays server cost per page render; on pages with no SEO or personalisation value, the cost may not buy anything.
  • Per-page strategy choice is the right granularity. Many sites mix strategies: SSG for marketing, ISR for product catalogue, SSR for user-specific pages, CSR for the dashboard.
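That mixed-strategy layout can be sketched as a hypothetical route-to-strategy map (the patterns and chooser function are invented for illustration, not any framework's API):

```typescript
// Hypothetical per-page strategy map for a mixed site.

type Strategy = "SSG" | "ISR" | "SSR" | "CSR";

const routeStrategies: Array<[RegExp, Strategy, string]> = [
  [/^\/(about|pricing)?$/, "SSG", "marketing: static, SEO-critical"],
  [/^\/products\//, "ISR", "catalogue: SEO-critical, changes occasionally"],
  [/^\/account\//, "SSR", "user-specific content per request"],
  [/^\/dashboard/, "CSR", "authenticated tool: no SEO value"],
];

function chooseStrategy(path: string): Strategy {
  for (const [pattern, strategy] of routeStrategies) {
    if (pattern.test(path)) return strategy;
  }
  return "SSR"; // conservative default for unknown routes
}

console.log(chooseStrategy("/products/widget")); // "ISR"
console.log(chooseStrategy("/dashboard/stats")); // "CSR"
```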
