Client-side rendering (CSR)

An architecture in which the server sends a minimal HTML shell and a JavaScript bundle, and the browser builds the page in place by executing JavaScript.

Also known as: CSR

Client-side rendering (CSR) is an architecture in which the server sends a minimal HTML document plus one or more JavaScript files. The browser executes the JavaScript, which fetches data, builds the page’s content, and inserts it into the DOM. The browser does the rendering work that a server would do under SSR.

How CSR works

A typical CSR page load:

  1. The browser requests the page
  2. The server returns a small HTML file (often just <div id="app"></div> plus script tags)
  3. The browser downloads the JavaScript bundle
  4. The JavaScript runs, fetches any required data from APIs, and builds the page’s content
  5. The page becomes visible and interactive

Subsequent navigation within the application can be very fast because the JavaScript is already loaded; only data needs to be fetched.
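The steps above can be sketched in a few lines of client-side JavaScript. The endpoint `/api/posts` and the data shape are hypothetical; the point is that the markup is built in the browser from fetched data, not sent by the server.

```javascript
// CSR sketch: the server sends only <div id="app"></div> plus this script.
// The /api/posts endpoint and {title, body} shape are illustrative assumptions.
function renderPosts(posts) {
  // Build the page's markup from data, entirely in the browser.
  return posts
    .map((p) => `<article><h2>${p.title}</h2><p>${p.body}</p></article>`)
    .join("");
}

// In a browser, the bundle would fetch data and fill the empty shell:
// fetch("/api/posts")
//   .then((res) => res.json())
//   .then((posts) => {
//     document.getElementById("app").innerHTML = renderPosts(posts);
//   });
```

Until the script runs, the user sees the empty shell, which is why CSR's first paint tends to lag SSR's.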

Common CSR architectures

CSR is the default rendering model for traditional single-page applications (SPAs) built with frameworks like:

  • React (without SSR)
  • Vue (without SSR)
  • Angular
  • Svelte (without SSR)
  • Ember
  • Backbone and older Knockout-based applications

Most of these frameworks also support SSR or static rendering; the distinction is whether the initial HTML is built on the server or in the browser.

CSR vs SSR vs static

Aspect                  | CSR                        | SSR                           | Static
Where HTML is built     | Browser                    | Server                        | Build server
Initial page weight     | Small HTML + larger JS     | Full HTML                     | Full HTML
First Contentful Paint  | Slower                     | Faster                        | Fastest
In-app navigation       | Often very fast            | Can require server round-trip | Requires re-fetching
SEO indexing            | Requires JS-aware crawlers | Reliable                      | Reliable
Hosting                 | Static host or CDN         | Application server            | Static host or CDN

Performance characteristics

CSR pages tend to:

  • Have a slower First Contentful Paint and Largest Contentful Paint than SSR or static
  • Have larger initial JavaScript payloads
  • Feel responsive after the initial load, especially for app-like navigation
  • Benefit from code splitting, lazy loading, and modern JavaScript build tools to reduce bundle size

Performance varies widely depending on bundle size, network conditions, and device capability.
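Code splitting usually relies on dynamic `import()`, which bundlers turn into separately loaded chunks. A minimal sketch of the caching behavior it provides, with hypothetical module paths shown in comments:

```javascript
// Lazy-loading sketch: wrap a loader so the module is fetched only on first
// use, then reused. Dynamic import() gives you this caching for free; this
// just makes the idea explicit. Module paths below are illustrative.
function lazy(loader) {
  let cached;
  return () => (cached ??= loader()); // first call loads, later calls reuse
}

// Typical usage with a bundler (each path becomes its own chunk):
// const loadSettings = lazy(() => import("./settings-page.js"));
// navigating to /settings would call loadSettings() and mount the module.
```

Splitting the bundle this way shrinks the initial payload, which directly improves First Contentful Paint on slow networks and devices.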

SEO considerations

Search engines must execute JavaScript to see the content of a CSR page. Google’s crawler does execute JavaScript, but rendering is queued separately from initial crawling, which can delay indexing. Other search engines and many social media link previewers do not execute JavaScript at all.

For SEO-sensitive content, common patterns include:

  • Pre-rendering critical pages as static HTML (Next.js, Nuxt static export)
  • Using SSR for the initial load, then hydrating with client-side JavaScript
  • Server-rendering only the HTML head (titles, meta tags) and rendering the body client-side
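The last pattern above can be sketched as a small server-side helper that emits only the head markup crawlers and link previewers need, while the body remains client-rendered. The function and metadata shape are illustrative, not a framework API:

```javascript
// Sketch: server-render only titles and meta tags for crawlers and link
// previewers; the <body> stays a CSR shell. renderHead and its input shape
// are assumptions for illustration.
function renderHead({ title, description }) {
  return [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}">`,
    `<meta property="og:title" content="${title}">`,
  ].join("\n");
}
```

Because most social previewers read only the head and never execute JavaScript, this narrow slice of server rendering covers much of the SEO risk.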

When CSR tends to fit

  • Web applications behind a login (admin dashboards, internal tools)
  • Highly interactive apps where initial render speed is less important than in-app experience
  • Cases where SEO is not relevant (the content is gated or app-like)

When other approaches tend to fit better

  • Public, SEO-sensitive content (marketing sites, blogs, documentation)
  • Pages where time-to-content matters most (landing pages)
  • Devices with slower CPUs or networks where large JS bundles are costly

Hybrid approaches

Most modern frameworks blend CSR with SSR or static rendering. The HTML is server-rendered or pre-built; the JavaScript “hydrates” the page to enable interactivity. This combines fast initial load with rich in-app behavior.
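Conceptually, hydration means the client script attaches behavior to markup that already exists instead of rebuilding it. A toy sketch of that idea (not a real framework API; frameworks like React do this via `hydrateRoot`):

```javascript
// Hydration sketch: the HTML arrived pre-rendered from the server, so the
// client only wires up event listeners on the existing elements. The
// selector-to-handler mapping here is an illustrative assumption.
function hydrate(root, handlers) {
  let attached = 0;
  for (const [selector, onClick] of Object.entries(handlers)) {
    for (const el of root.querySelectorAll(selector)) {
      el.addEventListener("click", onClick); // no DOM rebuilding, just behavior
      attached++;
    }
  }
  return attached; // how many listeners were attached to server-sent markup
}

// In a browser: hydrate(document.getElementById("app"), { "button.buy": handleBuy });
```

The page is visible as soon as the HTML arrives; interactivity switches on once hydration finishes.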

Common misconceptions

  • “CSR is faster than SSR.” CSR is typically slower for first contentful paint but can feel faster for in-app navigation after load.
  • “All React apps use CSR.” React supports CSR, SSR, and static rendering; modern frameworks like Next.js and Remix default to SSR or hybrid.
  • “CSR is bad for SEO.” It can be problematic, but Google handles JavaScript content; the main risks are indexing latency and non-Google crawlers.