Partial Prerendering

From error-based detection to promise-driven architecture for optimal web performance

September 1, 2025

At last year's Next.js Conf, we announced a new rendering paradigm, Partial Prerendering (or PPR). Building on Next.js's strong support for data and route caching, PPR optimizes the way Next.js serves static parts of a page quickly while still allowing server rendering of components that require access to request data.

Since announcing experimental support for the feature, we've iterated on the core mechanism Next.js uses to detect request data access, now available behind a new experimental flag, dynamicIO.

Understanding the Problem Space

Before diving into PPR's implementation details, let's establish why this feature matters. In modern web development, we face a fundamental tension between performance and functionality:

  • Static rendering delivers content blazingly fast from the edge but lacks access to request-specific data
  • Dynamic rendering provides full functionality but requires server computation, increasing time to first byte

This dichotomy forces developers into an all-or-nothing decision at the route level. Even accessing a single cookie marks an entire page as dynamic, sacrificing the performance benefits of static generation.
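
For example (a hypothetical route; Dashboard stands in for your own component), a single cookie read is enough to opt the whole route out of static generation under traditional rendering:

import { cookies } from "next/headers"

// Hypothetical page: this one cookie read makes the entire route dynamic
// under traditional rendering, even though most of the page never uses it.
export default async function Page() {
  const theme = (await cookies()).get("theme")
  return <Dashboard theme={theme?.value} />
}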

Core Web Vitals: The North Star

PPR's design centers around optimizing Core Web Vitals, particularly:

  • Time to First Byte (TTFB): How quickly the server responds with initial content
  • Largest Contentful Paint (LCP): When the main content becomes visible

The goal is sub-2.5-second LCP times, though ideally we're targeting hundreds of milliseconds. PPR achieves this by serving static shells immediately while fetching dynamic content in parallel.

Legacy Partial Prerendering

When we announced PPR in Next.js 14, it relied on throwing errors to signal that request data had been accessed. We assumed this would work as well as it does for other APIs, such as redirect() and notFound(), which still use error throwing as a signal today. The issue arose for developers migrating existing applications, who suddenly had errors thrown from unexpected places.

import { setTimeout } from "node:timers/promises"
import { cookies } from "next/headers"

async function getPosts() {
  for (let i = 0; i < 3; i++) {
    try {
      // In Next.js 14, this threw an error during
      // prerendering.
      const session = cookies().get("session")
      // ...
    } catch (err) {
      if (i === 2) {
        throw err
      }

      // Here we have a timeout added when a
      // fetch failed.
      await setTimeout(500 + i * 500)
    }
  }
}

export default async function Page() {
  const posts = await getPosts()

  // ...
}

This was a common pattern that we found even within popular database drivers handling unreliable backends. The retry loop with backoffs would eventually throw the error we used to signal request data access (cookies() in this case), but only after running through the set of timeouts, dramatically slowing down builds.

Additionally, we discovered cases where fallbacks were returned when operations failed or, worse, the error was shadowed:

async function checkAuthorization() {
  try {
    const session = cookies().get("session")
    // ...
  } catch (err) {
    throw new Error("AUTHORIZATION_CHECK_FAILED")
  }
}

In these cases, Next.js couldn't determine which components accessed request data, as it relied on the surrounding suspense boundary to catch the special error that was thrown.

We initially tried to solve this problem by introducing the unstable_rethrow(err) API:

import { unstable_rethrow } from "next/navigation"

async function checkAuthorization() {
  try {
    const session = cookies().get("session")
    // ...
  } catch (err) {
    unstable_rethrow(err)

    throw new Error("AUTHORIZATION_CHECK_FAILED")
  }
}

This would re-throw internal errors that Next.js uses for signaling. However, this introduced a new problem: it required developers to remember to add it. Every call site that could potentially generate a Next.js internal error would need this re-throw function inserted before any other throw in the application code. We felt this wasn't an acceptable developer experience.

We were left with two options. Since we could detect that request data was accessed, but not where, we could either mark the entire page as dynamic and provide a warning to developers, or rethink the detection mechanism. The first option wasn't ideal, as it might eliminate the entire static shell on revalidation, depending on the application's logic. We decided on the latter approach.

Modern Partial Prerendering

We needed a way to suspend the processing of user code when it tried to access request data. Errors were our first choice, but the problems with nested try/catch blocks meant that the signals intended to tell Next.js about request data access were unreliable and could lead to longer-than-expected build times due to retry logic.

But if we can't throw an error, then what primitives do we have left?

The answer? Promises.

Developers are already adept at managing Promises. With the advent of React Server Components, for the first time, users could write async components, allowing data fetching to be colocated, all while using the native APIs that they were already familiar with, such as fetch.
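
For instance (a sketch with a placeholder endpoint), an async Server Component colocates its data fetching with the markup it renders:

// Sketch of an async Server Component; the endpoint is a placeholder.
async function LatestPosts() {
  const res = await fetch("https://example.com/api/posts")
  const posts: { id: string; title: string }[] = await res.json()
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  )
}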

The Promise-Based Approach

The key insight was to leverage JavaScript's inherent asynchronous behavior. Instead of throwing errors that could be caught and mishandled, we needed our request APIs to return promises that would never resolve during a static prerender. Unlike a thrown error, a pending promise can't be swallowed by a surrounding try/catch:

// Before - synchronous with error throwing
function cookies() {
  if (isPrerendering) {
    throw new PostponeError("Cannot access cookies during prerendering")
  }
  return getCookies()
}

// After - asynchronous with promises
async function cookies() {
  if (isPrerendering) {
    // Return a promise that never resolves
    return new Promise(() => {})
  }
  return getCookies()
}

This shift required updating all request APIs to be asynchronous:

// Next.js 15 Request APIs
await cookies()
await headers()
await connection() // replaces unstable_noStore()

Leveraging the Node.js Event Loop

The real innovation comes from understanding how the Node.js event loop works. Node.js is single-threaded, meaning it can only run one block of synchronous code at a time. When it needs to perform I/O operations, it offloads them to native code.

Here's the crucial insight: non-deterministic I/O can never complete within the same Task that initiated it.

Next.js takes advantage of this by using a clever scheduling mechanism:

import { prerender } from 'react-dom/static.edge'

const controller = new AbortController()
const { prelude, postponed } = await new Promise((resolve, reject) => {
  let result
  // Task 1: kick off the prerender.
  setImmediate(() => {
    try {
      result = prerender(<App />, { signal: controller.signal })
      resolve(result)
    } catch (err) {
      reject(err)
    }
  })

  // Task 2: abort before any asynchronous I/O can resolve.
  setImmediate(() => {
    controller.abort()
    resolve(result)
  })
})

This approach schedules a task to prerender the application, then immediately schedules another task to abort it. Node.js processes all microtasks (like resolving already-resolved promises) before moving to the next Task, allowing synchronous operations to complete while preventing asynchronous I/O from executing.
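
Here's a minimal, standalone Node.js sketch of why this ordering works: continuations of already-settled promises run as microtasks before the next setImmediate task, while fresh asynchronous I/O cannot complete that quickly.

// Standalone sketch (not Next.js code).
setImmediate(async () => {
  // An already-settled promise resumes via the microtask queue,
  // so this completes before the second task below runs. Real network
  // I/O could not settle this quickly; its continuation would land in
  // a later task, after the abort.
  const cached = await Promise.resolve("cache hit")
  console.log(cached) // logs first
})

setImmediate(() => {
  // Runs only once the microtasks above have drained; this is the point
  // at which Next.js aborts the prerender.
  console.log("abort") // logs second
})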

The Prospective Render Strategy

One challenge remained: how to include external data in the static shell? Data from a CMS or database requires async fetches that won't complete within a single event loop task.

The solution is the prospective render:

import { unstable_cache } from 'next/cache'

// Using Next.js cache APIs
async function getData() {
  'use cache'
  const res = await fetch('...')
  return res.json()
}

// Or with unstable_cache
const getData = unstable_cache(async () => {
  const res = await fetch('...')
  return res.json()
})

During build time, Next.js performs a prospective render to fill cache entries for any cacheable data. This render's output is discarded, but the caches are primed. When the actual prerender occurs, these cached values can be resolved synchronously within the first task, allowing them to be included in the static shell.
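
As a heavily simplified sketch of that two-pass flow (not the actual Next.js internals; runAbortedPrerender is a hypothetical wrapper around the setImmediate/abort dance shown earlier):

// Simplified sketch, not the real implementation.

// Pass 1: prospective render. The rendered output is discarded; its only
// purpose is to execute cacheable functions ('use cache' / unstable_cache)
// so their cache entries get filled.
await runAbortedPrerender(<App />).catch(() => {
  // Errors are ignored; we only wanted the side effect of priming caches.
})

// Pass 2: the actual prerender. Cached reads now resolve from memory as
// microtasks, so their data lands in the static shell before the abort.
const { prelude, postponed } = await runAbortedPrerender(<App />)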

Partially Static vs. Fully Static Pages

This leads to two categories of pages in PPR:

Partially Static Pages

Pages that access uncached data or request information:

async function Page() {
  const res = await fetch('...') // No cache
  const data = await res.json()
  return (
    <Suspense fallback={<Skeleton />}>
      <Component data={data} />
    </Suspense>
  )
}

These require the resume render to stream in dynamic content.

Fully Static Pages

Pages where all async operations are cached:

async function Page() {
  const data = await getData() // Cached
  return <Component data={data} />
}

These can be served entirely from the edge without any origin invocation.

The Streaming Architecture

PPR uses a single HTTP response stream to deliver both static and dynamic content. This approach minimizes roundtrips and optimizes performance:

  1. Static shell streams immediately from the edge
  2. Resume render starts concurrently at the origin
  3. Dynamic content streams in as it becomes ready
  4. Single response contains everything

This timing is critical: while the browser downloads static resources (CSS, JS) hinted by the static shell, the server is already rendering dynamic content.
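
Conceptually (a hypothetical edge handler; getStaticShell and resumeAtOrigin stand in for platform-specific machinery), the pattern amounts to flushing the cached shell bytes first and piping the origin's resume stream into the same response:

// Hypothetical edge handler sketch; getStaticShell and resumeAtOrigin
// are stand-ins for platform-specific pieces.
export default async function handler(request: Request): Promise<Response> {
  const shell: Uint8Array = await getStaticShell(request) // cached static shell bytes
  const dynamic: ReadableStream<Uint8Array> = await resumeAtOrigin(request)

  const { readable, writable } = new TransformStream<Uint8Array, Uint8Array>()

  void (async () => {
    // Flush the static shell immediately, then append the dynamic content
    // from the origin as it streams in, all within one HTTP response.
    const writer = writable.getWriter()
    await writer.write(shell)
    writer.releaseLock()
    await dynamic.pipeTo(writable)
  })()

  return new Response(readable, {
    headers: { "content-type": "text/html; charset=utf-8" },
  })
}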

Practical Implementation

Let's examine how PPR works in practice with a real-world example:

import { Suspense } from "react"
import { cookies } from "next/headers"

async function getCart() {
  const jar = await cookies()
  // Fetch cart data based on session
  return fetchCartData(jar.get("session"))
}

async function Cart() {
  const cart = await getCart()
  return <CartDisplay items={cart.items} />
}

export default function Page() {
  return (
    <div>
      <Header /> {/* Static - included in shell */}

      <Suspense fallback={<CartSkeleton />}>
        <Cart /> {/* Dynamic - streamed later */}
      </Suspense>

      <ProductListing /> {/* Static - included in shell */}
    </div>
  )
}

In this example:

  • Header and ProductListing are static and included in the initial shell
  • Cart accesses cookies, making it dynamic
  • The CartSkeleton is part of the static shell
  • The actual cart content streams in once available

Enabling Dynamic I/O

Currently, PPR requires two experimental flags:

// next.config.js
module.exports = {
  experimental: {
    ppr: true,
    dynamicIO: true
  }
}

The dynamicIO flag enables the promise-based detection mechanism. The plan is to eventually remove this flag as PPR stabilizes.

Dynamic Path Parameters

A special consideration for pathname-based APIs: with traditional rendering, dynamic paths like /products/[id] would force the entire page to be dynamic. With PPR and async params:

type Props = {
  params: Promise<{ id: string }>
}

export default function Page({ params }: Props) {
  return (
    <Suspense fallback={<ProductDetailsSkeleton />}>
      <ProductDetails params={params} />
    </Suspense>
  )
}

This enables fallback static shells that work for any product ID, providing static-like performance even for dynamic routes.
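
For completeness, a hypothetical ProductDetails (getProduct is a stand-in data fetch): awaiting the params promise is what defers this subtree past the static shell, while the rest of the page remains prerendered.

// Hypothetical child component; getProduct stands in for your data layer.
async function ProductDetails({ params }: Props) {
  const { id } = await params
  const product = await getProduct(id)
  return <h1>{product.name}</h1>
}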

Performance Implications

The performance benefits of PPR are substantial:

  • Reduced TTFB: Static shells serve immediately from edge locations
  • Improved LCP: Critical content loads faster through parallel fetching
  • Better perceived performance: Users see content immediately instead of waiting
  • Efficient caching: Static portions cache effectively at CDN edge nodes

Developer Experience Considerations

While PPR offers significant performance benefits, it's important to understand its impact on development:

Benefits

  • No API changes required - existing Suspense boundaries just work
  • Granular control over static/dynamic boundaries
  • Automatic optimization without manual configuration

Current Limitations

  • Experimental status means potential breaking changes
  • Requires careful Suspense boundary placement
  • Debugging can be more complex with streaming responses
  • Limited platform support outside Vercel and self-hosting

Future Developments

CDN Interoperability

Currently, the single-stream approach works on Vercel and self-hosted deployments, but broader CDN support is planned once PPR stabilizes. This will involve creating standardized protocols for CDNs to handle the static shell and dynamic streaming pattern.

Production Readiness

While PPR represents a significant advancement, it remains experimental. The Next.js team is working towards:

  • Removing the dynamicIO flag requirement
  • Improving developer experience for larger codebases
  • Expanding platform compatibility
  • Stabilizing the API surface

Conclusion

Partial Prerendering represents a paradigm shift in how we think about web application rendering. By combining static shells with streaming dynamic content, PPR offers:

  • Optimal TTFB through edge-served static content
  • Improved LCP via parallel data fetching
  • Better UX with immediate static content display
  • Developer ergonomics using familiar React patterns

The journey from error-based detection to promise-based mechanisms showcases the thoughtful evolution of this feature. While challenges remain around platform compatibility and production readiness, PPR represents the future of web application rendering - one where developers no longer need to choose between speed and functionality.

As we move forward, the goal is clear: make PPR the default rendering model for web applications, bringing together the best of static site generation and dynamic delivery without compromising on either. The technical foundations are solid, the performance benefits are clear, and the developer experience continues to improve with each iteration.

For teams looking to push the boundaries of web performance while maintaining rich, dynamic functionality, Partial Prerendering offers a compelling path forward. While it may still be experimental, the principles and patterns it introduces are already shaping how we think about modern web architecture.

Hey, I'm Wyatt! I work on Next.js at Vercel. I'm passionate about open source and building secure, impactful software. Want to chat about better software or collaborate on a project? Find me on GitHub or BlueSky! If you have any comments or questions about this post, please reach out via email.
