From error-based detection to promise-driven architecture for optimal web performance
At last year's Next.js Conf, we announced a new rendering paradigm, Partial Prerendering (or PPR). Building on Next.js's strong support for data and route caching, PPR lets Next.js serve the static parts of a page immediately while still server-rendering the components that require access to request data.
Since announcing experimental support for the feature, we've iterated on some of the core mechanisms that Next.js uses to detect request data access through a new experimental flag, dynamicIO.
Before diving into PPR's implementation details, let's establish why this feature matters. In modern web development, we face a fundamental tension between performance and functionality: static pages can be generated ahead of time and served instantly from a CDN, but they can't respond to per-request data; dynamic pages can read cookies, headers, and other request information, but must be rendered on every request.

This dichotomy forces developers into an all-or-nothing decision at the route level. Even accessing a single cookie marks an entire page as dynamic, sacrificing the performance benefits of static generation.
PPR's design centers on optimizing Core Web Vitals, particularly Largest Contentful Paint (LCP). The goal is sub-2.5-second LCP times, though ideally we're targeting hundreds of milliseconds. PPR achieves this by serving static shells immediately while fetching dynamic content in parallel.
When we announced PPR in Next.js 14, it relied on throwing errors to signal to Next.js that request data was accessed. We assumed it would work as well as other APIs we have, such as redirect() and notFound(), which still use error throwing as a signal today. The issue arose for developers migrating existing applications who now had errors being thrown in unexpected places.
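For instance, consider a retry helper of the kind many data layers ship. The sketch below is hypothetical (PostponeSignal stands in for Next.js's internal signaling error), but it shows how a catch-all retry loop treats the signal as just another transient failure:

```typescript
// Hypothetical driver-style retry loop; PostponeSignal stands in for
// the internal error Next.js threw when request data was accessed.
class PostponeSignal extends Error {}

async function withRetries<T>(op: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await op();
    } catch (err) {
      lastError = err; // the signal is swallowed here like any network hiccup...
      await new Promise((r) => setTimeout(r, 2 ** i * 10)); // ...and we back off
    }
  }
  throw lastError; // ...so it only resurfaces after every backoff has run
}

let tries = 0;
withRetries(async () => {
  tries++;
  throw new PostponeSignal("accessed cookies() during prerender");
}).catch((err) => {
  console.log(tries, err instanceof PostponeSignal); // → 3 true
});
```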
This was a common pattern that we found even within popular database drivers handling unreliable backends. The retry loop with backoffs would eventually throw the error we used to signal request data access (cookies() in this case), but only after running through the set of timeouts, dramatically slowing down builds.
Additionally, we discovered cases where fallbacks were returned when operations failed or, worse, the error was shadowed:
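Both failure modes can be sketched as follows (hypothetical helpers; PostponeSignal again stands in for the internal signaling error):

```typescript
// Two patterns that hide the internal signal (illustrative helpers).
class PostponeSignal extends Error {}

function getUserLocale(read: () => string): string {
  try {
    return read(); // may throw the internal signal
  } catch {
    return "en-US"; // fallback returned: the signal never propagates
  }
}

function getUserTheme(read: () => string): string {
  try {
    return read();
  } catch {
    // The original error is discarded entirely, shadowed by a new one.
    throw new Error("Failed to read theme");
  }
}

const locale = getUserLocale(() => { throw new PostponeSignal(); });
console.log(locale); // → "en-US", and Next.js never sees the signal
```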
In these cases, Next.js couldn't determine which components accessed request data, as it relied on the surrounding suspense boundary to catch the special error that was thrown.
We initially tried to solve this problem by introducing the unstable_rethrow(err) API:
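Usage looked roughly like the pattern below. To keep the sketch self-contained, unstable_rethrow is modeled as a check against a stand-in signal class; the real API is imported from next/navigation and recognizes all of Next.js's internal errors:

```typescript
// Simplified model of the unstable_rethrow pattern (not Next.js internals).
class PostponeSignal extends Error {}

function unstable_rethrow(err: unknown): void {
  if (err instanceof PostponeSignal) throw err; // let internal signals escape
}

function readCookieOrDefault(read: () => string): string {
  try {
    return read();
  } catch (err) {
    unstable_rethrow(err); // must come before any app-level error handling
    return "fallback";     // only genuine application errors reach this line
  }
}

console.log(readCookieOrDefault(() => { throw new Error("network"); })); // → "fallback"
// readCookieOrDefault(() => { throw new PostponeSignal(); }) rethrows the signal
```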
This would re-throw internal errors that Next.js uses for signaling. However, this introduced a new problem: it required developers to remember to add it. Every call site that could potentially generate a Next.js internal error would need this re-throw function inserted before any other throw in the application code. We felt this wasn't an acceptable developer experience.
We were left with two options. Since we could detect that request data was accessed, but not where, we could either mark the entire page as dynamic and provide a warning to developers, or rethink the detection mechanism. The first option wasn't ideal, as it might eliminate the entire static shell on revalidation, depending on the application's logic. We decided on the latter approach.
We needed a way to suspend the processing of user code when it tried to access request data. Errors were our first choice, but the problems with nested try/catch blocks meant that the signals intended to inform Next.js about request data access were unreliable, and could lead to longer-than-expected build times due to retry logic.
But if we can't throw an error, then what primitives do we have left?
The answer? Promises.
Developers are already adept at managing Promises. With the advent of React Server Components, for the first time, users could write async components, allowing data fetching to be colocated, all while using the native APIs that they were already familiar with, such as fetch.
The key insight was to leverage JavaScript's inherent asynchronous behavior. Instead of throwing errors that could be caught and mishandled, we needed our request APIs to return promises that would never resolve during a static prerender. Unlike a thrown error, a pending promise can't be accidentally caught, swallowed, or retried by surrounding application code; it simply suspends rendering.
This shift required updating all request APIs to be asynchronous:
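As a rough model of the new behavior: during a static prerender the API hands back a promise that never settles, while at request time it resolves normally. The real API is `await cookies()` from next/headers in Next.js 15; everything below is a stand-in:

```typescript
// Stand-in model of a promise-based request API (not the real next/headers).
type CookieStore = Map<string, string>;

function makeCookies(prerendering: boolean, store: CookieStore) {
  return (): Promise<CookieStore> =>
    prerendering
      ? new Promise<CookieStore>(() => {}) // hangs forever: no error to mishandle
      : Promise.resolve(store);            // request time: resolves immediately
}

const atRequestTime = makeCookies(false, new Map([["theme", "dark"]]));
atRequestTime().then((jar) => console.log(jar.get("theme"))); // → "dark"

const atPrerender = makeCookies(true, new Map());
Promise.race([
  atPrerender().then(() => "resolved"),
  new Promise<string>((r) => setTimeout(() => r("still pending"), 20)),
]).then(console.log); // → "still pending"
```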
The real innovation comes from understanding how the Node.js event loop works. Node.js is single-threaded, meaning it can only run one block of synchronous code at a time. When it needs to perform I/O operations, it offloads them to native code.
Here's the crucial insight: non-deterministic I/O can never complete within the same event-loop task that initiated it.
Next.js takes advantage of this by using a clever scheduling mechanism:
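A minimal model of that scheduling, using Node.js's setImmediate to queue the prerender in one task and the abort in the next (a sketch of the idea, not Next.js's actual implementation):

```typescript
// Task 1 prerenders; task 2 aborts. Already-resolved (cached) promises settle
// via microtasks before task 2 runs, but pending I/O cannot.
const controller = new AbortController();
const order: string[] = [];

const cached = Promise.resolve("header"); // primed cache: already resolved
const database = new Promise<string>((res) => setTimeout(() => res("cart"), 5)); // pending I/O

async function prerender(signal: AbortSignal) {
  order.push(`static: ${await cached}`); // microtask: completes inside task 1
  const row = await database;            // suspends until a later task, too late
  if (signal.aborted) {
    order.push("dynamic: postponed");    // the abort already ran in task 2
    return;
  }
  order.push(`dynamic: ${row}`);
}

setImmediate(() => { void prerender(controller.signal); }); // task 1: prerender
setImmediate(() => controller.abort());                     // task 2: abort

setTimeout(() => console.log(order), 20);
// → [ 'static: header', 'dynamic: postponed' ]
```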
This approach schedules a task to prerender the application, then immediately schedules another task to abort it. Node.js processes all microtasks (like resolving already-resolved promises) before moving to the next Task, allowing synchronous operations to complete while preventing asynchronous I/O from executing.
One challenge remained: how to include external data in the static shell? Data from a CMS or database requires async fetches that won't complete within a single event loop task.
The solution is the prospective render:
During build time, Next.js performs a prospective render to fill cache entries for any cacheable data. This render's output is discarded, but the caches are primed. When the actual prerender occurs, these cached values can be resolved synchronously within the first task, allowing them to be included in the static shell.
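A toy version of this two-pass approach, with a Map standing in for Next.js's data cache:

```typescript
// Prospective render primes the cache; the actual prerender then resolves
// the same data synchronously (via microtasks) within a single task.
const cache = new Map<string, string>();

async function fetchProducts(): Promise<string> {
  if (cache.has("products")) return cache.get("products")!; // primed: no real I/O
  const data = await new Promise<string>((r) => setTimeout(() => r("product list"), 5));
  cache.set("products", data);
  return data;
}

async function render(): Promise<string> {
  return `<shell>${await fetchProducts()}</shell>`;
}

(async () => {
  await render();          // prospective render: output discarded, cache primed
  const shell = render();  // actual prerender: cache hit only
  const winner = await Promise.race([
    shell,
    new Promise<string>((r) => setTimeout(() => r("missed the first task"), 0)),
  ]);
  console.log(winner); // → "<shell>product list</shell>", resolved via microtasks alone
})();
```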
This leads to two categories of pages in PPR. Pages that access uncached data or request information require the resume render to stream in dynamic content at request time. Pages where all async operations are cached can be served entirely from the edge without any origin invocation.
PPR uses a single HTTP response stream to deliver both static and dynamic content, minimizing roundtrips. This timing is critical: while the browser downloads the static resources (CSS, JS) referenced by the static shell, the server is already rendering the dynamic content.
Let's examine how PPR works in practice with a real-world example:
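The page might look like the following sketch (assumes the App Router; component bodies are reduced to placeholders):

```tsx
import { Suspense } from "react";
import { cookies } from "next/headers";

function Header() {
  return <header>My Store</header>;
}

function ProductListing() {
  return <main>All products…</main>;
}

function CartSkeleton() {
  return <div>Loading cart…</div>;
}

async function Cart() {
  // Reading cookies makes this component, and only this component, dynamic.
  const cartId = (await cookies()).get("cartId")?.value;
  return <div>Cart #{cartId}</div>;
}

export default function Page() {
  return (
    <>
      <Header />
      <Suspense fallback={<CartSkeleton />}>
        <Cart />
      </Suspense>
      <ProductListing />
    </>
  );
}
```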
In this example, Header and ProductListing are static and included in the initial shell; Cart accesses cookies, making it dynamic; and CartSkeleton is rendered in Cart's place as part of the static shell.

Currently, PPR requires two experimental flags:
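In current canaries, enabling them in next.config.ts looks roughly like this (the exact flag shapes are experimental and may change):

```ts
// next.config.ts (experimental flags; shape may change between releases)
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  experimental: {
    ppr: "incremental", // opt routes in with `export const experimental_ppr = true`
    dynamicIO: true,    // enables the promise-based detection described above
  },
};

export default nextConfig;
```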
The dynamicIO flag enables the promise-based detection mechanism. The plan is to eventually remove this flag as PPR stabilizes.
A special consideration applies to pathname-based APIs: with traditional rendering, dynamic paths like /products/[id] would force the entire page to be dynamic. With PPR and async params:
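The resulting shape looks roughly like this (a sketch; params is delivered as a Promise in Next.js 15):

```tsx
// app/products/[id]/page.tsx (sketch)
export default async function ProductPage({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  // Awaiting params postpones only the parts of the page that depend on it.
  const { id } = await params;
  return <h1>Product {id}</h1>;
}
```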
This enables fallback static shells that work for any product ID, providing static-like performance even for dynamic routes.
The performance benefits of PPR are substantial: the static shell can be served immediately from the edge, pushing LCP toward the hundreds-of-milliseconds target, while dynamic content streams in over the same response without extra roundtrips.

While PPR offers significant performance benefits, it's important to understand its current limitations.
Currently, the single-stream approach works on Vercel and self-hosted deployments, but broader CDN support is planned once PPR stabilizes. This will involve creating standardized protocols for CDNs to handle the static shell and dynamic streaming pattern.
While PPR represents a significant advancement, it remains experimental. The Next.js team is working towards stabilizing the feature, including removing the dynamicIO flag requirement.

Partial Prerendering represents a paradigm shift in how we think about web application rendering. By combining static shells with streaming dynamic content, PPR offers fast initial loads without giving up access to per-request data.
The journey from error-based detection to promise-based mechanisms showcases the thoughtful evolution of this feature. While challenges remain around platform compatibility and production readiness, PPR represents the future of web application rendering: one where developers no longer need to choose between speed and functionality.
As we move forward, the goal is clear: make PPR the default rendering model for web applications, bringing together the best of static site generation and dynamic delivery without compromising on either. The technical foundations are solid, the performance benefits are clear, and the developer experience continues to improve with each iteration.
For teams looking to push the boundaries of web performance while maintaining rich, dynamic functionality, Partial Prerendering offers a compelling path forward. While it may still be experimental, the principles and patterns it introduces are already shaping how we think about modern web architecture.