Next.js 15 is the release where the App Router finally feels finished. The async request APIs that broke everyone's code in the canary phase landed cleanly, partial prerendering is usable, server actions are past the "demo only" stage, and Turbopack production builds have moved from stunt to default. Next.js 16 followed in early 2026 and refined the cache model, but most of the patterns that matter for teams running in production today were set in 15. Here is what changed, what finally works, and the places where migrating teams still get burned.
The headline change: async everything
The largest breaking change in Next.js 15 is that dynamic request APIs — cookies(), headers(), draftMode(), and route params and searchParams — all return Promises instead of synchronous values. The upgrade codemod does most of the mechanical work, but any code that assumed synchronous access now has to be awaited. This is not arbitrary. Separating the prerender phase from the render phase is what unlocks partial prerendering; the API shape had to change to make the render model honest.
```tsx
// Next.js 14 — synchronous, still worked during the deprecation window
export default function Page({ params }: { params: { id: string } }) {
  return <Listing id={params.id} />;
}
```

```tsx
// Next.js 15+ — params is a Promise. Await it, or use React's use().
export default async function Page({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  const { id } = await params;
  return <Listing id={id} />;
}
```

```tsx
// In a Client Component, reach for use() instead of await.
"use client";

import { use } from "react";

export function Client({ params }: { params: Promise<{ id: string }> }) {
  const { id } = use(params);
  return <span>{id}</span>;
}
```

In practice the migration is less painful than the canary noise suggested. What catches teams is not the signature change — it is the assumption that params and searchParams are synchronous buried in utility functions three layers deep. The codemod rewrites the obvious call sites; the subtle ones require a grep and a half-hour of cleanup per project.
Caching: from implicit to explicit
Next.js 14's cache model was notoriously hard to reason about. GET Route Handlers cached by default, fetch() cached by default, the Client Router Cache held pages for five minutes — and every one of those defaults bit someone in production. Next.js 15 flipped the defaults: GET Route Handlers are uncached by default, the Client Router Cache has a staleTime of zero by default, and explicit opt-in is the pattern. The new use cache directive and the cacheComponents flag formalize the model. You declare what is cached, for how long, with what tags — or nothing is cached.
```ts
// Opt a server function into the data cache with an explicit lifetime and tag.
import { unstable_cacheLife as cacheLife, unstable_cacheTag as cacheTag } from "next/cache";

export async function getListing(id: string) {
  "use cache";
  cacheLife("hours");
  cacheTag(`listing:${id}`);
  const row = await db.listing.findUnique({ where: { id } });
  return row;
}
```

```ts
// Invalidate on write.
import { revalidateTag } from "next/cache";

export async function updateListing(id: string, data: Partial<Listing>) {
  await db.listing.update({ where: { id }, data });
  revalidateTag(`listing:${id}`);
}
```

Cache gotcha: use cache only captures the arguments of the wrapped function, not anything closed over. Reading cookies(), headers(), or auth state inside a cached function quietly invalidates the cache key model and will serve the wrong data to the wrong user. Pass everything the function depends on as an argument, or keep the auth-sensitive work outside the cached boundary.
The related flag — use cache: private — lets you cache per-user data without leaking it between requests. That pattern landed in the 15.x line and is the right default for anything that depends on session identity. For production observability, NEXT_PRIVATE_DEBUG_CACHE=1 prints the cache key, hit/miss status, and tag list to the server log; turn it on in staging while you migrate and leave it off in production.
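A per-user cached function under that directive might look like the following sketch; getNotifications and the db client are hypothetical, and the exact directive string should be verified against your Next.js version:

```typescript
import { unstable_cacheLife as cacheLife, unstable_cacheTag as cacheTag } from "next/cache";

export async function getNotifications(userId: string) {
  "use cache: private";
  cacheLife("minutes");
  cacheTag(`notifications:${userId}`);
  // userId is part of the cache key because it is an argument,
  // and the private directive keeps the entry out of shared caches.
  return db.notification.findMany({ where: { userId } });
}
```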
Partial prerendering, finally usable
Partial prerendering (PPR) is the feature Next.js spent two years building toward. The idea: a single route renders a static shell at build time, streams in the dynamic holes at request time, and the user sees the fast paint while the personalized parts fill in. In Next.js 15 the feature works; in Next.js 16 it became the default rendering model under the "cache components" naming. The pattern that matters for production is simple — wrap the dynamic parts in Suspense with a real fallback, let the static shell ship instantly, and stop arguing about whether a route should be SSG or SSR.
```tsx
// Static shell, dynamic hole. Header and nav are prerendered; the user widget streams in.
import { Suspense } from "react";
import { UserMenu, UserMenuSkeleton } from "./user-menu";

export default function Layout({ children }: { children: React.ReactNode }) {
  return (
    <>
      <header className="border-b">
        <nav>...</nav>
        <Suspense fallback={<UserMenuSkeleton />}>
          <UserMenu />
        </Suspense>
      </header>
      <main>{children}</main>
    </>
  );
}
```

Server actions past the demo phase
Server actions — async functions annotated with "use server" that can be called directly from Client Components — went from experimental in Next.js 13 to stable and production-ready in 15. The ergonomic win is real: for most mutation flows, a server action plus useActionState removes the entire route-handler-plus-fetch-wrapper layer. The operational win is conditional on how you secure them. A server action is a publicly reachable endpoint with an opaque ID. Authentication, rate limiting, and input validation are not optional because the function happens to be imported from a component.
- Always validate inputs with a schema library. Zod or Valibot inside the action is the baseline.
- Always check the session at the top of the action. Do not assume the caller is authenticated because the component rendering the form requires auth.
- Wire rate limiting to the action handler, not just to the route. A burst of calls to a server action will happily overwhelm a database the route-level limiter never saw.
- Return errors as structured data, not thrown exceptions, so useActionState can render them predictably.
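The checklist compresses into a shape like this sketch. ActionResult, parseListingInput, and updateListingAction are illustrative names; in practice the hand-rolled validator would be a Zod or Valibot schema, and the commented-out session check would call your own auth helper:

```typescript
"use server";

// Structured result so useActionState can render errors predictably.
type ActionResult<T> = { ok: true; data: T } | { ok: false; error: string };

// Dependency-free stand-in for a Zod/Valibot schema.
function parseListingInput(raw: unknown): { title: string; price: number } | null {
  if (typeof raw !== "object" || raw === null) return null;
  const { title, price } = raw as Record<string, unknown>;
  if (typeof title !== "string" || title.trim() === "") return null;
  if (typeof price !== "number" || !Number.isFinite(price) || price < 0) return null;
  return { title, price };
}

export async function updateListingAction(
  raw: unknown
): Promise<ActionResult<{ title: string; price: number }>> {
  // 1. Authenticate first: the action is a publicly reachable endpoint.
  //    (requireSession is a hypothetical helper.)
  // const session = await requireSession();
  // if (!session) return { ok: false, error: "unauthenticated" };

  // 2. Validate before touching the database.
  const input = parseListingInput(raw);
  if (!input) return { ok: false, error: "invalid input" };

  // 3. Rate limit and mutate here; return data instead of throwing.
  return { ok: true, data: input };
}
```

The key design choice is that the action never throws for expected failures: useActionState receives `{ ok: false, error }` as ordinary state and renders it, while genuine bugs still surface through the error boundary.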
Turbopack: stable, finally
Turbopack in Next.js 15 shipped development builds as stable and production builds as beta. By Next.js 16 it is the default bundler for new projects and production builds work without a flag. On moderate-sized applications the numbers hold up: roughly 2–5x faster production builds and 5–10x faster Fast Refresh compared to webpack. The practical caveat is that some webpack plugins have no Turbopack equivalent yet — if your project depends on exotic loaders or a custom webpack config, the migration is a real project, not an opt-in flag.
Next.js 13 to 15: the migration pain points
| Area | What changed | Effort |
|---|---|---|
| Dynamic APIs | cookies(), headers(), params, searchParams now return Promises | Low — codemod handles the obvious calls; grep for utility functions |
| Route Handler caching | GET handlers default to uncached; must opt into cache explicitly | Medium — audit every Route Handler, add cache() or use cache as needed |
| Client Router Cache | staleTime defaults to 0; prefetched pages revalidate on navigation | Low — set staleTime in next.config.js if you need the old behavior |
| NextRequest.geo / .ip | Removed; hosting provider must supply these via headers | Low — codemod, but verify headers on your deployment target |
| fetch() caching | No longer cached by default; opt into cache: 'force-cache' explicitly | Medium — common source of silent perf regressions after migration |
| App Router mental model | Client/Server Component boundary, Suspense-first rendering, async data | High for teams migrating from Pages Router — this is the real cost |
| Turbopack | Now default for dev; production builds stable in 16 | Low to medium — depends on how exotic your webpack config is |
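For teams that need the old Client Router Cache behavior back, the escape hatch mentioned in the table is a config knob. A sketch, assuming the experimental.staleTimes shape from the 15.x line (verify the key names against your version):

```typescript
// next.config.ts: restore non-zero client router cache lifetimes.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  experimental: {
    // Seconds a prefetched page is reused before revalidating on navigation.
    staleTimes: {
      dynamic: 30,
      static: 180,
    },
  },
};

export default nextConfig;
```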
What still hurts
- Middleware runs on the Edge runtime by default, which is still a subset of Node. Packages that assume Node APIs — crypto, buffer, some database drivers — will not load there, and the error messages are not always helpful.
- The cache tag API is powerful but undersold. Teams routinely rebuild their own cache invalidation layer because the docs for revalidateTag and cacheTag land too far apart.
- Error boundaries in server components are still rougher than in classic React. A server-side throw inside a Suspense boundary bubbles to error.tsx as an opaque digest; proper observability requires wiring up the error hook to your tracer.
- Streaming SSR interacts awkwardly with some analytics scripts and CSP headers. Budget a day to get Content-Security-Policy right once you enable PPR.
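For the opaque-digest problem above, the minimum useful error boundary surfaces the digest so user reports can be correlated with server logs; a sketch of an app-router error.tsx:

```typescript
"use client";

// Next.js passes the thrown error (carrying the server-side digest)
// and a reset callback that re-renders the segment.
export default function Error({
  error,
  reset,
}: {
  error: Error & { digest?: string };
  reset: () => void;
}) {
  return (
    <div>
      <p>Something went wrong. Reference: {error.digest ?? "unknown"}</p>
      <button onClick={reset}>Try again</button>
    </div>
  );
}
```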
If the team is migrating from the Pages Router, treat the move as a six-to-twelve-week project on anything beyond a small app, not a weekend upgrade. The breaking changes in 15 are mostly mechanical; the mental model shift to server components is the real work.
What we run in production today
On new client projects we default to Next.js 16 on Node runtime, cache components enabled, Turbopack for both dev and prod, server actions for all mutations behind a validation-plus-auth helper, and a disciplined use of use cache with explicit cacheLife and cacheTag on every data-fetch function. We keep middleware thin — session lookup and redirect only — and push any logic that needs Node APIs into route handlers. PPR is on for marketing routes and logged-out flows; logged-in routes ship as fully dynamic because the caching gains are not worth the reasoning cost on per-user surfaces.
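The thin-middleware rule amounts to something like this sketch; the cookie name and the /app prefix are assumptions for illustration:

```typescript
import { NextResponse, type NextRequest } from "next/server";

// Session lookup and redirect only. Anything that needs Node APIs
// (database drivers, node:crypto) lives in route handlers instead.
export function middleware(request: NextRequest) {
  const session = request.cookies.get("session")?.value;
  if (!session && request.nextUrl.pathname.startsWith("/app")) {
    return NextResponse.redirect(new URL("/login", request.nextUrl));
  }
  return NextResponse.next();
}

export const config = { matcher: ["/app/:path*"] };
```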
Key takeaways
- The async API migration is mechanical. Run the codemod, grep for residual synchronous access, move on.
- Caching went from implicit to explicit. Fewer surprises, more boilerplate. Opt in with use cache, cacheLife, and cacheTag on every data-fetch function that is safe to cache.
- Server actions are production-ready, but they are public endpoints. Validate, authenticate, and rate-limit every one.
- Partial prerendering is the right default for mixed static/dynamic pages. Wrap dynamic holes in Suspense with meaningful fallbacks.
- Turbopack is stable and faster. The migration cost is proportional to how customized your webpack config was.