Next.js has six caching layers. They conflict with each other. Router Cache serves stale data after mutations. Data Cache ignores your revalidate settings. The new use cache directive that shipped in Next.js 15 is confusing everyone who tries to use it alongside existing caching strategies. You are not imagining it — Next.js caching in 2026 is genuinely broken out of the box, and the framework’s own documentation admits it has been “confusing.” Here is exactly what is going wrong, how to fix the three bugs everyone hits, and why a proper L1 cache layer is the only way to make Next.js caching actually reliable.
The 6 Caching Layers That Fight Each Other
Next.js does not have one cache. It has six, and they operate at different levels of the stack with different lifecycles, different invalidation rules, and different defaults. Understanding why your data is stale requires understanding which layer is holding onto it — and in many cases, multiple layers are caching the same data with conflicting TTLs.
- Request Memoization. Deduplicates identical `fetch()` calls within a single server render. Lasts for one request lifecycle. Harmless on its own, but developers confuse it with actual caching and expect it to persist across requests. It does not.
- Data Cache. Persists `fetch()` responses across requests and deployments on the server. This is the layer that ignores your `revalidate` setting when the Full Route Cache has already captured the entire page. The Data Cache stores the raw fetch response, but the Full Route Cache stores the rendered HTML, so even if the Data Cache refreshes, the page does not.
- Full Route Cache. Caches the rendered HTML and React Server Component payload for static routes at build time. This is server-side. Once a route is fully cached here, individual Data Cache revalidations do not trigger a re-render unless you explicitly call `revalidatePath()` or `revalidateTag()`.
- Router Cache. Client-side. Stores React Server Component payloads in the browser for 30 seconds (dynamic pages) or 5 minutes (static pages). This is the layer that serves stale data after a mutation, because it has no idea a Server Action just changed the underlying data. It was so problematic that Next.js 15 reduced the default TTL, but it still caches, and it still serves stale content.
- `use cache` Directive. Introduced in Next.js 15 as an experimental feature, now more widely adopted. Lets you mark entire components or functions as cacheable with `"use cache"` at the top of the file. Designed to replace the old `fetch`-level caching model. The problem: it interacts unpredictably with the Data Cache and Full Route Cache. If a `use cache` function calls `fetch()` with its own `revalidate`, which cache wins? The answer depends on the rendering mode, the route segment config, and whether you are in a layout or a page. It is not documented clearly.
- Fetch Cache (`force-cache` / `no-store`). Per-fetch configuration that tells Next.js how to treat individual `fetch()` calls. In theory, `no-store` bypasses the Data Cache. In practice, the Full Route Cache and Router Cache can still serve stale versions of the page that contains that fetch, making `no-store` appear to do nothing.
These layers do not coordinate. A `revalidateTag()` call updates the Data Cache on the server, but the Router Cache on the client still serves the old payload for up to 30 seconds. And if the Full Route Cache captured the page at build time, the re-render never triggers until the next request after revalidation.
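To make the per-fetch knobs concrete, here is a minimal sketch of the two `fetch()` cache options discussed above, written as plain option objects (the URLs and the `"products"` tag name are illustrative, not from the original article):

```typescript
// The two per-fetch knobs that feed (or bypass) the Data Cache.

// Cached in the Data Cache for 60 seconds and tagged so it can be
// purged on demand with revalidateTag("products"):
const cachedOpts = { next: { revalidate: 60, tags: ["products"] } };

// Opts this one call out of the Data Cache entirely. Note: the Full
// Route Cache and the client Router Cache can still serve a stale
// page around it, which is why no-store can appear to do nothing.
const liveOpts = { cache: "no-store" as const };

// Usage inside a Server Component (not executed here):
//   await fetch("https://api.example.com/products", cachedOpts);
//   await fetch("https://api.example.com/stock", liveOpts);
```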
The 3 Bugs Everyone Hits
1. Stale Data After Mutation (Router Cache)
You submit a form via a Server Action. The action updates the database. You call `revalidatePath('/dashboard')`. You navigate back to `/dashboard`. The old data is still there. You refresh the page. Now it is correct. This is the Router Cache. It cached the Server Component payload on the client when you first visited `/dashboard`, and your server-side `revalidatePath()` call has no way to reach into the browser and invalidate it.
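A sketch of the workaround: call `router.refresh()` on the client after the action resolves. Here `updateProfile` stands in for a hypothetical Server Action that writes to the database and calls `revalidatePath("/dashboard")` on the server; only `useRouter` and `router.refresh()` are the actual Next.js APIs.

```typescript
// app/profile-form.tsx -- client component wrapping a Server Action.
"use client";

import { useRouter } from "next/navigation";
import { updateProfile } from "./actions"; // hypothetical Server Action

export function ProfileForm() {
  const router = useRouter();
  return (
    <form
      action={async (formData: FormData) => {
        await updateProfile(formData); // server-side caches revalidated here
        router.refresh(); // clears the client-side Router Cache too
      }}
    >
      <input name="name" />
      <button type="submit">Save</button>
    </form>
  );
}
```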
2. Fetch Not Revalidating (Data Cache Override)
You set `revalidate: 60` on a `fetch()` call inside a Server Component. You wait 90 seconds. The data has not changed. The problem: your route segment config has `export const dynamic = 'force-static'` (or the route was statically analyzed as fully static at build time). The Full Route Cache captured the entire rendered page, including all fetch results, at build time. Your per-fetch `revalidate: 60` is irrelevant because the page itself is never re-rendered.
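A minimal sketch of the fix, assuming a dashboard page whose API URL is illustrative: force the route out of the Full Route Cache so the per-fetch setting can take effect.

```typescript
// app/dashboard/page.tsx -- sketch of fix 2. With "force-static", the
// Full Route Cache freezes the page at build time and revalidate: 60
// never takes effect. Forcing dynamic rendering (or simply removing
// the static override) lets the Data Cache setting win.
export const dynamic = "force-dynamic"; // was: "force-static"

export default async function Dashboard() {
  const res = await fetch("https://api.example.com/stats", {
    next: { revalidate: 60 }, // now honored: Data Cache refreshes every 60s
  });
  const stats = await res.json();
  return <pre>{JSON.stringify(stats, null, 2)}</pre>;
}
```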
3. ISR Pages Stuck (Full Route Cache)
You are using Incremental Static Regeneration. You set `revalidate: 300` on a page. Five minutes pass. The page does not update. The first visitor after the 300-second window sees the stale page. Only the second visitor gets the fresh version. This is by design: ISR uses a stale-while-revalidate pattern where the first request after expiry triggers a background regeneration, but serves the stale version. The fix is to use on-demand revalidation via `revalidateTag()` triggered by a webhook from your CMS or database, so the page is always fresh when a user requests it.
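A sketch of the webhook-driven approach, assuming your CMS can POST to an endpoint on publish; the `x-webhook-secret` header, the `REVALIDATE_SECRET` env var, and the `"posts"` tag name are illustrative assumptions.

```typescript
// app/api/revalidate/route.ts -- on-demand revalidation endpoint.
// Point the CMS/database webhook at this route.
import { revalidateTag } from "next/cache";
import { NextResponse } from "next/server";

export async function POST(request: Request) {
  // Reject calls that do not carry the shared secret.
  const secret = request.headers.get("x-webhook-secret");
  if (secret !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ error: "unauthorized" }, { status: 401 });
  }

  // Purges every fetch tagged "posts" across the Data Cache and
  // re-renders the routes that depend on it on the next request.
  revalidateTag("posts");
  return NextResponse.json({ revalidated: true });
}
```

For the tag to mean anything, the corresponding fetches must be tagged, e.g. `fetch(url, { next: { tags: ["posts"] } })`.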
The fixes: (1) Call `router.refresh()` on the client after Server Actions. (2) Set `dynamic = "force-dynamic"` on routes that need per-fetch revalidation, or use `revalidateTag()` for on-demand invalidation. (3) Replace time-based ISR with webhook-driven `revalidateTag()` calls so pages update immediately when data changes, not on the next-plus-one request.
Why Next.js Built-in Caching Isn’t Enough
Even after fixing the three bugs above, you are still working within fundamental architectural constraints. Next.js caching is per-instance. If you run three replicas behind a load balancer, each one maintains its own Data Cache and Full Route Cache independently. A revalidation on instance A does not propagate to instances B and C. Users hitting different instances see different versions of the same page. There is no cross-instance cache sharing out of the box.
Next.js also has no predictive warming. Every cache starts cold. After every deployment, every page is uncached. The first visitor to every route pays the full rendering cost — database queries, API calls, component rendering — while the cache rebuilds. For applications with thousands of routes, this cold-start window can last minutes. During that window, your origin servers absorb the full traffic load with zero caching benefit. And the Router Cache, being client-side only, provides zero help to new visitors who have never loaded the page before. See cache warming strategies for approaches to eliminating cold starts.
Adding an L1 Cache Layer to Next.js
The architectural fix is to stop relying on Next.js's built-in caching layers for performance and add a dedicated L1 cache that sits between your data fetching and the origin. Cachee's SDK integrates directly into your Next.js application: it works inside `getServerSideProps`, Server Components, API routes, and Route Handlers. Instead of fighting six caching layers with six different invalidation strategies, you get one cache with one behavior: 1.5µs in-process lookups with predictive pre-warming that eliminates cold starts entirely.
The SDK wraps your existing data fetching functions. No rewrite required. It intercepts cache reads and serves them from in-process memory — no serialization, no network hop, no TTL-based expiration surprises. Invalidation is event-driven: when the underlying data changes, the L1 cache updates across all instances in real time. No stale data from Router Cache conflicts. No silent Full Route Cache overrides. No ISR stale-while-revalidate delays. One cache layer, one source of truth, one invalidation model. It works with every Next.js rendering pattern — SSR, SSG, ISR, App Router, Pages Router — because it operates below the framework’s caching stack, not inside it. See API latency optimization for how this applies to your API routes specifically.
Before and After
A real Next.js API route serving aggregated dashboard data. The endpoint queries a PostgreSQL database, joins three tables, and returns a 45KB JSON payload. With default Next.js caching, the route is dynamic (no Data Cache) and every request hits the database. With Cachee L1, the first request warms the cache; every subsequent request is an in-process hash table lookup.
Next.js API Route — No External Cache
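A sketch of the uncached baseline. `queryDashboard` stands in for the three-table PostgreSQL join described above; it is a hypothetical helper in `@/lib/db`, not part of Next.js.

```typescript
// app/api/dashboard/route.ts -- no external cache.
import { NextResponse } from "next/server";
import { queryDashboard } from "@/lib/db"; // hypothetical DB helper

// The route is dynamic, so there is no Data Cache:
// every request pays the full database round trip (~180ms).
export const dynamic = "force-dynamic";

export async function GET() {
  const data = await queryDashboard(); // 3-table join, ~45KB of JSON
  return NextResponse.json(data);
}
```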
Next.js API Route + Cachee L1
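The same route behind the L1 layer. The `@cachee/sdk` import and its `wrap()` API are illustrative sketches of how an SDK like this is typically wired in; consult the actual Cachee documentation for real package names and signatures.

```typescript
// app/api/dashboard/route.ts -- same route, L1-cached.
import { NextResponse } from "next/server";
import { cachee } from "@cachee/sdk"; // hypothetical import
import { queryDashboard } from "@/lib/db"; // hypothetical DB helper

export const dynamic = "force-dynamic";

// Wrap the existing fetcher: the first call warms the cache, and
// every later call is an in-process lookup. No TTL is set because
// invalidation is event-driven when the underlying data changes.
const getDashboard = cachee.wrap("dashboard", queryDashboard);

export async function GET() {
  return NextResponse.json(await getDashboard()); // cache hit: ~12ms total
}
```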
The 180ms-to-12ms improvement comes from eliminating the database query and result processing entirely on cache hits. The remaining 12ms is network transmission — the time to send the pre-serialized response to the client. The actual cache lookup is 1.5 microseconds. On subsequent requests, your Next.js server is effectively a static file server for that endpoint — except the data stays fresh through event-driven invalidation rather than going stale behind six competing cache layers.
Further Reading
- Predictive Caching: How AI Pre-Warming Works
- API Latency Optimization
- Cache Warming Strategies
- Why Your Cache Isn’t Improving Performance
- How to Reduce Redis Latency in Production
- How to Increase Cache Hit Rate
- Cachee Performance Benchmarks
Fix Next.js Caching Once. Never Think About It Again.
One L1 cache layer. No stale data. No cold starts. No fighting six caching layers that conflict with each other.
Start Free Trial Schedule Demo