
Why Is My Cloudflare Cache Hit Rate Low?

October 24, 2025 · 8 min read · Infrastructure

You have Cloudflare in front of your application. Your dashboard shows a cache hit rate of 40–60%. Maybe lower. You expected 80%+. Static assets should be cached. API responses that rarely change should be cached. Yet the majority of requests are passing straight through to your origin, burning compute and adding latency. The problem is almost never Cloudflare itself — it is the interaction between your application's HTTP behavior and Cloudflare's caching rules. Here are the six most common reasons your hit rate is low, how to diagnose each one, and what to do about the requests that Cloudflare fundamentally cannot cache.

Reason 1: Cache-Control Headers Are Wrong

This is the single most common cause of low Cloudflare cache hit rates. Your origin server is sending Cache-Control headers that explicitly tell Cloudflare not to cache the response. If your origin sends Cache-Control: no-store, Cache-Control: private, or Cache-Control: no-cache, Cloudflare respects those directives and passes the response through uncached — every single time.

Many web frameworks produce uncacheable responses by default. Express.js sets no Cache-Control header at all on custom routes, which Cloudflare treats as uncacheable for non-static content types. Rails sets Cache-Control: max-age=0, private, must-revalidate on dynamic responses, and Django adds Vary: Cookie to any view that touches the session, with max-age=0 on views using its cache decorators. If you have not explicitly configured caching headers on your origin, your framework is likely telling Cloudflare to bypass the cache on every dynamic response.

Check your headers with curl:

curl -sI https://yoursite.com/api/products | grep -i cache-control
# Bad:  Cache-Control: no-store, no-cache, must-revalidate
# Good: Cache-Control: public, max-age=3600, s-maxage=86400

The fix is to set explicit Cache-Control headers on your origin for every response you want Cloudflare to cache. Use s-maxage to control CDN cache duration independently from browser cache duration:

# Nginx — cache product pages for 1 hour at the CDN, 5 minutes in browser
location /api/products {
    add_header Cache-Control "public, s-maxage=3600, max-age=300";
}
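If your origin is a Node service rather than Nginx, the same policy can live in a small helper that maps route classes to header values. A minimal sketch — the route prefixes and TTLs are illustrative examples, not prescriptions:

```javascript
// Map a request path to a Cache-Control policy. Prefixes and TTLs below
// are assumed example values — tune them to your own routes.
function cacheControlFor(path) {
  if (path.startsWith('/api/auth/')) return 'no-store';              // never cache credentials
  if (path.startsWith('/api/'))      return 'public, s-maxage=3600, max-age=300';
  if (/\.(js|css|png|jpe?g|gif|ico|svg)$/.test(path))
    return 'public, max-age=31536000, immutable';                    // fingerprinted assets
  return 'public, s-maxage=60, max-age=0';                           // HTML: short edge TTL only
}

// Express-style middleware: set the header before the route handler runs.
function cacheHeaders(req, res, next) {
  res.set('Cache-Control', cacheControlFor(req.path));
  next();
}
```

Centralizing the policy in one function keeps the CDN behavior auditable: you can see every route class and its TTL in a single place instead of scattered across handlers.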

Reason 2: Query String Variations

Cloudflare uses the full URL — including query parameters — as the cache key by default. This means /products?page=1&sort=name and /products?sort=name&page=1 are two different cache entries, even though they return identical data. Analytics parameters make this worse: /products?utm_source=email&utm_campaign=spring creates a unique cache entry for every marketing campaign link.

Cache-busting timestamps are another culprit. If your frontend appends ?_t=1698134400 to every request, every request generates a unique cache key and Cloudflare never serves from cache. A single query parameter that changes per-request is enough to reduce your effective hit rate to zero on that endpoint.

The fix depends on your Cloudflare plan. On Pro and above, use Cache Rules (the successor to Page Rules) to configure which query parameters are included in the cache key. Strip analytics parameters and cache-busting tokens entirely. Sort remaining parameters to normalize order:

# Cloudflare Cache Rule — strip tracking params from cache key
Match: hostname equals "yoursite.com" AND URI path starts with "/products"
Cache Key:
  - Query string: Include only "page", "sort", "category"
  - Sort query string: On
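When Cache Rules cannot express the normalization you need, the same logic can run in a Cloudflare Worker that rewrites the URL before the cache lookup. A sketch, assuming a hypothetical allowlist of parameters that actually change the response:

```javascript
// Parameters that actually affect the response body — everything else
// (utm_*, _t cache-busters, click IDs) is dropped from the cache key.
// The allowlist is an assumption; keep whichever params your API honors.
const ALLOWED_PARAMS = new Set(['page', 'sort', 'category']);

function normalizeUrl(rawUrl) {
  const url = new URL(rawUrl);
  const kept = [...url.searchParams.entries()]
    .filter(([key]) => ALLOWED_PARAMS.has(key))
    .sort(([a], [b]) => a.localeCompare(b));     // normalize parameter order
  url.search = new URLSearchParams(kept).toString();
  return url.toString();
}
```

In a Worker you would then fetch a new Request built from the normalized URL, so /products?sort=name&page=1 and /products?page=1&sort=name collapse into a single cache entry.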

Reason 3: Cookies Bypassing Cache

Cloudflare's default behavior bypasses the cache entirely when the origin response contains a Set-Cookie header. This is a security measure — caching a response with Set-Cookie could serve one user's session cookie to another user. But it also means that if your origin sets a cookie on every response (analytics cookies, session cookies, A/B test assignment cookies), every response is uncacheable by default.

The same applies to request cookies. If your application sends cookies on requests for static or semi-static content, Cloudflare may treat those requests as personalized and skip the cache. WordPress sites are notorious for this — even logged-out visitors often carry cookies that prevent caching.

The fix is to ensure your origin only sets cookies when genuinely necessary. Move analytics cookies to client-side JavaScript. Strip unnecessary cookies from requests at the CDN layer using a Cloudflare Worker or Transform Rule. For responses that are safe to cache despite cookies, use a Cache Rule to override the default behavior:

# Cloudflare Cache Rule — cache despite cookies on public pages
Match: URI path starts with "/blog" OR URI path starts with "/products"
Cache eligibility: Eligible for cache
Edge TTL: Override origin, 1 hour
Browser TTL: Override origin, 5 minutes
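Stripping request cookies at the edge reduces to a pure filter over the Cookie header, which a Worker can apply before the cache lookup. The cookie-name patterns below are assumptions — adjust them to whatever your analytics stack actually sets:

```javascript
// Cookie-name patterns that never affect the response. Assumed here:
// Google Analytics (_ga*, _gid) and Facebook (_fbp) client-side trackers.
const STRIP_PATTERNS = [/^_ga/, /^_gid$/, /^_fbp$/];

function stripTrackingCookies(cookieHeader) {
  return cookieHeader
    .split(/;\s*/)
    .filter((pair) => {
      const name = pair.split('=')[0];
      return !STRIP_PATTERNS.some((re) => re.test(name));
    })
    .join('; ');
}
```

With the tracking cookies gone, requests from logged-out visitors arrive at the cache cookie-free (or carrying only the session cookie), so they no longer look personalized.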

Reason 4: Dynamic Content Not Marked Cacheable

Cloudflare only caches certain file extensions by default: .js, .css, .png, .jpg, .gif, .ico, .svg, and about 30 others. If your API returns JSON at /api/products, Cloudflare does not cache it unless you explicitly tell it to — regardless of what Cache-Control header your origin sends. HTML pages are also not cached by default, on any plan.

This is the second most common surprise. Teams set perfect Cache-Control headers on their API responses, check the Cloudflare dashboard, and see zero cache hits on those endpoints. The response content type is application/json, which is not in Cloudflare's default cacheable list.

The fix is a Cache Rule that explicitly marks these responses as cacheable:

# Cache Rule — force-cache JSON API responses
Match: URI path starts with "/api/" AND URI path does not contain "/api/auth/"
Cache eligibility: Eligible for cache
Edge TTL: Respect origin TTL (honor the origin's Cache-Control header)

Reason 5: POST Requests and Authenticated Endpoints

Cloudflare never caches POST, PUT, PATCH, or DELETE requests. This is correct behavior — these methods have side effects. But many applications use POST for operations that are semantically reads: GraphQL queries, search endpoints, complex filter operations. Every one of these requests passes through to your origin uncached.

Authenticated endpoints are similarly uncacheable at the CDN level. Any request that carries an Authorization header or session token is per-user by definition. Cloudflare cannot cache these responses without risking serving User A's data to User B. If 60% of your traffic is authenticated API requests, 60% of your traffic is fundamentally uncacheable at the CDN layer — and no amount of Cloudflare configuration will change that.

The CDN ceiling: If your traffic mix is 30% static assets, 20% public API calls, and 50% authenticated requests, your theoretical Cloudflare cache hit rate maximum is 50%. The other 50% must always pass through to your origin. This is where application-layer caching becomes essential.
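The ceiling is simple arithmetic: only publicly cacheable traffic can ever hit the CDN, so the maximum hit rate is the public share of the mix. Using the example figures above:

```javascript
// Traffic mix from the example above, as fractions of total requests.
const mix = { staticAssets: 0.30, publicApi: 0.20, authenticated: 0.50 };

// Only public traffic is CDN-cacheable, so the hit-rate ceiling is the
// sum of the public shares — 0.50 here, even with perfect configuration.
const cdnCeiling = mix.staticAssets + mix.publicApi;
```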

Diagnosing with cf-cache-status

Before changing any configuration, diagnose the current state. Cloudflare adds a cf-cache-status header to every response that tells you exactly what happened:

Status    Meaning
HIT       Served from Cloudflare's cache
MISS      Not in cache; fetched from origin (will be cached for the next request)
EXPIRED   Was cached but the TTL expired; re-fetched from origin
DYNAMIC   Cloudflare determined this content type is not cacheable by default
BYPASS    Cache deliberately skipped (cookies or a Cache-Control directive)

Run this against your most-trafficked endpoints:

curl -sI https://yoursite.com/api/products | grep cf-cache-status
# cf-cache-status: DYNAMIC  ← Cloudflare won't cache this content type

curl -sI https://yoursite.com/style.css | grep cf-cache-status
# cf-cache-status: HIT  ← Working correctly

If you see DYNAMIC on endpoints you expected to be cached, you need a Cache Rule to override content-type detection. If you see BYPASS, check for Set-Cookie headers or restrictive Cache-Control directives. The Cloudflare Cache Analytics dashboard (under Caching → Cache Analytics) breaks down cache status across all requests, so you can see the aggregate distribution without curling individual URLs.
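Once you have status counts from Cache Analytics or a log sample, the aggregate hit rate is just HIT over the total. The sample counts below are hypothetical:

```javascript
// Turn cf-cache-status counts into an aggregate hit rate.
function hitRate(counts) {
  const total = Object.values(counts).reduce((sum, n) => sum + n, 0);
  return total === 0 ? 0 : (counts.HIT || 0) / total;
}

// Hypothetical sample of 1,000 requests. The DYNAMIC and BYPASS shares
// tell you which of the fixes above to apply first.
const sample = { HIT: 520, MISS: 180, EXPIRED: 0, DYNAMIC: 250, BYPASS: 50 };
// hitRate(sample) → 0.52
```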

Page Rules vs Cache Rules

Cloudflare is migrating from Page Rules to Cache Rules. If you are still using Page Rules for caching, migrate now. Cache Rules offer more granular matching (hostname, path, query string, headers, cookies), more actions (custom cache keys, tiered caching, origin cache control respect), and better performance. Page Rules are limited to 3 on the free plan and match only on URL patterns. Cache Rules support up to 10 free rules with full request-attribute matching.

When CDN Caching Is Not Enough

Even with perfect Cloudflare configuration, there is a class of requests that CDNs cannot cache: authenticated API calls, personalized content, POST-based queries, real-time data with sub-second freshness requirements, and any response that varies per user. For most production applications, this uncacheable traffic represents 40–70% of total requests. It is also the most expensive traffic — these are the database queries that take 50–200ms, not the static assets that return in 1ms.

This is the gap that an application-layer L1 cache fills. Unlike a CDN, an L1 cache sits inside your application process. It can cache authenticated responses per-user without cross-user leakage. It can cache POST request results by hashing the request body. It can serve personalized content from memory in microseconds rather than milliseconds. And because it operates at the application layer, it has access to business logic that a CDN never will — it knows which responses are safe to cache, for how long, and when to invalidate them.

The combination of CDN edge caching for public static and semi-static content, plus an L1 application cache for authenticated and dynamic content, is what pushes overall cache hit rates from 50–70% to 95%+. The CDN handles the easy traffic. The L1 cache handles the expensive traffic. Together, they reduce origin load by 10–20x compared to CDN-only caching.
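The origin-load claim follows directly from miss rates, since origin traffic is proportional to misses. The rates below are the illustrative figures used in this section:

```javascript
// Origin load is proportional to the miss rate, so the reduction factor
// from layering an L1 cache on top of the CDN is missBefore / missAfter.
const cdnOnlyHitRate = 0.70;   // CDN alone, after tuning (assumed)
const combinedHitRate = 0.97;  // CDN + L1 application cache (assumed)

const originReduction = (1 - cdnOnlyHitRate) / (1 - combinedHitRate);
// → 10: the origin sees one-tenth the requests it did with CDN-only caching
```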

A real example: A SaaS application running Cloudflare CDN had a 52% cache hit rate. After fixing Cache-Control headers and adding Cache Rules, the CDN hit rate rose to 74%. After adding a Cachee L1 cache for authenticated API responses, the effective hit rate — including requests Cloudflare could never cache — reached 97%. Origin database queries dropped from 1.2M/hour to 36K/hour. Weighted miss cost fell by 94%.


Ready to Cache What Cloudflare Cannot?

Cachee adds an L1 application-layer cache that handles authenticated, dynamic, and personalized requests — the traffic CDNs pass straight through to your origin.
