
GraphQL Caching Strategies for Modern APIs

December 22, 2025 • 8 min read • API Performance

GraphQL's flexibility is its greatest strength, and its biggest caching challenge. Unlike REST, where each endpoint maps to a predictable response, a GraphQL query can request any combination of fields. Here's how to cache effectively anyway.

Why GraphQL Caching Is Different

REST caching is straightforward: cache by URL. GraphQL typically sends every query as a POST to a single endpoint with a varying request body, so URL-keyed HTTP caching has nothing to work with out of the box.

You need a multi-layer approach:

  1. Client-side normalized cache (Apollo, Relay)
  2. CDN/edge caching for persisted queries
  3. Server-side response caching
  4. Resolver-level caching (DataLoader)

Layer 1: Client-Side Normalized Caching

Apollo Client and Relay automatically normalize and cache query results:

// Apollo Client setup with cache
import { ApolloClient, InMemoryCache } from '@apollo/client';

const client = new ApolloClient({
    uri: '/graphql',
    cache: new InMemoryCache({
        typePolicies: {
            User: {
                keyFields: ['id'],  // How to identify cached entities
            },
            Product: {
                keyFields: ['sku'],
            }
        }
    })
});

This means if you fetch a user in Query A, and Query B also needs that user, Apollo serves it from cache automatically.
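Under the hood, normalization flattens nested results into a flat map keyed by `Type:id`. Here's a toy sketch of the idea (an illustration only, not Apollo's actual internals):

```javascript
// Toy normalizer: flatten a query result into a map keyed by "Type:id".
// Apollo and Relay do far more (nested objects, lists, references), but
// the core idea is the same.
function normalize(result, store = {}) {
    for (const value of Object.values(result)) {
        if (value && typeof value === 'object' && value.__typename && value.id) {
            const key = `${value.__typename}:${value.id}`;
            // Merge fields so later queries update the same entity
            store[key] = { ...(store[key] || {}), ...value };
        }
    }
    return store;
}

// Query A fetches the user's email; Query B adds the name.
const store = normalize({ user: { __typename: 'User', id: '1', email: 'a@b.co' } });
normalize({ user: { __typename: 'User', id: '1', name: 'Ada' } }, store);
// store['User:1'] now holds both fields
```

Because both queries resolve to the same `User:1` entry, either one can be answered from cache once the other has run.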

Layer 2: Persisted Queries for CDN Caching

Convert POST requests with query bodies into GET requests with query IDs:

// Without persisted queries (not cacheable by CDN)
POST /graphql
{ "query": "{ user(id: 1) { name email } }" }

// With persisted queries (cacheable!)
GET /graphql?id=abc123&variables={"id":1}

// Server maps ID to query
const persistedQueries = {
    'abc123': 'query ($id: ID!) { user(id: $id) { name email } }'
};

Performance gain: persisted queries shrink request payloads and, more importantly, let CDNs cache responses, so a cache hit never touches your origin server at all. Order-of-magnitude speedups are common for hot queries, though the exact gain depends on your CDN hit rate.

Layer 3: Server-Side Response Caching

Cache full query responses by hashing the query + variables:

async function executeQuery(query, variables, context) {
    // Generate cache key from query + variables
    const cacheKey = `gql:${hash(query)}:${hash(variables)}`;

    // Check cache
    const cached = await cache.get(cacheKey);
    if (cached) return cached;

    // Execute query (graphql-js v16+ takes a single args object)
    const result = await graphql({
        schema,
        source: query,
        contextValue: context,
        variableValues: variables
    });

    // Cache based on response hints
    const ttl = getMinTTL(result);  // Scan response for cache directives
    if (ttl > 0) {
        await cache.set(cacheKey, result, { ttl });
    }

    return result;
}

Layer 4: DataLoader for N+1 Prevention

DataLoader batches and caches database requests within a single request:

import DataLoader from 'dataloader';

// Create loader per request
function createLoaders() {
    return {
        user: new DataLoader(async (ids) => {
            const users = await db.query(
                'SELECT * FROM users WHERE id = ANY($1)',
                [ids]
            );
            // Return in same order as requested
            return ids.map(id => users.find(u => u.id === id));
        }),

        products: new DataLoader(async (ids) => {
            // Same batching pattern for products
            const products = await db.query(
                'SELECT * FROM products WHERE id = ANY($1)',
                [ids]
            );
            return ids.map(id => products.find(p => p.id === id));
        })
    };
}

// In resolver
const resolvers = {
    Order: {
        user: (order, _, { loaders }) => {
            return loaders.user.load(order.userId);
        }
    }
};

If a query fetches 50 orders, DataLoader batches all 50 user lookups into one database query.
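DataLoader's trick is deferring the batch until the end of the current event-loop tick, so every `load()` call made while resolvers run lands in one batch. A stripped-down sketch of that mechanism (the real library also handles per-key caching, deduplication, and more):

```javascript
// Minimal DataLoader-style batcher: collect keys synchronously,
// then fire one batch call on the next microtask tick.
function createBatchLoader(batchFn) {
    let queue = [];
    return function load(key) {
        return new Promise((resolve, reject) => {
            queue.push({ key, resolve, reject });
            if (queue.length === 1) {
                // First key this tick: schedule a single flush
                queueMicrotask(async () => {
                    const batch = queue;
                    queue = [];
                    try {
                        const results = await batchFn(batch.map(item => item.key));
                        batch.forEach((item, i) => item.resolve(results[i]));
                    } catch (err) {
                        batch.forEach(item => item.reject(err));
                    }
                });
            }
        });
    };
}

// Many load() calls in the same tick produce a single batchFn invocation
let calls = 0;
const loadUser = createBatchLoader(async (ids) => {
    calls++;
    return ids.map(id => ({ id }));
});
```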

Cache Directives in Schema

Define caching rules directly in your GraphQL schema:

type Query {
    # Highly cacheable - rarely changes
    categories: [Category!]! @cacheControl(maxAge: 3600)

    # User-specific - shorter cache
    me: User @cacheControl(maxAge: 60, scope: PRIVATE)

    # Real-time data - no cache
    livePrice(symbol: String!): Price @cacheControl(maxAge: 0)
}

type Product {
    id: ID!
    name: String! @cacheControl(maxAge: 600)

    # Inventory changes frequently
    stockCount: Int! @cacheControl(maxAge: 30)
}
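On the server, these per-field hints are typically collapsed into a single TTL for the whole response: the most volatile field wins. A sketch of the `getMinTTL` logic referenced in Layer 3, assuming the hints have already been collected into a list of `{ maxAge }` objects (how you collect them depends on your GraphQL server):

```javascript
// Collapse per-field cache hints into one response TTL.
// A response is only as cacheable as its least-cacheable field.
function getMinTTL(hints) {
    if (hints.length === 0) return 0;  // No hints: don't cache
    return Math.min(...hints.map(h => h.maxAge));
}

// A query touching categories (3600) and stockCount (30) gets a 30s TTL;
// any field with maxAge: 0 makes the whole response uncacheable.
```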

Invalidation Strategies

The hardest part of GraphQL caching is invalidation. Options:

  1. Time-based (TTL): Simple but may show stale data
  2. Entity-based: Track which entities are in each cached response
  3. Event-driven: Invalidate on mutations

// Track entities in cached responses
async function cacheWithTracking(cacheKey, result, ttl) {
    // Extract entity IDs from response
    const entities = extractEntities(result);
    // e.g., ['User:1', 'Product:42', 'Product:43']

    // Store response
    await cache.set(cacheKey, result, { ttl });

    // Track which cache keys contain each entity
    for (const entity of entities) {
        await cache.sadd(`entity:${entity}:keys`, cacheKey);
    }
}

// When entity changes, invalidate all related cache keys
async function invalidateEntity(entityType, entityId) {
    const cacheKeys = await cache.smembers(`entity:${entityType}:${entityId}:keys`);
    if (cacheKeys.length > 0) {
        await cache.del(...cacheKeys);
    }
}
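Here's the same flow as a self-contained, in-memory sketch (plain Maps standing in for Redis), showing a mutation-style invalidation purging every cached response that contained the touched entity:

```javascript
// In-memory illustration of entity-based invalidation.
// `responses` maps cache keys to cached results; `entityIndex` maps
// each entity to the cache keys that contain it.
const responses = new Map();
const entityIndex = new Map();

function cacheWithTracking(cacheKey, result, entities) {
    responses.set(cacheKey, result);
    for (const entity of entities) {
        if (!entityIndex.has(entity)) entityIndex.set(entity, new Set());
        entityIndex.get(entity).add(cacheKey);
    }
}

function invalidateEntity(entity) {
    // Purge every cached response that contained this entity
    for (const key of entityIndex.get(entity) || []) {
        responses.delete(key);
    }
    entityIndex.delete(entity);
}

// Two cached queries both contain Product:42
cacheWithTracking('gql:q1', { data: '...' }, ['Product:42', 'User:1']);
cacheWithTracking('gql:q2', { data: '...' }, ['Product:42']);

// A mutation touching Product:42 purges both responses
invalidateEntity('Product:42');
```

In a real system, the `invalidateEntity` call would sit in your mutation resolvers (or a change-data-capture pipeline) so that any write triggers the purge.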

GraphQL caching made simple

Cachee.ai automatically handles GraphQL response caching with intelligent invalidation.

Start Free Trial