Cache fragments independently. Compose them at read time in microseconds. When user.name changes, user.orders stays cached. Zero over-invalidation.
Every time you cache a full API response, you are creating an invalidation bomb. One field changes and the entire entry is evicted — even though 95% of the data is still valid.
Store each piece of a response as an independent cache entry. Use the FUSE command to assemble them into a complete response at read time. Composition happens via pointer assembly — no serialization, no deserialization. Microseconds.
FUSE does not deserialize fragments, merge JSON objects, or run any transform logic. Each fragment is already stored as a cache value. The engine reads the pointers, assembles a composite structure, and returns it. The cost is a handful of pointer reads — typically under 5 microseconds for a 4-fragment composition.
This means composition cost is constant regardless of fragment size. A 10KB order history and a 50-byte name field are both a single pointer read. The engine does not touch the payload data during assembly.
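The semantics can be illustrated with a toy in-memory model. The `FragmentCache` class and its `fuse` method below are hypothetical stand-ins for the engine, not its actual API; the point is that composition collects references to stored values and never copies or re-serializes the payloads.

```python
# Toy in-memory model of fragment storage and FUSE-style composition.
# FragmentCache and fuse() are illustrative names, not the real engine API.

class FragmentCache:
    def __init__(self):
        self._store = {}  # fragment key -> cached value

    def set(self, key, value):
        self._store[key] = value

    def fuse(self, *fragment_keys):
        # Assemble by reference: the composed dict points directly at the
        # stored values. Payload size never matters during assembly.
        return {key.rsplit(":", 1)[-1]: self._store[key]
                for key in fragment_keys}

cache = FragmentCache()
cache.set("user:123:name", "Ada Lovelace")
cache.set("user:123:orders", [{"id": 1}, {"id": 2}])

profile = cache.fuse("user:123:name", "user:123:orders")
# The composed response holds the very objects the cache holds (no copy):
assert profile["orders"] is cache._store["user:123:orders"]
```

The identity assertion at the end is the whole argument in miniature: the 10KB order history is composed by reference, exactly like the 50-byte name.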
The same fragment is reused by every FUSE that references it. A dashboard FUSE and a profile FUSE can both reference `user:123:orders`. The fragment is stored once. Two compositions, one cache entry.
This is the key insight for GraphQL. Different query shapes are just different FUSE operations over the same fragments. A query for `{ user { name orders } }` and a query for `{ user { name billing } }` share the `user:123:name` fragment. Fragment caching solves the combinatorial explosion of query shapes.
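As a sketch (the `fuse` helper and the resolver-to-key mapping below are illustrative assumptions, not the product's GraphQL integration), the two query shapes reduce to two selections over one shared store:

```python
# Two GraphQL query shapes composed from the same fragment store.
# The fuse() helper and key scheme are hypothetical, for illustration only.

store = {
    "user:123:name": "Ada Lovelace",
    "user:123:orders": [{"id": 1}],
    "user:123:billing": {"plan": "pro"},
}

def fuse(selection):
    # Each query shape is just a different composition over shared fragments.
    return {field: store[f"user:123:{field}"] for field in selection}

q1 = fuse(["name", "orders"])   # { user { name orders } }
q2 = fuse(["name", "billing"])  # { user { name billing } }

# Both compositions reference the single cached name fragment:
assert q1["name"] is q2["name"]
```

Two shapes, three fragments, zero duplication: the name fragment is stored once and referenced by both responses.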
Monolithic caching forces full invalidation on any change. Fragment caching isolates changes to the piece that actually updated. The hit rate difference is not incremental — it is structural.
| Metric | Monolithic Caching | Cache Fusion (Fragments) |
|---|---|---|
| Cache Hit Rate | 60–70% | 90–95% |
| Invalidation Scope | Entire response (all fields) | Single fragment (changed field only) |
| GraphQL Query Caching | One entry per query shape | Shared fragments across all shapes |
| Recomputation on Change | Full response rebuild | Fetch 1 fragment, compose cached rest |
| Memory Efficiency | Duplicate data across entries | Fragments stored once, reused everywhere |
Every GraphQL resolver's data becomes a cache fragment. The FUSE operation assembles the response shape at the cache layer. Different query shapes reuse the same fragments. The combinatorial explosion of query shapes stops being a problem.
Cache Fusion is not a standalone feature. Fragments can have dependencies. CDC can trigger fragment invalidation. The composition of these primitives creates behavior that no other caching system can replicate.
`user:123:orders` is invalidated. Name, prefs, and billing fragments stay hot. The next FUSE assembles 1 fresh fragment + 3 cached fragments. Learn more about causal dependency graphs, CDC auto-invalidation, and cross-service coherence.
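This scenario can be traced in a small sketch. The cache dict, `fetch_from_origin`, and the miss-filling `fuse` loop below are hypothetical; the counter shows that only the invalidated fragment triggers a refetch.

```python
# Partial invalidation sketch: only the changed fragment is refetched;
# the other three are served from cache. All names are illustrative.

cache = {
    "user:123:name": "Ada Lovelace",
    "user:123:prefs": {"theme": "dark"},
    "user:123:orders": [{"id": 1}],
    "user:123:billing": {"plan": "pro"},
}
fetch_count = 0

def fetch_from_origin(key):
    global fetch_count
    fetch_count += 1
    return [{"id": 1}, {"id": 2}]  # fresh order history

def fuse(keys):
    parts = {}
    for key in keys:
        if key not in cache:           # miss: refetch just this fragment
            cache[key] = fetch_from_origin(key)
        parts[key.rsplit(":", 1)[-1]] = cache[key]
    return parts

del cache["user:123:orders"]           # the orders fragment is invalidated
profile = fuse(["user:123:name", "user:123:prefs",
                "user:123:orders", "user:123:billing"])
assert fetch_count == 1                # 1 fresh fragment + 3 cached
```

A monolithic cache would have paid four fetches here; fragment isolation pays exactly one.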
Cache Fusion is a fragment composition system that stores individual pieces of a response as independent cache entries and assembles them at read time using the FUSE command. When one fragment is invalidated, only that fragment is evicted — every other fragment in the composition remains cached. This eliminates over-invalidation and dramatically increases cache hit rates for composite responses.
FUSE takes a destination key and a list of source fragment keys. For example: `FUSE user:123:profile FROM user:123:name user:123:prefs user:123:orders user:123:billing`. The cache engine reads each fragment, assembles them into a single response object, and returns the composed result. Composition happens via pointer assembly in microseconds; no serialization or deserialization is required.
Each GraphQL resolver's data is stored as an independent cache fragment. The FUSE operation assembles the response shape at the cache layer, matching the query's field selection. Different query shapes reuse the same underlying fragments, so a query requesting user.name and user.orders shares fragments with a query requesting user.name and user.billing. Fragment reuse across query shapes is what makes GraphQL caching practical.
Over-invalidation occurs when a cache entry is evicted because any part of its data changed, even though most of the data is still valid. With monolithic caching, updating a user's name invalidates the entire user response — including orders, preferences, and billing data that did not change. Cache Fusion solves this by caching each piece independently. Only the name fragment is invalidated; every other fragment stays hot. Hit rates improve from 60–70% (monolithic) to 90–95% (fragment-based).
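A deterministic toy simulation makes the over-invalidation gap concrete. The event sequence and counters below are illustrative, not a benchmark: the same write stream evicts a whole four-field entry under monolithic caching but a single fragment under fragment caching.

```python
# Over-invalidation sketch: identical reads/writes, two caching strategies.
# Events and counts are illustrative only, not measured hit rates.

fields = {"name", "prefs", "orders", "billing"}
# Each event is a full-profile read; a non-None value means that one
# field is written immediately after the read.
events = ["name", None, None, "orders", None, None]

mono_misses = frag_misses = 0
mono_cached = False
frag_cache = set()

for write in events:
    # Read the full profile.
    if not mono_cached:
        mono_misses += len(fields)           # whole response rebuilt
        mono_cached = True
    frag_misses += len(fields - frag_cache)  # only missing fragments fetched
    frag_cache = set(fields)
    # Apply the write, if any.
    if write:
        mono_cached = False                  # entire entry evicted
        frag_cache.discard(write)            # single fragment evicted
```

Over these 6 reads (24 field reads), the monolithic cache misses 12 fields and the fragment cache misses 6: two single-field writes cost the monolithic cache eight rebuilt fields but the fragment cache only two.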
Composition adds negligible latency — typically under 5 microseconds for a 4-fragment assembly. The FUSE operation performs pointer assembly, not serialization. All fragments are already in memory as cached values. The engine reads each pointer, assembles the result structure, and returns it. For most workloads, the composition overhead is invisible compared to the latency savings from avoiding cache misses on the entire response.
Fragment composition. Dependency graphs. CDC auto-invalidation. Cross-instance coherence. Zero over-invalidation. Zero application code.