Proof Reuse

A verified proof is a mathematical fact. Re-verifying it is redundant computation.
Verify once. Serve the truth forever at 85 nanoseconds.

294x
STARK Speedup
1
Verification Per Proof
0
Re-verifications Per Consumer
Signed
3 PQ Families
Definition

Proof reuse is the practice of caching the verification result of a cryptographic proof so that it is verified once and served to all subsequent consumers without re-execution. A STARK proof verified by one node becomes a reusable truth claim for every node. The cached result is bound to the exact proof via a computation fingerprint and signed by three independent post-quantum signature families. Verification: 25 microseconds. Cached truth: 85 nanoseconds. Every re-verification after the first is pure waste.

The Principle

A Proof Becomes a Fact

Verification is the one-time cost of converting a proof into a fact. Once converted, the fact stands on its own.

📜
Proof
A mathematical argument that a statement is true. Encoded as bytes. Can be verified. Verification is expensive: FRI checks, constraint evaluation, Merkle traversal, field arithmetic. 25 microseconds per STARK. 1-3 milliseconds per SNARK.
Verification
(one-time cost)
Fact
A verified truth claim. Fingerprinted and signed. Can be served instantly. Verification cost: zero. Lookup cost: 85 nanoseconds. A fact does not need re-proving. It needs distribution.

Every system that consumes proofs re-verifies them. Every re-verification produces the same result. The computation is identical. The result is identical. The cost is not. Proof reuse eliminates the cost by caching the result of the first verification and serving it to every subsequent consumer.

The Waste

Where Re-Verification Burns Compute

Every consumer re-verifies. None of them need to. Here is what that costs.

L2 Rollup Validators
1,000 full nodes each verify the same state transition proof on every block. The proof is identical. The result is identical. 999 verifications are redundant.
25ms
1,000 nodes x 25us
25us + 85ns x 999
With proof reuse
227x saved
Per proof, per block
Cross-Chain Bridge Validators
Every bridge validator re-verifies the same cross-chain state proof. 50 validators, same proof, same result. 49 re-verifications produce nothing new.
1.25ms
50 validators x 25us
25us + 85ns x 49
With proof reuse
43x saved
Per bridge transfer
Microservice Auth Proofs
An auth proof (credential ZKP) is verified by every downstream microservice. 12 services in the request path, each re-verifying the same credential proof.
36ms
12 services x 3ms SNARK
3ms + 85ns x 11
With proof reuse
12x saved
Per request
DeFi Protocol Auditors
Proof of reserves verified by every participant who queries solvency. 10,000 queries per hour, same proof. After the first verification, 9,999 are waste.
250ms
10K queries x 25us
25us + 850us
With proof reuse
285x saved
Per hour
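All four savings figures above fall out of one formula. A minimal sketch, assuming the costs quoted on this page (25us per STARK verification, 3ms per SNARK verification, 85ns per cached lookup); the function name `waste` is illustrative, not a Cachee API:

```python
STARK_VERIFY_US = 25.0    # STARK verification cost, microseconds
SNARK_VERIFY_US = 3000.0  # SNARK verification cost (3 ms), microseconds
LOOKUP_US = 0.085         # cached-lookup cost (85 ns), microseconds

def waste(consumers: int, verify_us: float) -> tuple[float, float, float]:
    """Return (cost without reuse, cost with reuse, speedup), all in us."""
    without = consumers * verify_us                     # everyone verifies
    with_reuse = verify_us + (consumers - 1) * LOOKUP_US  # one verify + lookups
    return without, with_reuse, without / with_reuse

for name, consumers, cost in [
    ("L2 rollup validators", 1000, STARK_VERIFY_US),
    ("Bridge validators", 50, STARK_VERIFY_US),
    ("Microservice auth", 12, SNARK_VERIFY_US),
    ("Proof-of-reserves queries", 10_000, STARK_VERIFY_US),
]:
    w, r, s = waste(consumers, cost)
    print(f"{name}: {w:,.0f}us -> {r:,.1f}us ({s:.0f}x)")
```

The speedup per scenario is the ratio of totals, which is why it grows with the number of consumers per proof.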
Visualization

1,000 Nodes. One Verification.

Without proof reuse, every node does 25us of work. With it, one node does 25us and 999 do 85ns.

Without Proof Reuse: 1,000 independent verifications

1
Node 1
STARK verify: FRI + constraints + Merkle + OOD
25 us
2
Node 2
STARK verify: FRI + constraints + Merkle + OOD
25 us
...
Nodes 3-999
997 identical verifications
25 us each
1K
Node 1000
STARK verify: FRI + constraints + Merkle + OOD
25 us
Total: 25,000 us of computation. Same proof. Same result. 999 wasted.

With Proof Reuse: 1 verification + 999 cache lookups

1
Node 1
STARK verify (first and only verification)
25 us
2
Node 2
85 ns
...
Nodes 3-999
85 ns each
1K
Node 1000
85 ns
Total: 25 us + 85 us = 110 us. 227x less compute. Same security guarantee.
Trust Model

How Consumers Trust the Cached Result

You do not trust the cache. You trust the math. Three layers make forgery computationally infeasible.

1

Computation Fingerprint (Proves Identity)

The cache key is a SHA3-256 hash of the proof bytes, verification key, public inputs, constraint set, field parameters, and verifier version. Anyone can recompute it from the original proof and parameters. If the fingerprint matches, the cache entry corresponds to exactly this proof. No collision is possible without breaking SHA3-256.

2

PQ Signatures (Proves Authenticity)

The verification result is signed by three independent post-quantum signature families: ML-DSA-65 (MLWE lattices), FALCON-512 (NTRU lattices), and SLH-DSA (stateless hash functions). Three independent mathematical hardness assumptions. Forging a cached result requires breaking all three simultaneously.

3

Independent Verification (Proves Correctness)

The cachee-verify tool checks any cached result against the signatures and fingerprint with no network call, no Cachee account, and no trust in H33. Anyone can independently verify that a cached truth claim is authentic and corresponds to the correct proof. The verifier is open-source.

// The computation fingerprint binds the cache entry to the exact proof
fingerprint = SHA3-256(
    proof_bytes          // the proof itself
    || verification_key  // the circuit's verification key
    || public_inputs     // any public inputs to the proof
    || constraint_set    // the constraint system identifier
    || field_parameters  // prime field, extension degree
    || verifier_version  // software version of the verifier
)
// Two different proofs -> different fingerprints.
// Same proof, same params -> same fingerprint -> cache hit.
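The fingerprint above can be sketched in a few lines of Python with the standard library's SHA3-256. The length-prefixing of each field is an assumption added here so the concatenation is unambiguous; Cachee's real byte encoding may differ:

```python
import hashlib

def fingerprint(proof_bytes: bytes, verification_key: bytes,
                public_inputs: bytes, constraint_set: bytes,
                field_parameters: bytes, verifier_version: bytes) -> bytes:
    """Illustrative SHA3-256 fingerprint over the six fields named above."""
    h = hashlib.sha3_256()
    for part in (proof_bytes, verification_key, public_inputs,
                 constraint_set, field_parameters, verifier_version):
        h.update(len(part).to_bytes(8, "big"))  # unambiguous field boundary
        h.update(part)
    return h.digest()

# Same proof, same params -> same fingerprint -> cache hit
a = fingerprint(b"proof", b"vk", b"in", b"cs", b"fp", b"v1.0")
b = fingerprint(b"proof", b"vk", b"in", b"cs", b"fp", b"v1.0")
assert a == b
# Any change (here: verifier version) -> different fingerprint -> cache miss
assert a != fingerprint(b"proof", b"vk", b"in", b"cs", b"fp", b"v1.1")
```

Because anyone holding the proof and parameters can recompute this digest, a matching fingerprint proves the cache entry refers to exactly this proof.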

You don't trust the cache. You verify it once, then trust the math. The fingerprint proves identity. The signatures prove authenticity. The verifier proves correctness. Three layers, three independent guarantees.

Disambiguation

Proof Reuse vs Proof Aggregation

They solve different problems. They are complementary. Use both.

Proof Aggregation

Reduces proof count. Takes N individual proofs and combines them into a single aggregated proof. The aggregated proof can be verified once to confirm all N constituent proofs are valid. The verification cost of N proofs becomes the verification cost of one aggregated proof.

  • Input: N proofs
  • Output: 1 aggregated proof
  • Verification: once per aggregated proof
  • Use case: reducing on-chain verification costs

Proof Reuse

Reduces verification count per proof. Takes a single proof (or a single aggregated proof), verifies it once, caches the verification result, and serves it to every subsequent consumer at cache-lookup speed. The verification cost per consumer drops from full verification to a hash lookup.

  • Input: 1 proof (original or aggregated)
  • Output: 1 cached verification result
  • Verification: once ever, across all consumers
  • Use case: eliminating redundant re-verification

The optimal pipeline: aggregate first (reduce N proofs to 1), then cache the aggregated proof's verification result (reduce M consumers to 1 verification). Aggregation eliminates N-1 proofs. Reuse eliminates M-1 verifications. Together: verify once, for all proofs, for all consumers.
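The combined pipeline can be put into a back-of-envelope cost model. The assumption that an aggregated proof verifies in the same ~25us as a single STARK is illustrative only; real aggregation schemes vary:

```python
def pipeline_cost_us(n_proofs: int, m_consumers: int,
                     verify_us: float = 25.0,
                     agg_verify_us: float = 25.0,
                     lookup_us: float = 0.085) -> dict:
    """Total verification work under three strategies, in microseconds."""
    return {
        # every consumer verifies every proof
        "naive": n_proofs * m_consumers * verify_us,
        # N proofs -> 1 aggregated proof, but still M verifications
        "aggregation_only": m_consumers * agg_verify_us,
        # verify the aggregated proof once, everyone else does a lookup
        "aggregation_plus_reuse": agg_verify_us + (m_consumers - 1) * lookup_us,
    }

costs = pipeline_cost_us(n_proofs=100, m_consumers=1000)
```

With 100 proofs and 1,000 consumers, aggregation alone removes the N factor and reuse then removes the M factor, which is the "verify once, for all proofs, for all consumers" claim in numbers.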

Architecture

The Proof Reuse Pipeline

Proof arrives. Fingerprint computed. Cache checked. Miss: verify, sign, cache. Hit: return the truth at 85ns.

IN
Proof arrives (STARK, SNARK, or aggregated proof)
--
FP
Compute SHA3-256 fingerprint (proof || vk || inputs || constraints || field || version)
~200 ns
?
Cache lookup by fingerprint
31 ns
Cache Miss (first time only)
1. Full verification: 25us STARK / 1-3ms SNARK
2. Sign result: ML-DSA-65 + FALCON-512 + SLH-DSA
3. Store: fingerprint -> {result, signatures, timestamp}
4. Return verified result to consumer

Cache Hit (every subsequent time)
1. Retrieve cached result by fingerprint: 85ns
2. Return signed truth claim to consumer
3. No verification. No field arithmetic. No Merkle traversal.
4. 294x faster. Zero re-computation.
Consumer receives the truth. Fingerprint and signatures are verifiable offline.
85 ns

After the first consumer triggers verification, every subsequent consumer for the lifetime of the proof receives the cached result at 85 nanoseconds.
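The miss/hit logic above fits in a few lines. A minimal sketch with an in-memory dict standing in for Cachee and injected stubs for the verifier and signer; a real deployment would plug in the actual STARK verifier and the three PQ signers:

```python
import time

class ProofCache:
    """Verify-once cache keyed by computation fingerprint (sketch)."""

    def __init__(self, verify_fn, sign_fn):
        self._store = {}          # fingerprint -> cached entry
        self._verify = verify_fn  # full proof verification (expensive)
        self._sign = sign_fn      # signs (fingerprint, result)

    def check(self, fp: bytes, proof: bytes) -> dict:
        entry = self._store.get(fp)        # cache lookup by fingerprint
        if entry is None:                  # miss: first consumer pays
            result = self._verify(proof)   # the one-time full verification
            entry = {
                "result": result,
                "signatures": self._sign(fp, result),
                "verified_at": int(time.time()),
            }
            self._store[fp] = entry        # store under fingerprint
        return entry                       # hit: every later consumer
```

Every consumer calls `check()` with the same fingerprint; only the first call runs the verifier, and all later calls return the same signed entry.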

Economics

The Cost of Not Reusing

Every re-verification burns CPU cycles that produce no new information. At scale, the waste is quantifiable. Here is what proof reuse saves across different verification volumes, assuming STARK verification at 25 microseconds and cached lookup at 85 nanoseconds.

Verifications/Day | Unique Proofs | Without Reuse      | With Reuse           | CPU-Time Saved
100K              | 1,000         | 2.5 CPU-sec/day    | 0.025 sec + 8.4 ms   | ~2.5 sec
1M                | 5,000         | 25 CPU-sec/day     | 0.125 sec + 84.6 ms  | ~24.8 sec
10M               | 10,000        | 250 CPU-sec/day    | 0.25 sec + 849 ms    | ~248.9 sec
100M              | 50,000        | 2,500 CPU-sec/day  | 1.25 sec + 8.5 sec   | ~2,490 sec (41.5 min)
1B                | 100,000       | 25,000 CPU-sec/day | 2.5 sec + 85 sec     | ~24,912 sec (6.9 hrs)

At 1 billion verifications per day with 100,000 unique proofs, proof reuse saves nearly 7 CPU-hours daily. The savings scale linearly with the re-verification ratio (verifications per unique proof). A rollup with 1,000 validators re-verifying 100 proofs per day eliminates 99,900 redundant verifications daily, roughly 2.5 CPU-seconds of pure waste.

The compute savings are only half the story. Proof reuse also eliminates the tail latency spike from verification. Every consumer gets 85ns instead of a distribution from 25us to 50us (STARK) or 1ms to 8ms (SNARK). Consistent, predictable, cache-speed latency.
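The table rows follow from two constants. A minimal calculator, assuming the page's per-operation costs; the "With Reuse" column is one full verification per unique proof plus one lookup for every remaining request:

```python
VERIFY_S = 25e-6   # one STARK verification, seconds
LOOKUP_S = 85e-9   # one cached lookup, seconds

def daily_savings(verifications: int, unique_proofs: int):
    """CPU-seconds per day: (without reuse, with reuse, saved)."""
    without = verifications * VERIFY_S
    with_reuse = (unique_proofs * VERIFY_S
                  + (verifications - unique_proofs) * LOOKUP_S)
    return without, with_reuse, without - with_reuse

# e.g. the last table row: 1B verifications/day, 100K unique proofs
w, r, saved = daily_savings(1_000_000_000, 100_000)
print(f"without: {w:,.0f} s  with: {r:,.1f} s  saved: {saved:,.0f} s/day")
```

Plugging in any other row reproduces its "With Reuse" and "Saved" columns to rounding.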

cachee-proof-reuse-demo
[1] STARK proof arrives: 47KB, ethSTARK over Goldilocks field
[2] Fingerprint: SHA3-256(proof || vk || inputs || constraints) = 0xa3f1...
[3] Cache lookup: MISS
[4] Verify: FRI + constraints + Merkle + OOD = 25us VALID
[5] Sign: ML-DSA-65 + FALCON-512 + SLH-DSA
[6] Cache: SET 0xa3f1... {valid, sigs, ts} = STORED
 
[7] Consumer 2 requests same proof Cache: HIT 85ns
[8] Consumer 3 requests same proof Cache: HIT 85ns
[9] Consumer 1,000 requests same proof Cache: HIT 85ns
 
    1 verification. 999 reuses. 294x per consumer.

Run it yourself: brew install cachee && cachee-proof-reuse-demo

Implementation

What Gets Cached

The cached value is the verification result -- not the proof itself. A verification result is a boolean (valid/invalid) plus attestation metadata. Total size: approximately 45 bytes of metadata per cached entry, plus signatures.

struct CachedVerification {
    fingerprint: [u8; 32],      // SHA3-256 of proof + params
    result: bool,               // valid or invalid
    verified_at: u64,           // unix timestamp
    verifier_version: [u8; 4],  // software version
    sig_mldsa: [u8; 3309],      // ML-DSA-65 signature
    sig_falcon: [u8; 656],      // FALCON-512 signature
    sig_slhdsa: [u8; 17088],    // SLH-DSA signature
}
// Total: ~21KB per cached entry (signatures dominate).
// The proof itself (10-100KB) is NOT stored unless explicitly archived.

The signatures are the bulk of the cached entry. This is by design: the three PQ signature families provide independent mathematical guarantees that the cached result is authentic. At 10 million cached verifications, the signature storage is approximately 210 GB -- significant, but far less than storing the proofs themselves (100 TB+ for 100KB STARKs).
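The storage figures follow directly from the struct's field sizes. A quick check (the 17,088-byte SLH-DSA signature matches the SLH-DSA-128f parameter set, an assumption since the page names only the family):

```python
# Signature sizes in bytes, as declared in the struct above
SIG_MLDSA, SIG_FALCON, SIG_SLHDSA = 3309, 656, 17088
METADATA = 32 + 1 + 8 + 4  # fingerprint + result + timestamp + version

entry_bytes = METADATA + SIG_MLDSA + SIG_FALCON + SIG_SLHDSA
total_gb = 10_000_000 * entry_bytes / 1e9  # 10M cached verifications

print(f"per entry: {entry_bytes} B (~{entry_bytes / 1024:.1f} KiB)")
print(f"10M entries: ~{total_gb:.0f} GB")
```

Each entry is ~21 KB, so 10 million entries land near the ~210 GB quoted above, versus 100 TB+ if the proofs themselves were archived.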

Boundaries

When Proof Reuse Does Not Apply

Proof reuse works because verification is deterministic. The assumptions break in three edge cases: the public inputs or parameters differ between consumers, the verifier software itself changes version, or the proof is only valid for a bounded time.

In all three cases, the computation fingerprint handles the edge case by construction: different inputs, parameters, or verifier versions produce different fingerprints, and TTL-based eviction handles time-bound proofs. The architecture accounts for these boundaries without special-casing.

Install

Get Started

brew tap h33ai-postquantum/tap && brew install cachee
cachee init && cachee start

# Cache a proof verification result
SET proof:0xa3f1 verified FP <fingerprint_hex>

# Retrieve the cached truth at 85ns
GETVERIFIED proof:0xa3f1

# Independently verify a cached result offline
cachee-verify --fingerprint 0xa3f1 --proof ./proof.bin --vk ./vk.bin

140+ Redis-compatible commands. Drop-in for existing infrastructure. The proof verification pipeline does not change -- you add a cache check before and after. One line of code to check, one line to store, zero lines to re-verify.
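The "one line to check, one line to store" claim can be sketched as a wrapper. The client object here is hypothetical: anything exposing a Redis-style `execute_command()` would do, and the command names (`GETVERIFIED`, `SET ... FP`) are taken from the snippet above, not from a verified client library:

```python
def verified(client, fp_hex: str, proof: bytes, verify_fn) -> bool:
    """Check the cache before verifying; store the result after (sketch)."""
    cached = client.execute_command("GETVERIFIED", f"proof:{fp_hex}")
    if cached is not None:
        return cached == b"verified"   # hit: the 85ns path, no re-verification
    ok = verify_fn(proof)              # miss: the one-time full verification
    if ok:
        client.execute_command("SET", f"proof:{fp_hex}",
                               "verified", "FP", fp_hex)
    return ok
```

The existing verification pipeline is untouched: `verify_fn` is whatever verifier you already run, and it now runs once per proof instead of once per consumer.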

A verified proof is a fact. Stop re-verifying facts.

Verify once. Cache the result. Serve it to every consumer at 85 nanoseconds.

Install Cachee ZK Caching Guide

Deep Dives