Caching in Cloudflare Workers can be surprisingly complex. With multiple storage options available—Cache API, KV storage, and built-in fetch caching—each with different characteristics, pricing models, and limitations, I found myself writing repetitive caching logic across different projects.
The Problem: Caching Fragmentation
While working on my personal blog and other Cloudflare Workers projects, I encountered several caching challenges:
- Multiple storage options: Cache API, KV, and fetch caching all serve different use cases
- Inconsistent APIs: Each storage option has different interfaces and behaviors
- Pricing considerations: KV has write limits and storage costs, Cache API is free but regional
- Enterprise features: Some advanced caching features require expensive Enterprise plans
I needed a unified interface that could:
- Switch between caching strategies easily
- Handle the different APIs consistently
- Provide clear logging for debugging
- Work within the constraints of each storage option
The Solution: cf-cacher
I built cf-cacher, a flexible caching abstraction that simplifies switching between Cloudflare's caching options. Here's the core interface:
export type CfCacherProps = {
cacheKey: string
cacheName?: string
getRequest?: () => Request | Promise<Request>
getResponse?: () => Promise<Response>
cacheMode: 'fetch-cache' | 'cache-api' | 'kv' | 'none'
cacheTtl?: number
executionCtx?: {
waitUntil(promise: Promise<any>): void
}
kv?: KVNamespace
}
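For context, here's roughly what a call might look like from a Worker's fetch handler. This is only a sketch: the upstream URL and the MY_KV binding are placeholders, and it assumes cfCacher resolves to a Response.
export default {
  async fetch(request: Request, env: { MY_KV: KVNamespace }, ctx: ExecutionContext) {
    // Cache an upstream API response in KV for five minutes
    return cfCacher({
      cacheKey: 'https://api.example.com/data',
      getRequest: () => new Request('https://api.example.com/data'),
      cacheMode: 'kv',
      cacheTtl: 300,
      kv: env.MY_KV,
      executionCtx: ctx,
    })
  },
}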
The function supports four caching modes:
1. Cache API Mode (cache-api)
Uses the standard Cache API with caches.open():
const cache = await caches.open(cacheName ?? 'custom:cache')
let response = await cache.match(cacheKey)
// getFreshResponse is a placeholder for your logic to generate or fetch the uncached response
const getFreshResponse = async () =>
(await getResponse?.()) || (await fetch(await getRequest?.()))
if (!response) {
  response = await getFreshResponse()
  // Re-wrap the response so its headers are mutable (responses from fetch() have immutable headers)
  response = new Response(response.body, response)
  response.headers.set('CDN-Cache-Control', `public, s-maxage=${cacheTtl}`)
  await cache.put(cacheKey, response.clone())
}
Pros:
- Free and unlimited
- Fast response times
- Works with any Workers plan
Cons:
- Only caches in the specific data center where the Worker runs
- No global replication
2. KV Storage Mode (kv)
Stores responses in Cloudflare KV with metadata:
const kvResponse = await kv.getWithMetadata<KvMetadata>(cacheKey, 'stream')
if (kvResponse.value == null) {
// getFreshResponse is a placeholder for your logic to generate or fetch the uncached response
const getFreshResponse = async () =>
(await getResponse?.()) || (await fetch(await getRequest?.()))
response = await getFreshResponse()
await kv.put(cacheKey, response.clone().body, {
expirationTtl: cacheTtl,
metadata: { headers: Object.fromEntries(response.headers.entries()) },
})
} else {
  // Cache hit: rebuild the Response from the stored body stream and the headers saved as metadata
  response = new Response(kvResponse.value, { headers: kvResponse.metadata?.headers })
}
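The KvMetadata type isn't shown in the snippet; given that the response headers are stored as metadata, it's presumably just a flattened headers object, something like:
// Assumed shape of the metadata stored alongside each KV value
type KvMetadata = {
  headers: Record<string, string>
}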
Pros:
- Global replication across all edge locations
- Works with workers.dev projects
- Consistent responses worldwide
Cons:
- 1GB free storage limit, then $0.50/GB
- Write limits on the free tier (1,000 writes per day)
- Higher latency than Cache API
3. Fetch Cache Mode (fetch-cache)
Uses Cloudflare's built-in caching with custom cache keys:
// `request` here is the result of the getRequest prop
const response = await fetch(request, {
  cf: { cacheKey, cacheEverything: true, cacheTtl },
})
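When the request is proxied through your own zone, Cloudflare reports the cache result in the cf-cache-status response header, which is handy for confirming that the custom cache key is actually being hit:
// cf-cache-status is set by Cloudflare on proxied responses (HIT, MISS, EXPIRED, ...)
const cacheStatus = response.headers.get('cf-cache-status')
console.log(`fetch-cache status for ${cacheKey}: ${cacheStatus ?? 'not reported'}`)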
Pros:
- Leverages Cloudflare's global CDN
- Automatic request coalescing
- Tiered caching support
Cons:
- Custom cache keys require Enterprise plan
- The cf caching options only apply to requests to your own zone or to non-Cloudflare origins
Real-World Usage: OG Image Generation
I use cf-cacher for my blog's OG image generation route. The images are expensive to generate (they involve font loading, SVG rendering, and PNG conversion), so caching is crucial:
export async function loader(args: Route.LoaderArgs) {
return cfCacher({
cacheKey: args.request.url,
getResponse: async () => {
const title = new URL(args.request.url).searchParams.get('title') || 'Hello'
return new ImageResponse(
<div style={{ /* styles */ }}>
<div>{title}</div>
</div>,
{ width: 1200, height: 630 }
)
},
executionCtx: args.context.cloudflare.ctx,
cacheMode: 'cache-api', // Cache in the edge data center
cacheTtl: seconds('1 hour')
})
}
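The seconds() helper just converts a human-readable duration into a number of seconds; it isn't part of cf-cacher. A minimal sketch of the idea (a real implementation might come from a utility library instead):
// Hypothetical helper: converts a small set of human-readable durations to seconds
function seconds(duration: '1 hour' | '1 day' | '1 week'): number {
  const table = { '1 hour': 3600, '1 day': 86400, '1 week': 604800 }
  return table[duration]
}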
The caching headers help with debugging:
response.headers.set('x-cache-method', cacheMode)
response.headers.set('x-cache-key', cacheKey)
// 'HIT' or 'MISS', depending on whether the cache lookup succeeded
response.headers.set('x-cache-status', cacheHit ? 'HIT' : 'MISS')
Choosing the Right Cache Mode
Here's my decision matrix for selecting cache modes:
Use Cache API when:
- Content is region-specific
- You need unlimited free storage
- Fast response times are critical
- You're on a budget
Use KV when:
- Global consistency is required
- You have low traffic (to avoid write limits)
- Content changes infrequently
- You need workers.dev compatibility
Use Fetch Cache when:
- You have an Enterprise plan
- You're caching external API responses
- You want automatic request coalescing
- Content is cacheable by HTTP semantics
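Encoded as code, that matrix collapses (very roughly) into something like the following. The option names are made up for illustration; a real decision weighs more factors than two booleans.
// Extremely simplified version of the decision matrix above
function pickCacheMode(opts: {
  needsGlobalConsistency: boolean
  hasEnterprisePlan: boolean
}): CfCacherProps['cacheMode'] {
  if (opts.hasEnterprisePlan) return 'fetch-cache' // custom cache keys are available
  if (opts.needsGlobalConsistency) return 'kv' // replicated to all edge locations
  return 'cache-api' // free, fast, per-data-center
}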
Performance Considerations
I've learned some important performance lessons:
Cache API Optimization
- Use custom cache names to avoid conflicts with the fetch() cache
- Set appropriate CDN-Cache-Control headers
- Use waitUntil() for non-blocking cache writes (see the sketch below)
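The last point is worth illustrating. Instead of awaiting cache.put() before responding, the write can be deferred through the execution context so it doesn't delay the response. A sketch, reusing cacheKey, getFreshResponse, and executionCtx from the snippets above:
// Respond immediately and let the Cache API write finish in the background
const cache = await caches.open('custom:cache')
let response = await cache.match(cacheKey)
if (!response) {
  response = await getFreshResponse()
  // Don't block the response on the cache write
  executionCtx?.waitUntil(cache.put(cacheKey, response.clone()))
}
return response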
KV Optimization
- Stream responses to avoid loading large content into memory
- Store response headers as KV metadata so they can be restored on a cache hit
- Monitor write usage to avoid hitting limits
Debugging
The logging built into cf-cacher has been invaluable:
console.log(
`Executing cfCacher for ${cacheKey} with ${cacheMode} ttl ${cacheTtl} seconds`
)
console.log(`Cache hit for: ${cacheKey}.`)
console.log(
`Response for request url: ${cacheKey} not present in cache. Fetching and caching request.`
)
Future Enhancements
I'm considering several improvements:
- R2 Integration: For large files that exceed KV's 25 MiB per-value limit
- Cache Warming: Background jobs to pre-populate cache
- Smart Mode Selection: Automatically choose based on content size and access patterns
- Metrics Collection: Track hit rates and performance across modes
The Code
The complete implementation is available in my blog's repository under app/server/cf-cacher/. It's designed to be framework-agnostic and can be dropped into any Cloudflare Workers project.
Conclusion
Building cf-cacher has significantly simplified my caching strategy across multiple projects. By abstracting away the differences between Cloudflare's storage options, I can:
- Easily switch between caching strategies as requirements change
- Avoid vendor lock-in to specific storage mechanisms
- Write cleaner, more maintainable caching code
- Make informed decisions about which storage to use based on actual requirements
The key insight was that caching isn't one-size-fits-all—different use cases benefit from different storage mechanisms. Having a unified interface lets me optimize for each specific scenario without rewriting caching logic.
If you're working with Cloudflare Workers and struggling with caching complexity, I hope this approach gives you some ideas for simplifying your own caching layer.