Cold Cache

Caching

A cache with no stored content—every request is a miss. Occurs after a CDN deploy, server restart, new PoP launch, or cache purge. Cold caches temporarily increase origin load and user-facing latency until they warm up.

Updated Mar 9, 2026

Full Explanation

A cold cache is empty. It has nothing stored, so every request becomes a cache miss that goes back to the origin. This is the worst-case scenario for both latency (users wait on origin responses) and origin load (the origin handles 100% of the traffic).

Cold caches happen more often than you'd think: deploying a new edge node, restarting a caching layer, switching CDN providers, or doing a full cache purge. The "warming up" period—where the cache gradually fills with popular content—can last minutes to hours depending on traffic patterns.

To mitigate cold cache impact, you can pre-warm caches by crawling your most popular URLs right after a deploy. Origin shields also help: even if the edge is cold, the shield layer might still have the content cached. And stale-while-revalidate ensures that even after a purge, you can serve slightly old content while fetching fresh copies.
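To make the stale-while-revalidate behavior concrete, here is a minimal sketch of the serving decision a cache makes for an object of a given age. The policy values (max-age=300, stale-while-revalidate=3600) are illustrative, not from the article:

```shell
#!/bin/bash
# Sketch: how a cache decides to serve an object of a given age (seconds)
# under an assumed policy of Cache-Control: max-age=300, stale-while-revalidate=3600
serve_decision() {
    local age=$1 max_age=300 swr=3600
    if (( age <= max_age )); then
        echo "fresh: serve from cache"
    elif (( age <= max_age + swr )); then
        echo "stale: serve from cache, revalidate in background"
    else
        echo "expired: fetch from origin"
    fi
}

serve_decision 120    # within max-age: fresh
serve_decision 900    # past max-age but within the SWR window
serve_decision 5000   # past both: must go back to origin
```

The middle case is what softens a purge: users keep getting (slightly old) cached responses while the cache refills from origin in the background.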

Try the interactive Cache Fundamentals animation in the course to see how a cold cache gradually warms up as requests flow through.

Examples

Pre-warming a cache after deployment:

#!/bin/bash
# Crawl the top 100 URLs to warm the cache.
# %{url_effective} works on all curl versions (%{url} needs curl 7.75+).
while IFS= read -r url; do
    curl -s -o /dev/null -w "%{http_code} %{url_effective}\n" "$url" &
done < <(head -n 100 top-urls.txt)
wait
echo "Cache pre-warm complete"

Monitoring cold cache recovery in Varnish:

# Watch hit rate recover after restart
watch -n 1 'varnishstat -1 | grep -E "(MAIN.cache_hit|MAIN.cache_miss)"'

# Output right after restart (cold):
# MAIN.cache_hit    12     0.40/s
# MAIN.cache_miss  988     32.93/s

# After 10 minutes (warming up):
# MAIN.cache_hit   8523   284.10/s
# MAIN.cache_miss  1477    49.23/s
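A quick way to quantify warm-up progress is the hit ratio, hits / (hits + misses). The sketch below computes it from the "after 10 minutes" counters shown above:

```shell
# Hit ratio from the warming-up counters above
hits=8523
misses=1477
awk -v h="$hits" -v m="$misses" \
    'BEGIN { printf "hit ratio: %.1f%%\n", 100 * h / (h + m) }'
# prints: hit ratio: 85.2%
```

Watching this single number climb back toward its steady-state value is usually easier than eyeballing the raw counters.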


Related CDN concepts include:

  • Cache Fill — The process of populating a cache tier with content from upstream (shield or origin). A …
  • Cache Miss Types — Cold miss: First ever request (unavoidable). Capacity miss: Evicted due to full cache. Invalidation miss: …