
Nginx Caching: proxy_cache Complete Guide

Nginx's proxy cache stores backend responses on disk or in memory so that repeated identical requests are served immediately without hitting the backend. Properly configured, it can reduce response times by orders of magnitude and cut backend load by over 90%.


How Caching Works

[Diagram: Nginx caching flow]

On a cache HIT, Nginx returns the stored response immediately without calling the backend. Only on a MISS or EXPIRED does it forward to the backend and store the response.
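This decision flow can be sketched as a tiny in-memory model (illustrative Python, not how Nginx is implemented): a lookup returns HIT while the entry is fresh, EXPIRED once its TTL has passed, and MISS when the key is absent.

```python
import time

class TinyProxyCache:
    """Toy model of the proxy_cache decision flow (HIT / MISS / EXPIRED)."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (response, stored_at)

    def get(self, key, fetch_backend):
        entry = self.store.get(key)
        now = time.time()
        if entry is not None:
            response, stored_at = entry
            if now - stored_at < self.ttl:
                return response, "HIT"   # fresh: no backend call
            status = "EXPIRED"           # stale: re-request the backend
        else:
            status = "MISS"              # absent: request the backend
        response = fetch_backend()
        self.store[key] = (response, now)  # store/refresh the entry
        return response, status

cache = TinyProxyCache(ttl_seconds=600)
backend = lambda: "products-page-body"
print(cache.get("GET:/api/products", backend)[1])  # MISS
print(cache.get("GET:/api/products", backend)[1])  # HIT
```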


Basic proxy_cache Configuration

Step 1: Define a Cache Zone

# /etc/nginx/nginx.conf - add to the http block

http {
    # Cache zone definition
    # levels=1:2             : directory hierarchy (spreads files)
    # keys_zone=my_cache:10m : 10 MB for cache keys (~80,000 entries)
    # max_size=1g            : maximum cache disk size
    # inactive=60m           : evict entries not accessed for 60 minutes
    # use_temp_path=off      : write directly (better performance)
    proxy_cache_path /var/cache/nginx
                     levels=1:2
                     keys_zone=my_cache:10m
                     max_size=1g
                     inactive=60m
                     use_temp_path=off;
}
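As a rough sizing check (the nginx docs state that one megabyte of keys_zone holds about 8,000 keys), the 10m zone above gives on the order of 80,000 entries:

```python
# Per the nginx docs, 1 MB of keys_zone stores roughly 8,000 keys.
KEYS_PER_MB = 8000

def keys_zone_capacity(megabytes):
    return megabytes * KEYS_PER_MB

print(keys_zone_capacity(10))  # 80000
```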

Step 2: Enable the Cache in a server Block

server {
    listen 443 ssl;
    server_name example.com;

    location / {
        proxy_pass http://backend;

        # Cache zone name (defined above)
        proxy_cache my_cache;

        # Cache key: unique string to identify each request
        proxy_cache_key "$scheme$request_method$host$request_uri";

        # Cache TTL per status code
        proxy_cache_valid 200 302 10m;  # success: 10 minutes
        proxy_cache_valid 301 1h;       # permanent redirect: 1 hour
        proxy_cache_valid any 1m;       # everything else (404, etc.): 1 min

        # Expose cache status in response header (for debugging)
        add_header X-Cache-Status $upstream_cache_status;
    }
}
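On disk, Nginx names each cache file after the MD5 of the cache key, and levels=1:2 builds subdirectories from the end of that hash (path layout as described in the nginx docs). A sketch of the mapping, using a key built like the proxy_cache_key above:

```python
import hashlib
import os

def cache_file_path(cache_dir, key, levels=(1, 2)):
    """Return the on-disk path Nginx would use for a cache key with levels=1:2."""
    digest = hashlib.md5(key.encode()).hexdigest()
    # Subdirectory names are taken from the *end* of the hash:
    # levels=1:2 -> the last char, then the 2 chars before it.
    parts, pos = [], len(digest)
    for n in levels:
        parts.append(digest[pos - n:pos])
        pos -= n
    return os.path.join(cache_dir, *parts, digest)

# Key shaped like proxy_cache_key "$scheme$request_method$host$request_uri"
key = "httpsGETexample.com/api/products"
print(cache_file_path("/var/cache/nginx", key))
```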

Cache Status ($upstream_cache_status)

Value        Meaning
HIT          Served from cache (no backend call)
MISS         Not cached → backend called, response cached
EXPIRED      Cache expired → backend re-requested
BYPASS       Cache bypassed (conditional bypass rule)
STALE        Expired cache served (on backend error)
UPDATING     Cache refreshing → previous cached copy served
REVALIDATED  304 Not Modified → existing cache confirmed valid

Cache-Control Header Integration

Nginx can respect Cache-Control headers returned by the backend.

location / {
    proxy_pass http://backend;
    proxy_cache my_cache;

    # Ignore backend Cache-Control and force caching (caution!)
    proxy_ignore_headers Cache-Control Expires;

    # Or respect backend Cache-Control (default behavior)
    # proxy_cache_valid 200 10m;  # ← only applies when Cache-Control is absent
}

Controlling caching from Spring Boot:

@GetMapping("/api/products")
public ResponseEntity<List<Product>> getProducts() {
    List<Product> products = productService.findAll();
    return ResponseEntity.ok()
            .cacheControl(CacheControl.maxAge(10, TimeUnit.MINUTES)) // Nginx caches 10 min
            .body(products);
}

@GetMapping("/api/user/profile")
public ResponseEntity<User> getProfile() {
    // User-specific data must not be cached
    return ResponseEntity.ok()
            .cacheControl(CacheControl.noStore())
            .body(userService.getCurrentUser());
}

Cache Bypass Configuration

Requests from authenticated users and write methods (POST/PUT/DELETE) must bypass the cache.

# Map request method to a bypass flag
map $request_method $bypass_cache {
    POST    1;
    PUT     1;
    DELETE  1;
    default 0;
}

server {
    location /api/ {
        proxy_pass http://backend;
        proxy_cache my_cache;

        # Bypass when POST/PUT/DELETE or a session cookie exists
        proxy_cache_bypass $bypass_cache $cookie_session_id;
        proxy_no_cache $bypass_cache $cookie_session_id;

        proxy_cache_valid 200 5m;
    }

    # Long-lived cache for static assets
    location ~* \.(css|js|png|jpg|gif|ico|woff2)$ {
        proxy_pass http://backend;
        proxy_cache my_cache;
        proxy_cache_valid 200 7d;  # 1 week
        expires 7d;
        add_header Cache-Control "public, immutable";
    }
}
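The map plus cookie check amounts to a simple predicate. An illustrative model (the method set and cookie name mirror the config above; per the proxy_cache_bypass rules, any value that is non-empty and not "0" triggers a bypass):

```python
BYPASS_METHODS = {"POST", "PUT", "DELETE"}  # mirrors the map block

def should_bypass_cache(method, cookies):
    """True when the request must skip the cache (and not be stored)."""
    # proxy_cache_bypass / proxy_no_cache fire when at least one value
    # is non-empty and not "0".
    return (method in BYPASS_METHODS
            or cookies.get("session_id", "") not in ("", "0"))

print(should_bypass_cache("GET", {}))                      # False -> cacheable
print(should_bypass_cache("POST", {}))                     # True
print(should_bypass_cache("GET", {"session_id": "abc1"}))  # True
```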

Stale-While-Revalidate

Serve the previous cached response while the backend refreshes it.

location / {
    proxy_pass http://backend;
    proxy_cache my_cache;
    proxy_cache_valid 200 10m;

    # After 10 min: on errors/timeouts, or while an update is in progress,
    # serve the stale copy (it remains on disk until evicted, e.g. by inactive=60m)
    proxy_cache_use_stale error timeout updating
                          http_500 http_502 http_503 http_504;

    # Refresh the cache in the background (caller gets stale immediately)
    proxy_cache_background_update on;

    # Send only 1 refresh request per cache key at a time
    proxy_cache_lock on;
    proxy_cache_lock_timeout 5s;
}
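The resulting behavior can be summarized as a small decision function (an illustrative model of one request, using the TTL configured above; not Nginx internals):

```python
def serve_decision(age_s, ttl_s=600, backend_ok=True, refresh_in_progress=False):
    """What a stale-aware cache serves for one request (toy model)."""
    if age_s < ttl_s:
        return "HIT"        # fresh entry, no backend call
    if refresh_in_progress:
        return "UPDATING"   # proxy_cache_background_update: serve stale now
    if not backend_ok:
        return "STALE"      # proxy_cache_use_stale on backend errors
    return "EXPIRED"        # refetch from the backend, then re-cache

print(serve_decision(age_s=30))                             # HIT
print(serve_decision(age_s=700, refresh_in_progress=True))  # UPDATING
print(serve_decision(age_s=700, backend_ok=False))          # STALE
print(serve_decision(age_s=700))                            # EXPIRED
```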

Cache Invalidation (Purge)

Remove a cached response immediately when content changes.

Option 1: proxy_cache_purge Module (Nginx Plus or ngx_cache_purge)

location ~ /purge(/.*) {
    allow 127.0.0.1;
    allow 10.0.0.0/8;
    deny all;

    proxy_cache_purge my_cache "$scheme$request_method$host$1";
}

# Purge the cached copy of /api/products (a plain GET, so the purge key
# matches the GET entry stored under the proxy_cache_key above)
curl https://example.com/purge/api/products

Option 2: Delete Cache Files Directly

# Clear entire cache directory (no Nginx restart needed)
find /var/cache/nginx -type f -delete

# Find and delete the cache file for a specific URL
# (the cache key is stored in the file header, so grep can locate it)
grep -lr "example.com/api/products" /var/cache/nginx | xargs rm -f
Option 3: Versioned URLs (Cache Busting)

Rather than purging, change the URL whenever the asset changes; the old cache entry is simply never requested again and ages out.

<script src="/static/app.js?v=20240315"></script>
<link rel="stylesheet" href="/static/style.css?v=1.2.3">
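The version string can be anything that changes with the file. One common approach, sketched here, derives it from a short content hash:

```python
import hashlib

def versioned_url(path, contents):
    """Append a short content hash so a changed file gets a new, uncached URL."""
    v = hashlib.sha256(contents).hexdigest()[:8]
    return f"{path}?v={v}"

print(versioned_url("/static/app.js", b"console.log('hello');"))
```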

Cache Performance Monitoring

# Check cache directory size
du -sh /var/cache/nginx/

# Analyze HIT/MISS ratio ($upstream_cache_status is the second-to-last
# field of the cache_log format below)
awk '{print $(NF-1)}' /var/log/nginx/access.log | sort | uniq -c | sort -rn
#   12453 "HIT"
#    1230 "MISS"
#      89 "EXPIRED"

# Add cache status to the log format
log_format cache_log '$remote_addr - [$time_local] "$request" '
                     '$status $body_bytes_sent '
                     '"$upstream_cache_status" $request_time';

access_log /var/log/nginx/access.log cache_log;
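The same tally the awk pipeline produces can be computed in a few lines (illustrative; assumes log lines follow the cache_log format above, where the quoted cache status is the second-to-last field):

```python
from collections import Counter

def cache_status_counts(log_lines):
    """Tally $upstream_cache_status, the 2nd-to-last field of cache_log lines."""
    return Counter(line.split()[-2].strip('"') for line in log_lines)

logs = [
    '10.0.0.1 - [15/Mar/2024:10:00:00 +0000] "GET /api/products HTTP/1.1" 200 512 "HIT" 0.001',
    '10.0.0.2 - [15/Mar/2024:10:00:01 +0000] "GET /api/users HTTP/1.1" 200 256 "MISS" 0.120',
    '10.0.0.1 - [15/Mar/2024:10:00:02 +0000] "GET /api/products HTTP/1.1" 200 512 "HIT" 0.001',
]
counts = cache_status_counts(logs)
print(counts["HIT"], counts["MISS"])  # 2 1
hit_ratio = counts["HIT"] / sum(counts.values())
print(f"{hit_ratio:.0%}")  # 67%
```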

Cache Design Guidelines

Resource Type                  Cache TTL         Notes
Static files (css, js, img)    7 days to 1 year  Use versioned URLs
API list / search results      1 to 10 minutes   Low real-time requirement
User-specific data             No cache          proxy_no_cache
POST / PUT / DELETE            No cache          Not cached by default (GET/HEAD only)
Health check endpoints         No cache          Must reflect real-time state