If you’re building JSON-RPC APIs, GraphQL endpoints, or any POST-based API on Vercel with Astro (or Next.js), you might encounter a frustrating caching issue where different requests return the same cached response. This post documents the problem, what doesn’t work, and the actual solution.
The Problem
I was building A2A (Agent-to-Agent) protocol endpoints for this blog—JSON-RPC 2.0 APIs that let AI agents query blog content. The endpoints worked perfectly in local development, but in production on Vercel, something strange happened:
# Request 1: Get site metadata
curl -X POST 'https://example.com/api/a2a/blog/metadata' \
-H 'Content-Type: application/json' \
-d '{"jsonrpc":"2.0","id":1,"params":{"type":"site"}}'
# Returns: {"name": "My Blog", "description": "..."} ✓ Correct!
# Request 2: Get stats (different params, same URL)
curl -X POST 'https://example.com/api/a2a/blog/metadata' \
-H 'Content-Type: application/json' \
-d '{"jsonrpc":"2.0","id":2,"params":{"type":"stats"}}'
# Returns: {"name": "My Blog", "description": "..."} ✗ Wrong! Should be stats!
The second request returned the exact same response as the first, even though the POST body was completely different. The type parameter was being ignored.
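For context, the endpoints in question look roughly like the sketch below: an Astro API route that branches on a field in the POST body. The file path, response shapes, and type values are illustrative and mirror the curl examples above, not the exact code behind this site.
// src/pages/api/a2a/blog/metadata.ts (illustrative sketch)
import type { APIRoute } from 'astro'

export const POST: APIRoute = async ({ request }) => {
  const body = await request.json()
  const type = body?.params?.type

  // Same URL, different bodies: the response depends entirely on `type`
  const result =
    type === 'stats'
      ? { posts: 42, tags: 12 }
      : { name: 'My Blog', description: '...' }

  return new Response(
    JSON.stringify({ jsonrpc: '2.0', id: body?.id ?? null, result }),
    { headers: { 'Content-Type': 'application/json' } },
  )
}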
Identifying the Symptoms
Here’s how to tell if you’re hitting this issue:
1. Check the Response Headers
curl -i -X POST 'https://your-site.com/api/endpoint' \
-H 'Content-Type: application/json' \
-d '{"your":"payload"}'
Look for these headers:
- x-vercel-cache: HIT — Vercel served this response from the cache
- age: 47 — the response has been cached for 47 seconds
If you see HIT on POST requests that should return different data based on the body, you’ve found the problem.
2. Different POST Bodies, Same Response
The clearest symptom: sending different JSON payloads to the same URL endpoint returns identical responses after the first request.
What Doesn’t Work
I spent hours trying various approaches. Here’s what didn’t fix the issue:
1. Cache-Control Headers
headers.set('Cache-Control', 'private, no-store, no-cache, must-revalidate');
headers.set('Pragma', 'no-cache');
headers.set('Expires', '0');
Vercel’s ISR ignores these headers when deciding whether to cache the response.
2. CDN-Specific Headers
headers.set('CDN-Cache-Control', 'no-store');
headers.set('Surrogate-Control', 'no-store');
headers.set('Vercel-CDN-Cache-Control', 'no-store');
Also ignored by ISR.
3. The x-vercel-no-cache Header
headers.set('x-vercel-no-cache', '1');
This header is supposed to tell Vercel not to cache the response. It does show up in the response headers, but ISR caches the result anyway.
4. Cache-Busting Query Parameters
curl -X POST 'https://example.com/api/endpoint?_cb=unique123'
ISR explicitly ignores query parameters when determining cache keys. From the Astro Vercel adapter docs:
ISR function requests exclude search parameters, similar to static mode behavior.
5. vercel.json Headers Configuration
{
"headers": [
{
"source": "/api/a2a/blog/(.*)",
"headers": [
{ "key": "Cache-Control", "value": "no-store" },
{ "key": "x-vercel-no-cache", "value": "1" }
]
}
]
}
Headers in vercel.json also don’t override ISR behavior.
6. Wildcard ISR Exclusions
// astro.config.mjs
adapter: vercel({
isr: {
exclude: ['/api/a2a/*'], // Doesn't match nested paths!
},
})
The wildcard pattern /api/a2a/* didn’t match /api/a2a/blog/metadata. Astro/Vercel’s pattern matching is more literal than expected.
The Actual Solution
The fix is to explicitly list each route in the ISR exclude configuration:
// astro.config.mjs
import { defineConfig } from "astro/config"
import vercel from "@astrojs/vercel"
export default defineConfig({
output: "server",
adapter: vercel({
isr: {
// Exclude API routes that depend on POST body content
// Each request must be processed independently
exclude: [
'/api/a2a/blog/list',
'/api/a2a/blog/get',
'/api/a2a/blog/search',
'/api/a2a/blog/metadata',
'/api/a2a/blog/author',
'/api/a2a/blog/related',
'/api/a2a/blog/tags',
'/api/a2a/service',
],
},
}),
})
After deploying this change and purging the CDN cache:
vercel cache purge --type cdn --yes
Each POST request is now processed independently, and the correct response is returned based on the actual request body.
Why This Happens
Vercel’s ISR (Incremental Static Regeneration) is designed to cache responses for better performance. The cache key is based on:
- URL path — /api/a2a/blog/metadata
- HTTP method — POST (yes, ISR caches POST requests!)
What ISR does NOT consider for the cache key:
- Request body content
- Query string parameters
- Cache-Control headers
- Custom headers like x-vercel-no-cache
This makes sense for traditional page rendering where the URL determines the content. But for JSON-RPC, GraphQL, or any API where the POST body determines the response, this behavior breaks everything.
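To make that concrete, here is a rough model of the two keying strategies. This is not Vercel’s actual implementation, just a sketch of why a body-dependent endpoint cannot work with a key built only from the path and method:
// Rough model only; not Vercel's actual cache implementation
import { createHash } from 'node:crypto'

// What ISR effectively keys on: path + method
function isrCacheKey(req: { method: string; path: string }): string {
  return `${req.method} ${req.path}`
}

// What a body-aware cache would need: path + method + a digest of the body
function bodyAwareCacheKey(req: { method: string; path: string; body: string }): string {
  const bodyDigest = createHash('sha256').update(req.body).digest('hex')
  return `${req.method} ${req.path} ${bodyDigest}`
}

// Both metadata requests from the top of this post collapse to the same ISR-style key,
// "POST /api/a2a/blog/metadata", even though their body-aware keys would differ.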
When You Need This Fix
You need to exclude routes from ISR if your endpoint:
- Uses JSON-RPC where the method/params are in the POST body
- Implements GraphQL where queries are POST payloads (see the illustration after this list)
- Has any POST API where the response depends on the request body
- Returns different data for the same URL based on request content
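The GraphQL case is the easiest to picture: two different queries go to the same URL, so ISR's path-plus-method key treats them as the same request. A minimal illustration, with a hypothetical endpoint URL and queries:
// Two different GraphQL operations, one URL: ISR would key both as "POST /api/graphql"
const endpoint = 'https://example.com/api/graphql' // hypothetical endpoint

async function gql(query: string) {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query }),
  })
  return res.json()
}

const posts = await gql('{ posts { title } }')
const tags = await gql('{ tags { name } }')
// With ISR caching the route, the second call would be served the first call's cached response.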
Verifying the Fix
After deploying, verify with:
# Check that x-vercel-cache shows MISS (not HIT)
curl -i -X POST 'https://your-site.com/api/endpoint' \
-H 'Content-Type: application/json' \
-d '{"your":"payload"}'
# Look for:
# x-vercel-cache: MISS
# (or no x-vercel-cache header at all)
You can also run the same request twice with different payloads and confirm you get different responses.
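If you would rather script that check, here is a small sketch using Node's built-in fetch (run with tsx or similar); the URL and payloads are placeholders for your own endpoint:
// verify-no-post-cache.ts: send two different payloads and compare the responses
const url = 'https://your-site.com/api/endpoint' // replace with your endpoint

async function call(payload: unknown) {
  const res = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  })
  return { cache: res.headers.get('x-vercel-cache'), body: await res.text() }
}

const first = await call({ jsonrpc: '2.0', id: 1, params: { type: 'site' } })
const second = await call({ jsonrpc: '2.0', id: 2, params: { type: 'stats' } })

console.log('x-vercel-cache:', first.cache, second.cache) // expect MISS or no header, never HIT
console.log('responses differ:', first.body !== second.body) // expect true once the fix is live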
For Next.js Users
If you’re using Next.js instead of Astro, the equivalent solution is to configure the route segment:
// app/api/your-endpoint/route.ts
export const dynamic = 'force-dynamic'
export const revalidate = 0
export async function POST(request: Request) {
const body = await request.json()
// Your handler logic goes here; the response is computed per request, never served from cache
return Response.json({ ok: true, received: body })
}
Or in next.config.js for specific paths.
Performance Considerations
Excluding routes from ISR means every request hits your serverless function. For high-traffic APIs, consider:
- Implement your own caching — Use Redis, Upstash, or in-memory caching based on the actual request content (see the sketch after this list)
- Use Edge Functions — They have different caching behavior and may work better for your use case
- Add rate limiting — Protect against abuse since requests aren’t cached
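As a sketch of the first option, here is a minimal Astro API route that caches per request body in a process-local Map. The TTL, key scheme, and handleRpc helper are illustrative assumptions; a shared store like Redis or Upstash is what you would actually want across serverless instances.
// Illustrative sketch: per-instance cache keyed on a hash of the POST body.
// In a serverless environment this only survives warm invocations;
// swap the Map for Redis or Upstash if you need a shared cache.
import type { APIRoute } from 'astro'
import { createHash } from 'node:crypto'

const cache = new Map<string, { body: string; expires: number }>()
const TTL_MS = 60_000 // arbitrary example TTL

export const POST: APIRoute = async ({ request }) => {
  const raw = await request.text()
  const key = createHash('sha256').update(raw).digest('hex')

  const cached = cache.get(key)
  if (cached && cached.expires > Date.now()) {
    return new Response(cached.body, { headers: { 'Content-Type': 'application/json' } })
  }

  const result = await handleRpc(JSON.parse(raw)) // handleRpc stands in for your real JSON-RPC handler
  const body = JSON.stringify(result)
  cache.set(key, { body, expires: Date.now() + TTL_MS })

  return new Response(body, { headers: { 'Content-Type': 'application/json' } })
}

// Placeholder so the sketch is self-contained
async function handleRpc(rpcRequest: unknown): Promise<unknown> {
  return { jsonrpc: '2.0', result: rpcRequest }
}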
For my A2A endpoints, the trade-off is worth it. Correctness is more important than caching responses that depend on request body content.
Conclusion
Vercel’s ISR is powerful for static and semi-static content, but it’s a footgun for POST-based APIs. The key insight is that ISR caches by URL path only—it completely ignores what’s in your POST body.
The fix is simple once you know it: explicitly exclude your API routes from ISR in your Astro (or Next.js) configuration.
I hope this saves someone else the hours of debugging I went through. If you’re building JSON-RPC or similar APIs on Vercel, check your ISR configuration first!
Have questions or found another edge case? Feel free to reach out on Twitter or LinkedIn.