Steady-state is what you get forever. Burst is a 30-second window of 3× capacity — useful for backfills, imports, and pager-alert-driven sync. Limit is per dealership, shared across every key you've issued.
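The steady-state/burst split behaves like a token bucket: the bucket refills at the steady rate, and its capacity is the burst headroom. Here is a sketch of those semantics, not the service's actual implementation; the bucket size is a back-of-envelope figure derived from the documented numbers (1,000 requests/min steady, 3x capacity over a 30-second window).

```typescript
// Token-bucket sketch of the limiter's semantics.
// 3x steady for 30 s allows ~1,500 requests in the window; steady refill
// supplies ~500 of those, so the bucket holds roughly 1,000 tokens.
class TokenBucket {
  private tokens: number;
  constructor(
    private readonly ratePerSec: number, // steady-state refill rate
    private readonly burst: number,      // bucket capacity (burst headroom)
    private last = Date.now(),           // last refill timestamp (ms)
  ) {
    this.tokens = burst;
  }

  // Returns true if a request may proceed now, false if it should wait.
  tryTake(now = Date.now()): boolean {
    const elapsed = (now - this.last) / 1000;
    this.last = now;
    this.tokens = Math.min(this.burst, this.tokens + elapsed * this.ratePerSec);
    if (this.tokens < 1) return false;
    this.tokens -= 1;
    return true;
  }
}

// 1,000/min steady is ~16.7 tokens/sec; ~1,000 tokens of burst headroom.
const bucket = new TokenBucket(1000 / 60, 1000);
```

Because the limit is per dealership, one bucket should gate all keys you've issued for that dealership, not one bucket per key.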
Four headers. Two are always present; two appear only on 429s. Watch `x-ratelimit-remaining` — when it gets low, slow down.
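Proactive throttling off those headers can be sketched as a pure pacing function. The 50-request threshold and the function name are assumptions for illustration, not part of the API.

```typescript
// Decide how long to pause before the next call, given the budget headers.
// The 50-request threshold is an arbitrary assumption, not from the docs.
function pauseMs(remaining: number, resetUnixSec: number, nowMs = Date.now()): number {
  if (remaining >= 50) return 0; // plenty of budget left: full speed
  const secondsLeft = Math.max(1, resetUnixSec - nowMs / 1000);
  // Spread the remaining budget evenly across the rest of the window.
  return (secondsLeft / Math.max(1, remaining)) * 1000;
}

// Hypothetical wiring against a fetch-style response:
//   const remaining = Number(res.headers.get("x-ratelimit-remaining"));
//   const reset = Number(res.headers.get("x-ratelimit-reset")); // unix seconds
//   await new Promise((r) => setTimeout(r, pauseMs(remaining, reset)));
```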
You get back a 429, a Retry-After, and a small explanatory body. Don't just sleep and retry once — back off exponentially with jitter, cap at 32 seconds.
# Response headers

```http
HTTP/2 429
retry-after: 12
x-ratelimit-limit: 1000
x-ratelimit-remaining: 0
x-ratelimit-reset: 1713800412
```

```json
{
  "error": {
    "code": "rate_limited",
    "message": "Slow down."
  }
}
```

```js
// Respect Retry-After, then back off exponentially with jitter, capped at 32 s.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function call(req, attempt = 0) {
  const r = await fetch(req);
  if (r.status !== 429) return r;
  const after = Number(r.headers.get('retry-after') ?? 1) || 1; // seconds; default 1
  const jitter = Math.random() * 0.3;
  const wait = Math.min(32, after * (1 + jitter) * 2 ** attempt);
  await sleep(wait * 1000);
  return call(req, attempt + 1);
}
```
Separate bucket. Up to 200 deliveries/sec per dealership. Fan-out to multiple endpoints does not multiply cost.
/v1/hulls/batch accepts up to 500 items and counts as one request. Use bulk for migrations and nightly syncs.
Fetch in chunks of 250. On Dock that's 4 requests/sec sustained — well under 1,000/min with room to spare.
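A bulk sync following the two points above can be sketched like this; `postBatch` is a hypothetical stand-in for however you call `/v1/hulls/batch`, while the chunk size (250) and the 4 requests/sec pacing come from the docs.

```typescript
// Split an array into fixed-size chunks.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

// Send records in 250-item batches, one batch request every 250 ms (4 req/s).
// `postBatch` is a hypothetical HTTP client, e.g. a POST to /v1/hulls/batch.
async function bulkSync<T>(
  records: T[],
  postBatch: (batch: T[]) => Promise<unknown>,
): Promise<void> {
  for (const batch of chunk(records, 250)) {
    await postBatch(batch);
    await sleep(250); // 4 requests/sec sustained, well under 1,000/min
  }
}
```

Each batch counts as one request against the limit, so a million-record migration is 4,000 requests, not a million.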
The official TypeScript and Python SDKs handle all of this automatically: they inspect headers, track remaining budget, back off on 429s, and expose an optional concurrency limiter.