Rate Limits
Turnpike implements rate limiting to ensure fair usage and maintain service quality for all users.
Rate Limit Tiers

| Plan | Requests per Minute | Burst Limit |
| --- | --- | --- |
| Standard | 100 | 120 |
| Premium | 1,000 | 1,200 |
| Enterprise | Custom | Custom |
Rate Limit Headers
Every API response includes rate limit information in the headers:
```
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 87
X-RateLimit-Reset: 1697234567
```

- `X-RateLimit-Limit`: Maximum requests allowed in the current window
- `X-RateLimit-Remaining`: Requests remaining in the current window
- `X-RateLimit-Reset`: Unix timestamp when the rate limit resets
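A client can use the reset header to pause until the window rolls over. A minimal sketch, assuming `X-RateLimit-Reset` is a Unix timestamp in seconds (as the example value above suggests); `msUntilReset` is an illustrative helper, not part of any SDK:

```javascript
// Compute how long to wait (in milliseconds) until the rate-limit
// window resets. Assumes the header value is a Unix timestamp in seconds.
function msUntilReset(resetHeader, nowMs = Date.now()) {
  const resetMs = parseInt(resetHeader, 10) * 1000;
  // Never return a negative wait if the reset time has already passed.
  return Math.max(0, resetMs - nowMs);
}
```

Pair this with `X-RateLimit-Remaining` to sleep only when the window is actually exhausted.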
Handling Rate Limits
Rate Limit Exceeded Response
When you exceed your rate limit, you'll receive a 429 Too Many Requests response:
```json
{
  "success": false,
  "error": {
    "code": "RATE_LIMIT_EXCEEDED",
    "message": "Rate limit exceeded. Please retry after 30 seconds.",
    "retryAfter": 30
  },
  "timestamp": 1697234567890
}
```

Best Practices
1. Implement Exponential Backoff
```javascript
async function makeRequestWithBackoff(url, options, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    const response = await fetch(url, options);
    if (response.status === 429) {
      // Prefer the server's Retry-After header (seconds); otherwise
      // fall back to exponential backoff: 1s, 2s, 4s, ...
      const retryAfter = Number(response.headers.get('Retry-After')) || Math.pow(2, i);
      await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
      continue;
    }
    return response;
  }
  throw new Error('Max retries exceeded');
}
```

2. Monitor Rate Limit Headers
```javascript
function checkRateLimit(response) {
  const remaining = parseInt(response.headers.get('X-RateLimit-Remaining'), 10);
  const limit = parseInt(response.headers.get('X-RateLimit-Limit'), 10);
  // Warn when fewer than 10% of the window's requests remain.
  if (remaining < limit * 0.1) {
    console.warn('Approaching rate limit:', remaining, 'requests remaining');
  }
}
```

3. Batch Requests When Possible
Instead of making multiple individual requests, batch them when the API supports it:
```javascript
// Bad: one request per mint
for (const mint of mints) {
  await getTokenInfo(mint);
}

// Good: a single batched request (if supported)
await getTokenInfoBatch(mints);
```

4. Cache Responses
Cache API responses when data doesn't change frequently:
```javascript
const cache = new Map();

async function getTokenInfoCached(mint, ttl = 60000) {
  const cached = cache.get(mint);
  // Serve from cache while the entry is younger than the TTL (in ms).
  if (cached && Date.now() - cached.timestamp < ttl) {
    return cached.data;
  }
  const data = await getTokenInfo(mint);
  cache.set(mint, { data, timestamp: Date.now() });
  return data;
}
```

WebSocket Rate Limits
WebSocket connections have separate rate limits:
| Plan | Concurrent Connections | Subscriptions per Connection |
| --- | --- | --- |
| Standard | 5 | 50 |
| Premium | 20 | 200 |
| Enterprise | Custom | Custom |
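Because connections are capped separately from HTTP requests, it usually pays to multiplex many subscriptions over one shared connection rather than opening a connection per token. A minimal sketch; the `SubscriptionPool` helper and its default cap are illustrative assumptions, not part of the Turnpike API:

```javascript
// Sketch: track active channel subscriptions on one shared WebSocket
// connection instead of opening a new connection per token.
// `maxSubscriptions` is an assumed client-side cap, not a documented value.
class SubscriptionPool {
  constructor(maxSubscriptions = 50) {
    this.maxSubscriptions = maxSubscriptions;
    this.active = new Set();
  }

  // Returns true if the channel was newly added, false if already active.
  subscribe(channel) {
    if (this.active.has(channel)) return false;
    if (this.active.size >= this.maxSubscriptions) {
      throw new Error('Subscription cap reached; unsubscribe a channel first');
    }
    this.active.add(channel);
    return true;
  }

  unsubscribe(channel) {
    return this.active.delete(channel);
  }
}
```

Check the pool before sending a subscribe message so the server never sees you exceed your plan's limits.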
Endpoint-Specific Limits
Some endpoints have additional rate limits:
Trading Endpoints
- /api/trade/buy and /api/trade/sell are limited to 10 trades/minute on the Standard plan
- No additional limits on Premium and Enterprise plans
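To avoid burning retries on 429 responses, a client can enforce the trade cap locally before sending. A minimal sliding-window sketch; `SlidingWindowLimiter` is an illustrative helper under the Standard-plan assumption of 10 trades/minute, not part of any Turnpike SDK:

```javascript
// Client-side sliding-window limiter: allow at most `maxRequests` calls
// in any rolling `windowMs` period (defaults match the Standard-plan
// cap of 10 trades per minute described above).
class SlidingWindowLimiter {
  constructor(maxRequests = 10, windowMs = 60000) {
    this.maxRequests = maxRequests;
    this.windowMs = windowMs;
    this.timestamps = [];
  }

  // Returns true and records the call if it fits in the window,
  // false if the caller should wait before sending.
  tryAcquire(now = Date.now()) {
    // Drop timestamps that have aged out of the rolling window.
    this.timestamps = this.timestamps.filter(t => now - t < this.windowMs);
    if (this.timestamps.length >= this.maxRequests) return false;
    this.timestamps.push(now);
    return true;
  }
}
```

Call `tryAcquire()` before each buy/sell request and queue or delay the trade when it returns false.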
Data Endpoints
- Token info and portfolio endpoints follow the standard rate limits
- No endpoint-specific restrictions
Upgrading Your Plan
If you consistently hit rate limits, consider upgrading:
1. Visit turnpike.dev/pricing
2. Compare plans and select one that fits your needs
3. Upgrade instantly with no downtime
4. Your new rate limits take effect immediately
Monitoring Usage
Track your API usage in the dashboard:
1. Log in to turnpike.dev
2. Navigate to "API Usage"
3. View real-time metrics and historical data
4. Set up alerts for approaching limits
Enterprise Plans
Need custom rate limits? Contact our sales team for enterprise pricing:
- Custom rate limits tailored to your needs
- Dedicated infrastructure
- Priority support
- SLA guarantees
Contact: [email protected]