
## Overview

The Vepler API implements rate limiting to ensure fair usage and maintain service stability. Rate limits are applied per API key and vary based on your subscription plan.

## Rate Limit Tiers

| Plan | Requests/Hour | Requests/Second | Concurrent Requests |
|------------|---------------|-----------------|---------------------|
| Free | 100 | 2 | 2 |
| Starter | 1,000 | 10 | 5 |
| Growth | 5,000 | 50 | 10 |
| Scale | 20,000 | 100 | 25 |
| Enterprise | Custom | Custom | Custom |
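As a rough guide, the tier figures above translate into a minimum spacing between requests for a client that wants to pace itself. A minimal sketch (the plan figures are copied from the table; the helper itself is illustrative, not part of the SDK):

```javascript
// Per-plan limits, taken from the tiers table above
const PLAN_LIMITS = {
  free:    { perHour: 100,   perSecond: 2,   concurrent: 2 },
  starter: { perHour: 1000,  perSecond: 10,  concurrent: 5 },
  growth:  { perHour: 5000,  perSecond: 50,  concurrent: 10 },
  scale:   { perHour: 20000, perSecond: 100, concurrent: 25 },
};

// Minimum delay (ms) between requests that respects both windows.
// The hourly window is the binding constraint whenever
// perHour / 3600 is smaller than perSecond.
function minIntervalMs(plan) {
  const { perHour, perSecond } = PLAN_LIMITS[plan];
  const sustainedRate = perHour / 3600; // requests per second, sustained
  return 1000 / Math.min(perSecond, sustainedRate);
}
```

On the Free plan, for example, the hourly cap dominates: 100 requests/hour works out to one request every 36 seconds if spread evenly.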

## Rate Limit Headers

Every API response includes headers with rate limit information:

```
X-RateLimit-Limit: 1000
X-RateLimit-Remaining: 950
X-RateLimit-Reset: 1640995200
X-RateLimit-Reset-After: 3600
X-RateLimit-Bucket: property-read
```

| Header | Description |
|--------|-------------|
| `X-RateLimit-Limit` | Maximum requests allowed in the current window |
| `X-RateLimit-Remaining` | Requests remaining in the current window |
| `X-RateLimit-Reset` | Unix timestamp when the rate limit resets |
| `X-RateLimit-Reset-After` | Seconds until the rate limit resets |
| `X-RateLimit-Bucket` | Rate limit bucket identifier |
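Header values arrive as strings, so it helps to normalise them into numbers in one place. A small sketch (the helper name is illustrative, not part of the SDK):

```javascript
// Parse the X-RateLimit-* headers from a response into usable values.
// `headers` is a plain object keyed by lowercased header name.
function parseRateLimitHeaders(headers) {
  return {
    limit: Number(headers['x-ratelimit-limit']),
    remaining: Number(headers['x-ratelimit-remaining']),
    reset: Number(headers['x-ratelimit-reset']),            // Unix timestamp (seconds)
    resetAfter: Number(headers['x-ratelimit-reset-after']), // seconds
    bucket: headers['x-ratelimit-bucket'],
  };
}
```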

## Rate Limit Buckets

Different endpoints have separate rate limit buckets:

| Bucket | Endpoints | Multiplier |
|--------|-----------|------------|
| `property-read` | Property GET endpoints | 1x |
| `property-write` | Property POST/PUT endpoints | 0.5x |
| `planning-read` | Planning GET endpoints | 1x |
| `query` | Complex query endpoints | 0.25x |
| `export` | Data export endpoints | 0.1x |
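Assuming the multiplier scales your plan's base hourly limit (so a 0.5x bucket allows half as many requests per hour — check your dashboard for the exact semantics), effective per-bucket limits could be derived like this:

```javascript
// Multipliers copied from the buckets table above
const BUCKET_MULTIPLIERS = {
  'property-read': 1,
  'property-write': 0.5,
  'planning-read': 1,
  'query': 0.25,
  'export': 0.1,
};

// Effective hourly limit for a bucket, assuming the multiplier
// scales the plan's base limit (an assumption, not confirmed above).
function effectiveLimit(baseLimit, bucket) {
  return Math.floor(baseLimit * BUCKET_MULTIPLIERS[bucket]);
}
```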

## Handling Rate Limits

### Check Remaining Requests

```javascript
import { Vepler } from 'vepler-sdk';

const vepler = new Vepler({ apiKey: process.env.VEPLER_API_KEY });

const response = await vepler.property.get('p_0x000123456789');

// Rate limit headers arrive as strings; convert before comparing
const remaining = Number(response.headers['x-ratelimit-remaining']);
const resetAfter = Number(response.headers['x-ratelimit-reset-after']);

if (remaining < 10) {
  console.log(`Only ${remaining} requests remaining`);
  console.log(`Resets in ${resetAfter} seconds`);
}
```

### Handle 429 Errors

When rate limited, the API returns a 429 status code:

```javascript
try {
  const response = await vepler.property.get('p_0x000123456789');
} catch (error) {
  if (error.status === 429) {
    // Retry-After is sent in seconds, as a string
    const retryAfter = Number(error.headers['retry-after']);
    console.log(`Rate limited. Retry after ${retryAfter} seconds`);

    // Wait and retry
    await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
  }
}
```

## Retry Strategies

### Exponential Backoff

```javascript
async function requestWithBackoff(fn, maxRetries = 3) {
  let lastError;

  for (let i = 0; i < maxRetries; i++) {
    try {
      return await fn();
    } catch (error) {
      // Only retry rate limit errors; rethrow everything else
      if (error.status !== 429) throw error;

      lastError = error;
      // Double the delay each attempt, capped at 30 seconds
      const delay = Math.min(1000 * Math.pow(2, i), 30000);

      console.log(`Rate limited, retrying in ${delay}ms...`);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }

  throw lastError;
}

// Usage
const data = await requestWithBackoff(() =>
  vepler.property.get('p_0x000123456789')
);
```
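Adding jitter spreads retries out so that many clients rate-limited at the same moment don't all wake up together. A sketch of "full jitter" (a uniformly random delay between 0 and the backoff ceiling; the `rng` parameter is injectable for testing and is not part of the SDK):

```javascript
// Full-jitter backoff: pick a uniform delay in [0, min(base * 2^attempt, cap)).
function jitteredDelay(attempt, { baseMs = 1000, capMs = 30000, rng = Math.random } = {}) {
  const ceiling = Math.min(baseMs * Math.pow(2, attempt), capMs);
  return Math.floor(rng() * ceiling);
}
```

Drop this in place of the fixed `delay` calculation above when you have many workers retrying concurrently.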

### Adaptive Rate Control

Track the rate limit headers on every response and pause automatically as the limit approaches:

```javascript
class RateLimitedClient {
  constructor(apiKey) {
    this.client = new Vepler({ apiKey });
    this.requestsRemaining = 1000;
    this.resetTime = Date.now() + 3600000;
  }

  async request(fn) {
    // Pause until the window resets if we are nearly out of requests
    if (this.requestsRemaining < 10) {
      const waitTime = this.resetTime - Date.now();
      if (waitTime > 0) {
        console.log(`Approaching limit, waiting ${waitTime}ms`);
        await new Promise(resolve => setTimeout(resolve, waitTime));
      }
    }

    const response = await fn(this.client);

    // Update rate limit info from the response headers
    this.requestsRemaining = parseInt(
      response.headers['x-ratelimit-remaining'], 10
    );
    this.resetTime = parseInt(
      response.headers['x-ratelimit-reset'], 10
    ) * 1000;

    return response;
  }
}
```

## Batch Processing

Optimize API usage with batch requests:

```javascript
// Instead of individual requests
for (const id of propertyIds) {
  await vepler.property.get(id); // 100 requests
}

// Use the batch endpoint
const response = await vepler.property.getMultiple({
  ids: propertyIds // 1 request
});
```
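If you have more IDs than a single batch call accepts, chunking keeps each request within bounds. A sketch (the 100-ID cap is an assumed example, not a documented limit):

```javascript
// Split an array into chunks of at most `size` elements.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// e.g. fetch 250 properties in 3 batch calls instead of 250 single calls:
// for (const batch of chunk(propertyIds, 100)) {
//   await vepler.property.getMultiple({ ids: batch });
// }
```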

## Queue Management

Implement a request queue to control rate:

```javascript
class RequestQueue {
  constructor(maxPerSecond = 10) {
    this.queue = [];
    this.processing = false;
    this.delay = 1000 / maxPerSecond;
  }

  async add(fn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ fn, resolve, reject });
      if (!this.processing) this.process();
    });
  }

  async process() {
    this.processing = true;

    while (this.queue.length > 0) {
      const { fn, resolve, reject } = this.queue.shift();

      try {
        const result = await fn();
        resolve(result);
      } catch (error) {
        reject(error);
      }

      // Space out requests to stay under the per-second limit
      if (this.queue.length > 0) {
        await new Promise(r => setTimeout(r, this.delay));
      }
    }

    this.processing = false;
  }
}

// Usage
const queue = new RequestQueue(10); // 10 requests per second

const properties = await Promise.all(
  propertyIds.map(id =>
    queue.add(() => vepler.property.get(id))
  )
);
```

## Best Practices

Cache frequently accessed data to reduce API calls:

```javascript
const cache = new Map();

async function getPropertyCached(id) {
  if (cache.has(id)) {
    return cache.get(id);
  }

  const data = await vepler.property.get(id);
  cache.set(id, data);

  // Expire after 1 hour
  setTimeout(() => cache.delete(id), 3600000);

  return data;
}
```
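One refinement: caching the in-flight promise rather than the resolved value also deduplicates concurrent requests for the same ID, so two callers that miss the cache at the same moment still trigger only one API call. A sketch (the `fetcher` parameter stands in for any loader, e.g. `vepler.property.get`):

```javascript
const promiseCache = new Map();

// Cache the promise itself, with an expiry timestamp, so that
// concurrent callers for the same id share a single request.
function getCached(id, fetcher, ttlMs = 3600000) {
  const entry = promiseCache.get(id);
  if (entry && entry.expires > Date.now()) {
    return entry.promise;
  }

  const promise = fetcher(id).catch(err => {
    promiseCache.delete(id); // don't cache failures
    throw err;
  });
  promiseCache.set(id, { promise, expires: Date.now() + ttlMs });
  return promise;
}
```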
For real-time updates, use webhooks instead of polling:

```javascript
// Instead of polling
setInterval(() => {
  vepler.property.checkUpdates();
}, 60000);

// Use webhooks (configured in dashboard)
app.post('/webhook/property-updated', (req, res) => {
  const property = req.body;
  // Handle update
});
```
Prevent cascading failures with circuit breakers:

```javascript
class CircuitBreaker {
  constructor(threshold = 5, timeout = 60000) {
    this.failures = 0;
    this.threshold = threshold;
    this.timeout = timeout;
    this.state = 'CLOSED';
    this.nextAttempt = Date.now();
  }

  async execute(fn) {
    if (this.state === 'OPEN') {
      // Reject immediately until the cooldown elapses
      if (Date.now() < this.nextAttempt) {
        throw new Error('Circuit breaker is OPEN');
      }
      this.state = 'HALF_OPEN';
    }

    try {
      const result = await fn();
      // A success in HALF_OPEN closes the circuit again
      if (this.state === 'HALF_OPEN') {
        this.state = 'CLOSED';
        this.failures = 0;
      }
      return result;
    } catch (error) {
      this.failures++;
      if (this.failures >= this.threshold) {
        this.state = 'OPEN';
        this.nextAttempt = Date.now() + this.timeout;
      }
      throw error;
    }
  }
}
```
Track your API usage to avoid surprises:

```javascript
class UsageMonitor {
  constructor() {
    this.requests = [];
    this.hourlyLimit = 1000;
  }

  track(endpoint) {
    this.requests.push({
      endpoint,
      timestamp: Date.now()
    });

    // Drop requests older than the one-hour window
    const hourAgo = Date.now() - 3600000;
    this.requests = this.requests.filter(
      r => r.timestamp > hourAgo
    );

    // Warn at 80% of the hourly limit
    if (this.requests.length > this.hourlyLimit * 0.8) {
      console.warn('Approaching rate limit: 80% used');
    }
  }

  getStats() {
    const now = Date.now();
    const hourAgo = now - 3600000;

    return {
      lastHour: this.requests.filter(
        r => r.timestamp > hourAgo
      ).length,
      perEndpoint: this.requests.reduce((acc, r) => {
        acc[r.endpoint] = (acc[r.endpoint] || 0) + 1;
        return acc;
      }, {})
    };
  }
}
```

## Rate Limit Errors

When a limit is exceeded, the 429 response body describes the error:

```json
{
  "error": {
    "code": "RATE_LIMIT_EXCEEDED",
    "message": "API rate limit exceeded",
    "details": {
      "limit": 1000,
      "remaining": 0,
      "reset_at": "2024-01-01T12:00:00Z",
      "retry_after": 3600
    }
  }
}
```
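A handler can read these fields to decide how long to pause. A sketch using the field names from the payload above (the helper name is illustrative):

```javascript
// Derive a wait time (ms) from the rate limit error payload shown above.
// Prefers retry_after (seconds); falls back to the reset_at timestamp.
function waitTimeFromError(body) {
  const details = body.error && body.error.details;
  if (!details) return 0;

  if (typeof details.retry_after === 'number') {
    return details.retry_after * 1000; // seconds -> ms
  }
  return Math.max(0, Date.parse(details.reset_at) - Date.now());
}
```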

## Increase Your Limits

## Next Steps