OAuth Authorize Rate Limiting

Status: ✅ Implemented (2026-01-06)
Priority: High
Effort: 2-3 hours
Category: Security Enhancement

Overview

OAuth Authorize Rate Limiting adds distributed rate limiting to the OAuth 2.0 authorization endpoint (GET /oauth/authorize) to protect against authorization request spam, DoS attacks via redirect loops, and distributed authorization flooding attacks.

Problem Statement

Prior to this enhancement:

  • OAuth authorization endpoint had no rate limiting protection
  • Vulnerable to automated authorization request spam
  • Could be exploited via redirect loops for DoS attacks
  • No protection against distributed authorization flooding
  • Attackers could overwhelm the proxy with authorization requests
  • No metrics to detect authorization abuse patterns

From GAP_ANALYSIS.md section 8.1:

The OAuth authorization endpoint lacks rate limiting, making it vulnerable to abuse through automated authorization requests, redirect loop attacks, and distributed flooding. This could lead to service degradation and poor user experience during legitimate login flows.

Solution

Apply the existing distributed rate limiting middleware to the OAuth authorize endpoint with appropriate limits (a conceptual sketch of the resulting per-request decision follows the list):

  • Per-IP limit: 10 requests per minute
  • Global limit: 1,000 requests per minute across all IPs
  • Window: 1-minute sliding window
  • Graceful degradation: Fails open if Redis unavailable
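
The list above maps onto a simple per-request decision. The following is a conceptual sketch of that decision, not the project's actual middleware; the CounterStore interface and checkRateLimit name are illustrative:

typescript
// Conceptual sketch only: per-IP check, global check, and fail-open behavior.
interface CounterStore {
  /** Increment a counter for the current window and return its value. */
  increment(key: string, windowMs: number): Promise<number>;
}

type RateLimitDecision =
  | { allowed: true }
  | { allowed: false; status: 429 | 503; retryAfterSeconds: number };

const WINDOW_MS = 60_000; // 1-minute window
const PER_IP_MAX = 10;    // per-IP limit
const GLOBAL_MAX = 1_000; // global limit across all IPs

async function checkRateLimit(store: CounterStore, ip: string): Promise<RateLimitDecision> {
  try {
    // Check the global counter first so a distributed flood cannot bypass it.
    const globalCount = await store.increment("ratelimit:authorize:global", WINDOW_MS);
    if (globalCount > GLOBAL_MAX) {
      return { allowed: false, status: 503, retryAfterSeconds: WINDOW_MS / 1000 };
    }

    // Then check the per-IP counter.
    const ipCount = await store.increment(`ratelimit:authorize:${ip}`, WINDOW_MS);
    if (ipCount > PER_IP_MAX) {
      return { allowed: false, status: 429, retryAfterSeconds: WINDOW_MS / 1000 };
    }

    return { allowed: true };
  } catch {
    // Graceful degradation: if Redis is unreachable, fail open and allow the request.
    return { allowed: true };
  }
}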

Implementation

Rate Limiting Configuration

File: src/config/rate-limit.ts

Added OAuth authorize endpoint configuration:

typescript
/** OAuth Authorization endpoint rate limiting */
authorize: {
  /** Time window in milliseconds */
  windowMs: parseInt(process.env.OAUTH_AUTHORIZE_RATE_LIMIT_WINDOW_MS ?? "60000", 10), // 1 minute
  /** Maximum requests per window per IP */
  maxRequests: parseInt(process.env.OAUTH_AUTHORIZE_RATE_LIMIT_MAX ?? "10", 10), // 10 authorizations
  /** Global maximum across all IPs (protects against distributed attacks) */
  globalMax: parseInt(process.env.OAUTH_AUTHORIZE_GLOBAL_RATE_LIMIT_MAX ?? "1000", 10), // 1,000 authorizations
},

Middleware Application

File: src/routes/oauth-authorize.ts

Applied rate limiter at router level:

typescript
import { Router } from "express";
import { createDistributedRateLimiter } from "../middleware/distributed-rate-limit.js";
import { config } from "../config/index.js"; // config import path assumed

export const oauthAuthorizeRouter = Router();

// Apply distributed rate limiting to OAuth authorize endpoint
if (config.rateLimit.enabled) {
  oauthAuthorizeRouter.use(
    createDistributedRateLimiter({
      windowMs: config.rateLimit.authorize.windowMs,
      maxRequests: config.rateLimit.authorize.maxRequests,
      globalMax: config.rateLimit.authorize.globalMax,
      keyPrefix: "ratelimit:authorize:",
      endpointType: "authorize",
    }),
  );
}

Key Design Decisions:

  1. Router-level middleware: Applied to entire router, protecting all routes
  2. Conditional application: Only enabled when config.rateLimit.enabled is true
  3. Consistent pattern: Follows same pattern as DCR endpoint rate limiting
  4. Unique key prefix: Uses "ratelimit:authorize:" for Redis keys
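
For context, a minimal sketch of how the rate-limited router might be wired into the Express app; the mount path and app wiring are assumptions, not taken from the source:

typescript
// Hedged sketch: wiring the rate-limited router into the Express app.
// The mount path and file layout are assumptions.
import express from "express";
import { oauthAuthorizeRouter } from "./routes/oauth-authorize.js";

const app = express();

// Every request to /oauth/authorize passes through the router-level
// rate limiter before reaching the authorization handler.
app.use("/oauth/authorize", oauthAuthorizeRouter);

app.listen(8080);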

Test Coverage

File: src/routes/oauth-authorize.test.ts

Added comprehensive rate limiting tests:

1. Per-IP Rate Limit Test:

typescript
it("should return 429 when per-IP rate limit exceeded", async () => {
  // ... test makes 11 requests (exceeding 10 req/min limit)
  expect(response.status).toBe(429);
  expect(response.body).toEqual({
    error: "too_many_requests",
    error_description: "Rate limit exceeded. Please try again later.",
  });
  expect(response.headers["retry-after"]).toBeDefined();
});

2. Global Rate Limit Test:

typescript
it("should return 503 when global rate limit exceeded", async () => {
  // Mock Redis to return count at global limit
  expect(response.status).toBe(503);
  expect(response.body).toEqual({
    error: "service_unavailable",
    error_description: "Service temporarily unavailable due to high load.",
  });
});

Test Infrastructure (see the sketch after this list):

  • Redis client mocking with configurable responses
  • Metrics mocking to prevent side effects
  • Dynamic module imports for testing rate limiting scenarios
  • Config mocking to enable/disable rate limiting per test
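
A minimal Vitest sketch of this test infrastructure is shown below. Module paths, export names, and the Redis calls the middleware makes are assumptions, not the project's actual test code:

typescript
// Illustrative Vitest sketch of the mocking setup described above.
import { beforeEach, describe, expect, it, vi } from "vitest";
import request from "supertest";

// vi.hoisted makes the mock function available inside the hoisted vi.mock factories.
const { incrMock } = vi.hoisted(() => ({ incrMock: vi.fn() }));

// Redis client mock with a configurable counter response (client shape assumed).
vi.mock("../services/redis.js", () => ({
  getRedisClient: () => ({ incr: incrMock, pexpire: vi.fn() }),
}));

// Metrics mock to prevent side effects on the Prometheus registry.
vi.mock("../services/metrics.js", () => ({
  rateLimitCounter: { inc: vi.fn() },
}));

// Config mock to force rate limiting on for these tests (path assumed).
vi.mock("../config/index.js", () => ({
  config: {
    rateLimit: {
      enabled: true,
      authorize: { windowMs: 60_000, maxRequests: 10, globalMax: 1_000 },
    },
  },
}));

describe("OAuth authorize rate limiting (sketch)", () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it("returns 429 once the per-IP count exceeds the limit", async () => {
    incrMock.mockResolvedValue(11); // simulate the 11th request in the window

    // Dynamic imports so the router is built against the mocked modules.
    const express = (await import("express")).default;
    const { oauthAuthorizeRouter } = await import("./oauth-authorize.js");
    const app = express();
    app.use("/oauth/authorize", oauthAuthorizeRouter);

    const response = await request(app).get("/oauth/authorize?client_id=test");
    expect(response.status).toBe(429);
  });
});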

Benefits

Security

  1. DoS Protection: Prevents attackers from overwhelming the authorization endpoint
  2. Redirect Loop Mitigation: Rate limits prevent infinite redirect attacks
  3. Distributed Attack Defense: Global limit protects against coordinated attacks from multiple IPs
  4. Resource Protection: Prevents authorization request spam from consuming server resources

Operations

  1. Abuse Detection: Metrics track authorization request patterns and rejections
  2. Service Stability: Ensures legitimate users can still reach the authorization endpoint during attack traffic
  3. Performance: Redis-backed rate limiting scales horizontally
  4. Observability: Structured logging and Prometheus metrics for monitoring

User Experience

  1. Fair Access: Prevents single users from monopolizing authorization capacity
  2. Quick Recovery: 1-minute window allows rapid recovery from temporary blocks
  3. Clear Feedback: 429 responses include Retry-After header
  4. Minimal Impact: Limits are generous for legitimate use (10 req/min per IP)

Configuration

Environment Variables

bash
# OAuth authorize endpoint rate limiting
OAUTH_AUTHORIZE_RATE_LIMIT_WINDOW_MS=60000   # 1 minute window
OAUTH_AUTHORIZE_RATE_LIMIT_MAX=10            # 10 requests per IP per minute
OAUTH_AUTHORIZE_GLOBAL_RATE_LIMIT_MAX=1000   # 1,000 requests globally per minute

# Global rate limiting toggle
RATE_LIMIT_ENABLED=true                      # Enable/disable all rate limiting

# Redis (required for rate limiting)
REDIS_URL=redis://redis:6379

Default Values

If environment variables are not set, the following defaults apply:

  • Window: 60,000ms (1 minute)
  • Per-IP limit: 10 requests
  • Global limit: 1,000 requests

Rationale for Limits

Per-IP Limit (10 req/min):

  • OAuth authorization is typically initiated once per login flow
  • Allows for retries due to user errors or network issues
  • Prevents automated spam from single sources
  • Conservative enough to block abuse, generous enough for legitimate use

Global Limit (1,000 req/min):

  • Protects against distributed attacks across many IPs
  • Allows ~16.7 authorizations per second globally
  • Scaled for moderate production load
  • Prevents service degradation during attacks

Window Duration (1 minute):

  • Short enough for quick recovery from temporary blocks
  • Long enough to prevent rapid retry attacks
  • Matches user-facing operation patterns
  • Consistent with MCP endpoint rate limiting

Response Formats

Success (Under Limit)

http
HTTP/1.1 302 Found
Location: https://idp.example.com/oauth/authorize?client_id=...
X-RateLimit-Limit: 10
X-RateLimit-Remaining: 7
X-RateLimit-Reset: 1704643260

Per-IP Limit Exceeded

http
HTTP/1.1 429 Too Many Requests
X-RateLimit-Limit: 10
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1704643260
Retry-After: 45
Content-Type: application/json

{
  "error": "too_many_requests",
  "error_description": "Rate limit exceeded. Please try again later."
}

Global Limit Exceeded

http
HTTP/1.1 503 Service Unavailable
Retry-After: 45
Content-Type: application/json

{
  "error": "service_unavailable",
  "error_description": "Service temporarily unavailable due to high load."
}
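
For illustration, a hedged sketch of how middleware could translate these two outcomes into the responses shown above, using standard Express response helpers; function and parameter names are illustrative:

typescript
// Illustrative helpers only; the real middleware's internals may differ.
import type { Response } from "express";

function sendPerIpLimitExceeded(res: Response, limit: number, resetEpochSeconds: number): void {
  const retryAfter = Math.max(1, resetEpochSeconds - Math.floor(Date.now() / 1000));
  res
    .status(429)
    .set({
      "X-RateLimit-Limit": String(limit),
      "X-RateLimit-Remaining": "0",
      "X-RateLimit-Reset": String(resetEpochSeconds),
      "Retry-After": String(retryAfter),
    })
    .json({
      error: "too_many_requests",
      error_description: "Rate limit exceeded. Please try again later.",
    });
}

function sendGlobalLimitExceeded(res: Response, retryAfterSeconds: number): void {
  res
    .status(503)
    .set("Retry-After", String(retryAfterSeconds))
    .json({
      error: "service_unavailable",
      error_description: "Service temporarily unavailable due to high load.",
    });
}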

Observability

Prometheus Metrics

The rate limiting middleware emits metrics tracked by endpoint type:

promql
# OAuth authorize rejection rate
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="true"}[5m])

# OAuth authorize success rate
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="false"}[5m])

# OAuth authorize total load
sum(rate(http_request_rate_limit_requests_total{endpoint="authorize"}[5m]))

# OAuth authorize abuse detection (alert if > 0.1/sec)
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="true"}[5m]) > 0.1
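
For reference, a sketch of the kind of counter behind these queries, assuming the project uses prom-client; the metric and label names mirror the PromQL above, but the exact registration code may differ:

typescript
// Assumes prom-client; metric and label names mirror the queries above.
import { Counter } from "prom-client";

export const rateLimitRequestsTotal = new Counter({
  name: "http_request_rate_limit_requests_total",
  help: "Requests evaluated by the distributed rate limiter",
  labelNames: ["endpoint", "limited"],
});

// Inside the middleware, after the decision:
rateLimitRequestsTotal.inc({ endpoint: "authorize", limited: "false" }); // request allowed
rateLimitRequestsTotal.inc({ endpoint: "authorize", limited: "true" });  // request blocked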

Structured Logging

Request Allowed:

json
{
  "level": "debug",
  "message": "Rate limit check passed",
  "endpoint": "authorize",
  "ip": "192.168.1.100",
  "count": 3,
  "limit": 10
}

Request Blocked:

json
{
  "level": "warn",
  "message": "Rate limit exceeded",
  "endpoint": "authorize",
  "ip": "192.168.1.100",
  "count": 11,
  "limit": 10,
  "retryAfter": 60
}
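
These entries could be emitted with any structured logger. A brief sketch assuming a pino-style logger (the project's actual logger may differ):

typescript
// Assumes a pino-style logger; field names match the log entries above.
import pino from "pino";

const logger = pino({ level: "debug" });

// Request allowed
logger.debug(
  { endpoint: "authorize", ip: "192.168.1.100", count: 3, limit: 10 },
  "Rate limit check passed",
);

// Request blocked
logger.warn(
  { endpoint: "authorize", ip: "192.168.1.100", count: 11, limit: 10, retryAfter: 60 },
  "Rate limit exceeded",
);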

Testing

Manual Testing

Test per-IP rate limiting:

bash
# Make 11 requests rapidly
for i in {1..11}; do
  curl -i "http://localhost:8080/oauth/authorize?client_id=test&redirect_uri=http://localhost:3000/callback"
  echo "Request $i"
done

# Expected: First 10 succeed (302), 11th fails (429)

Test with different IPs:

bash
# Each IP gets independent limit
for ip in "192.168.1.1" "192.168.1.2" "192.168.1.3"; do
  curl -i -H "X-Forwarded-For: $ip" \
    "http://localhost:8080/oauth/authorize?client_id=test&redirect_uri=http://localhost:3000/callback"
done

Automated Testing

Run rate limiting tests:

bash
# Run all OAuth authorize tests
npx vitest run src/routes/oauth-authorize.test.ts

# Run specific rate limiting tests
npx vitest run -t "rate limit"

Architecture Integration

  1. Rate Limiting Middleware (src/middleware/distributed-rate-limit.ts)

    • Shared rate limiting implementation
    • Redis-backed sliding window algorithm (sketched after this list)
    • Consistent behavior across endpoints
  2. Redis Service (src/services/redis.ts)

    • Provides Redis client for rate limiting
    • Handles connection management
    • Implements graceful degradation
  3. Metrics Service (src/services/metrics.ts)

    • Tracks rate limiting hits and usage
    • Exports Prometheus metrics
    • Enables monitoring and alerting
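
A minimal sketch of the Redis-backed sliding-window count named in item 1, assuming an ioredis client; the real middleware's algorithm and key layout may differ:

typescript
// Illustrative sliding-window count using Redis sorted sets via ioredis.
// Key layout and client library are assumptions.
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

/** Record this request under `key` and return how many requests fall inside the window. */
async function slidingWindowCount(key: string, windowMs: number): Promise<number> {
  const now = Date.now();
  const windowStart = now - windowMs;
  const member = `${now}-${Math.random()}`; // unique member per request

  // Drop entries that fell out of the window, add this request, count, refresh TTL.
  const results = await redis
    .multi()
    .zremrangebyscore(key, 0, windowStart)
    .zadd(key, now, member)
    .zcard(key)
    .pexpire(key, windowMs)
    .exec();

  // zcard is the third command in the pipeline; ioredis returns [error, value] pairs.
  const count = results?.[2]?.[1];
  return typeof count === "number" ? count : 0;
}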

Comparison with Other Endpoints

Endpoint          Window   Per-IP Limit   Global Limit   Rationale
OAuth Authorize   1 min    10             1,000          User-facing, login flow, DoS protection
MCP               1 min    100            10,000         High-value protocol, frequent operations
DCR               1 hour   10             1,000          Infrequent operation, strict abuse prevention

Limitations

  1. No Exponential Backoff: Clients are not forced to increase retry intervals after repeated rejections
  2. IP-Based Only: Does not consider authenticated user identity (endpoint is public)
  3. Static Limits: Limits are not dynamically adjusted based on system load
  4. No Client Prioritization: All clients treated equally, no premium/trusted client tiers

Future Enhancements

  1. Dynamic Limits: Adjust limits based on system load and Redis capacity
  2. Client Allowlisting: Bypass rate limiting for trusted client IPs
  3. Enhanced Metrics: Add percentile tracking for request distribution
  4. Rate Limit Headers: Add standard rate limit headers per draft-ietf-httpapi-ratelimit-headers
  5. Exponential Backoff: Implement increasing penalties for repeated violations

Implementation Date

Completed: 2026-01-06
Actual Effort: 2.5 hours
Test Coverage: 2 rate limiting tests (per-IP and global limits)
Validation: All tests passing (735/736, 1 skipped)

References

  • src/routes/oauth-authorize.ts - OAuth Authorize Route Implementation
  • src/config/rate-limit.ts - Rate Limiting Configuration
  • src/middleware/distributed-rate-limit.ts - Distributed Rate Limiter Middleware
  • RFC 6749 - OAuth 2.0 Authorization Framework
