OAuth Authorize Rate Limiting
- Status: ✅ Implemented (2026-01-06)
- Priority: High
- Effort: 2-3 hours
- Category: Security Enhancement
Overview
OAuth Authorize Rate Limiting adds distributed rate limiting to the OAuth 2.0 authorization endpoint (GET /oauth/authorize) to protect against authorization request spam, DoS attacks via redirect loops, and distributed authorization flooding attacks.
Problem Statement
Prior to this enhancement:
- OAuth authorization endpoint had no rate limiting protection
- Vulnerable to automated authorization request spam
- Could be exploited via redirect loops for DoS attacks
- No protection against distributed authorization flooding
- Attackers could overwhelm the proxy with authorization requests
- No metrics to detect authorization abuse patterns
From GAP_ANALYSIS.md section 8.1:
> The OAuth authorization endpoint lacks rate limiting, making it vulnerable to abuse through automated authorization requests, redirect loop attacks, and distributed flooding. This could lead to service degradation and poor user experience during legitimate login flows.
Solution
Apply the existing distributed rate limiting middleware to the OAuth authorize endpoint with appropriate limits:
- Per-IP limit: 10 requests per minute
- Global limit: 1,000 requests per minute across all IPs
- Window: 1-minute sliding window
- Graceful degradation: Fails open if Redis is unavailable (sketched below)
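The middleware implementation is not reproduced in this document, so the following is a minimal sketch of what a per-key sliding-window check with fail-open behavior can look like. The `RedisLike` interface and `checkLimit` helper are illustrative assumptions, not the actual exports of `src/middleware/distributed-rate-limit.ts`.

```typescript
// Minimal Redis surface assumed for this sketch; the real client/service may differ.
interface RedisLike {
  zremrangebyscore(key: string, min: number, max: number): Promise<number>;
  zadd(key: string, score: number, member: string): Promise<number>;
  zcard(key: string): Promise<number>;
  pexpire(key: string, ms: number): Promise<number>;
}

interface LimitResult {
  allowed: boolean;
  remaining: number;
}

/** Illustrative sliding-window check for one key (per-IP or global). */
async function checkLimit(
  redis: RedisLike,
  key: string,         // e.g. "ratelimit:authorize:<client-ip>"
  maxRequests: number, // e.g. 10 for the per-IP authorize limit
  windowMs: number,    // e.g. 60000
): Promise<LimitResult> {
  try {
    const now = Date.now();
    await redis.zremrangebyscore(key, 0, now - windowMs);  // drop entries outside the window
    await redis.zadd(key, now, `${now}:${Math.random()}`); // record this request
    const count = await redis.zcard(key);                  // requests seen within the window
    await redis.pexpire(key, windowMs);                    // let idle keys expire
    return { allowed: count <= maxRequests, remaining: Math.max(0, maxRequests - count) };
  } catch {
    // Graceful degradation: if Redis is unreachable, fail open and allow the request.
    return { allowed: true, remaining: maxRequests };
  }
}
```

A production implementation would typically batch these commands in a MULTI/pipeline so the trim, insert, and count are atomic; the point of the sketch is the shape of the window check and the fail-open catch branch.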
Implementation
Rate Limiting Configuration
File: src/config/rate-limit.ts
Added OAuth authorize endpoint configuration:
```typescript
/** OAuth Authorization endpoint rate limiting */
authorize: {
  /** Time window in milliseconds */
  windowMs: parseInt(process.env.OAUTH_AUTHORIZE_RATE_LIMIT_WINDOW_MS ?? "60000", 10), // 1 minute
  /** Maximum requests per window per IP */
  maxRequests: parseInt(process.env.OAUTH_AUTHORIZE_RATE_LIMIT_MAX ?? "10", 10), // 10 authorizations
  /** Global maximum across all IPs (protects against distributed attacks) */
  globalMax: parseInt(process.env.OAUTH_AUTHORIZE_GLOBAL_RATE_LIMIT_MAX ?? "1000", 10), // 1,000 authorizations
},
```
Middleware Application
File: src/routes/oauth-authorize.ts
Applied rate limiter at router level:
```typescript
import { Router } from "express";

import { createDistributedRateLimiter } from "../middleware/distributed-rate-limit.js";

export const oauthAuthorizeRouter = Router();

// Apply distributed rate limiting to the OAuth authorize endpoint.
// `config` is the application configuration, which includes the rate-limit settings shown above.
if (config.rateLimit.enabled) {
  oauthAuthorizeRouter.use(
    createDistributedRateLimiter({
      windowMs: config.rateLimit.authorize.windowMs,
      maxRequests: config.rateLimit.authorize.maxRequests,
      globalMax: config.rateLimit.authorize.globalMax,
      keyPrefix: "ratelimit:authorize:",
      endpointType: "authorize",
    }),
  );
}
```
Key Design Decisions:
- Router-level middleware: Applied to the entire router, protecting all routes it serves
- Conditional application: Only enabled when `config.rateLimit.enabled` is true
- Consistent pattern: Follows the same pattern as the DCR endpoint rate limiting
- Unique key prefix: Uses `"ratelimit:authorize:"` for Redis keys (see the key-layout sketch below)
Test Coverage
File: src/routes/oauth-authorize.test.ts
Added comprehensive rate limiting tests:
1. Per-IP Rate Limit Test:
it("should return 429 when per-IP rate limit exceeded", async () => {
// ... test makes 11 requests (exceeding 10 req/min limit)
expect(response.status).toBe(429);
expect(response.body).toEqual({
error: "too_many_requests",
error_description: "Rate limit exceeded. Please try again later.",
});
expect(response.headers["retry-after"]).toBeDefined();
});2
3
4
5
6
7
8
9
2. Global Rate Limit Test:
it("should return 503 when global rate limit exceeded", async () => {
// Mock Redis to return count at global limit
expect(response.status).toBe(503);
expect(response.body).toEqual({
error: "service_unavailable",
error_description: "Service temporarily unavailable due to high load.",
});
});2
3
4
5
6
7
8
Test Infrastructure:
- Redis client mocking with configurable responses
- Metrics mocking to prevent side effects
- Dynamic module imports for testing rate limiting scenarios
- Config mocking to enable/disable rate limiting per test
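A condensed sketch of that setup, assuming vitest and illustrative mock shapes and paths for the Redis, metrics, and config modules (the real module exports may differ):

```typescript
import { describe, expect, it, vi } from "vitest";

// vi.mock calls are hoisted, so shared mock objects must be created via vi.hoisted.
const redisMock = vi.hoisted(() => ({
  zremrangebyscore: vi.fn().mockResolvedValue(0),
  zadd: vi.fn().mockResolvedValue(1),
  zcard: vi.fn().mockResolvedValue(3),
  pexpire: vi.fn().mockResolvedValue(1),
}));

// Illustrative module shapes and paths; the actual redis/metrics/config exports may differ.
vi.mock("../services/redis.js", () => ({ getRedisClient: () => redisMock }));
vi.mock("../services/metrics.js", () => ({ recordRateLimitCheck: vi.fn() }));
vi.mock("../config/index.js", () => ({
  config: {
    rateLimit: {
      enabled: true,
      authorize: { windowMs: 60000, maxRequests: 10, globalMax: 1000 },
    },
  },
}));

describe("OAuth authorize rate limiting (setup sketch)", () => {
  it("rebuilds the router against the mocked config and Redis client", async () => {
    vi.resetModules();
    redisMock.zcard.mockResolvedValueOnce(11); // simulate the 11th request in the window
    const { oauthAuthorizeRouter } = await import("./oauth-authorize.js");
    expect(oauthAuthorizeRouter).toBeDefined();
    // ...mount the router on an Express app and assert 429 / Retry-After with supertest
  });
});
```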
Benefits
Security
- DoS Protection: Prevents attackers from overwhelming the authorization endpoint
- Redirect Loop Mitigation: Rate limits prevent infinite redirect attacks
- Distributed Attack Defense: Global limit protects against coordinated attacks from multiple IPs
- Resource Protection: Prevents authorization request spam from consuming server resources
Operations
- Abuse Detection: Metrics track authorization request patterns and rejections
- Service Stability: Ensures legitimate users can still reach the endpoint during attacks
- Performance: Redis-backed rate limiting scales horizontally
- Observability: Structured logging and Prometheus metrics for monitoring
User Experience
- Fair Access: Prevents single users from monopolizing authorization capacity
- Quick Recovery: 1-minute window allows rapid recovery from temporary blocks
- Clear Feedback: 429 responses include a `Retry-After` header
- Minimal Impact: Limits are generous for legitimate use (10 req/min per IP)
Configuration
Environment Variables
```bash
# OAuth authorize endpoint rate limiting
OAUTH_AUTHORIZE_RATE_LIMIT_WINDOW_MS=60000   # 1 minute window
OAUTH_AUTHORIZE_RATE_LIMIT_MAX=10            # 10 requests per IP per minute
OAUTH_AUTHORIZE_GLOBAL_RATE_LIMIT_MAX=1000   # 1,000 requests globally per minute

# Global rate limiting toggle
RATE_LIMIT_ENABLED=true                      # Enable/disable all rate limiting

# Redis (required for rate limiting)
REDIS_URL=redis://redis:6379
```
Default Values
If environment variables are not set, the following defaults apply:
- Window: 60,000ms (1 minute)
- Per-IP limit: 10 requests
- Global limit: 1,000 requests
Rationale for Limits
Per-IP Limit (10 req/min):
- OAuth authorization is typically initiated once per login flow
- Allows for retries due to user errors or network issues
- Prevents automated spam from single sources
- Conservative enough to block abuse, generous enough for legitimate use
Global Limit (1,000 req/min):
- Protects against distributed attacks across many IPs
- Allows ~16.7 authorizations per second globally
- Scaled for moderate production load
- Prevents service degradation during attacks
Window Duration (1 minute):
- Short enough for quick recovery from temporary blocks
- Long enough to prevent rapid retry attacks
- Matches user-facing operation patterns
- Consistent with MCP endpoint rate limiting
Response Formats
Success (Under Limit)
```http
HTTP/1.1 302 Found
Location: https://idp.example.com/oauth/authorize?client_id=...
X-RateLimit-Limit: 10
X-RateLimit-Remaining: 7
X-RateLimit-Reset: 1704643260
```
Per-IP Limit Exceeded
```http
HTTP/1.1 429 Too Many Requests
X-RateLimit-Limit: 10
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1704643260
Retry-After: 45
Content-Type: application/json

{
  "error": "too_many_requests",
  "error_description": "Rate limit exceeded. Please try again later."
}
```
Global Limit Exceeded
```http
HTTP/1.1 503 Service Unavailable
Retry-After: 45
Content-Type: application/json

{
  "error": "service_unavailable",
  "error_description": "Service temporarily unavailable due to high load."
}
```
Observability
Prometheus Metrics
The rate limiting middleware emits metrics tracked by endpoint type:
```promql
# OAuth authorize rejection rate
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="true"}[5m])

# OAuth authorize success rate
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="false"}[5m])

# OAuth authorize total load
sum(rate(http_request_rate_limit_requests_total{endpoint="authorize"}[5m]))

# OAuth authorize abuse detection (alert if > 0.1/sec)
rate(http_request_rate_limit_requests_total{endpoint="authorize",limited="true"}[5m]) > 0.1
```
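The queries above assume a counter named `http_request_rate_limit_requests_total` with `endpoint` and `limited` labels. An illustrative prom-client definition is sketched below; the actual registration lives in `src/services/metrics.ts` and may differ.

```typescript
import { Counter } from "prom-client";

// Illustrative counter matching the metric name and labels used in the queries above.
export const rateLimitRequests = new Counter({
  name: "http_request_rate_limit_requests_total",
  help: "Rate limit checks by endpoint and outcome",
  labelNames: ["endpoint", "limited"],
});

// Incremented on every check by the rate limiting middleware, e.g.:
rateLimitRequests.inc({ endpoint: "authorize", limited: "false" }); // allowed request
rateLimitRequests.inc({ endpoint: "authorize", limited: "true" });  // rejected request
```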
Structured Logging
Request Allowed:
```json
{
  "level": "debug",
  "message": "Rate limit check passed",
  "endpoint": "authorize",
  "ip": "192.168.1.100",
  "count": 3,
  "limit": 10
}
```
Request Blocked:
```json
{
  "level": "warn",
  "message": "Rate limit exceeded",
  "endpoint": "authorize",
  "ip": "192.168.1.100",
  "count": 11,
  "limit": 10,
  "retryAfter": 60
}
```
Testing
Manual Testing
Test per-IP rate limiting:
```bash
# Make 11 requests rapidly
for i in {1..11}; do
  curl -i "http://localhost:8080/oauth/authorize?client_id=test&redirect_uri=http://localhost:3000/callback"
  echo "Request $i"
done

# Expected: First 10 succeed (302), 11th fails (429)
```
Test with different IPs:
```bash
# Each IP gets an independent limit
for ip in "192.168.1.1" "192.168.1.2" "192.168.1.3"; do
  curl -i -H "X-Forwarded-For: $ip" \
    "http://localhost:8080/oauth/authorize?client_id=test&redirect_uri=http://localhost:3000/callback"
done
```
Automated Testing
Run rate limiting tests:
```bash
# Run all OAuth authorize tests
npx vitest run src/routes/oauth-authorize.test.ts

# Run specific rate limiting tests
npx vitest run -t "rate limit"
```
Architecture Integration
Related Systems
- Rate Limiting Middleware (`src/middleware/distributed-rate-limit.ts`)
  - Shared rate limiting implementation
  - Redis-backed sliding window algorithm
  - Consistent behavior across endpoints
- Redis Service (`src/services/redis.ts`)
  - Provides the Redis client for rate limiting
  - Handles connection management
  - Implements graceful degradation
- Metrics Service (`src/services/metrics.ts`)
  - Tracks rate limiting hits and usage
  - Exports Prometheus metrics
  - Enables monitoring and alerting
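As a rough illustration of how these three pieces interact for the authorize endpoint, the skeleton below wires a limiter from hypothetical service shapes. `getRedisClient`, `recordRateLimitCheck`, and the fixed-window counting are assumptions made for brevity; the real middleware uses the sliding-window algorithm described above and also enforces the global limit (returning 503 when it is exceeded).

```typescript
import type { NextFunction, Request, RequestHandler, Response } from "express";

// Hypothetical dependency shapes; the real service exports may differ.
interface RateLimitDeps {
  // src/services/redis.ts: returns null when Redis is unavailable (graceful degradation)
  getRedisClient(): {
    incr(key: string): Promise<number>;
    pexpire(key: string, ms: number): Promise<number>;
  } | null;
  // src/services/metrics.ts: increments the endpoint-labelled Prometheus counter
  recordRateLimitCheck(endpoint: string, limited: boolean): void;
}

export function createSketchRateLimiter(
  deps: RateLimitDeps,
  opts: { windowMs: number; maxRequests: number; keyPrefix: string; endpointType: string },
): RequestHandler {
  return async (req: Request, res: Response, next: NextFunction) => {
    const redis = deps.getRedisClient();
    if (!redis) {
      next(); // fail open when Redis is down
      return;
    }
    const key = `${opts.keyPrefix}${req.ip ?? "unknown"}`;
    const count = await redis.incr(key);
    if (count === 1) {
      await redis.pexpire(key, opts.windowMs); // start the window on the first hit
    }
    const limited = count > opts.maxRequests;
    deps.recordRateLimitCheck(opts.endpointType, limited);
    res.setHeader("X-RateLimit-Limit", opts.maxRequests);
    res.setHeader("X-RateLimit-Remaining", Math.max(0, opts.maxRequests - count));
    if (limited) {
      res.setHeader("Retry-After", Math.ceil(opts.windowMs / 1000));
      res.status(429).json({
        error: "too_many_requests",
        error_description: "Rate limit exceeded. Please try again later.",
      });
      return;
    }
    next();
  };
}
```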
Comparison with Other Endpoints
| Endpoint | Window | Per-IP Limit | Global Limit | Rationale |
|---|---|---|---|---|
| OAuth Authorize | 1 min | 10 | 1,000 | User-facing, login flow, DoS protection |
| MCP | 1 min | 100 | 10,000 | High-value protocol, frequent operations |
| DCR | 1 hour | 10 | 1,000 | Infrequent operation, strict abuse prevention |
Limitations
- No Exponential Backoff: Clients are not forced to increase retry intervals after repeated rejections
- IP-Based Only: Does not consider authenticated user identity (endpoint is public)
- Static Limits: Limits are not dynamically adjusted based on system load
- No Client Prioritization: All clients treated equally, no premium/trusted client tiers
Future Enhancements
- Dynamic Limits: Adjust limits based on system load and Redis capacity
- Client Allowlisting: Bypass rate limiting for trusted client IPs (a sketch follows this list)
- Enhanced Metrics: Add percentile tracking for request distribution
- Rate Limit Headers: Add standard rate limit headers per draft-ietf-httpapi-ratelimit-headers
- Exponential Backoff: Implement increasing penalties for repeated violations
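For example, client allowlisting (not implemented today) could be layered on without touching the limiter itself; a hypothetical wrapper with invented names:

```typescript
import type { RequestHandler } from "express";

// Hypothetical wrapper: trusted IPs bypass the limiter, everyone else goes through it.
function withIpAllowlist(allowlist: Set<string>, limiter: RequestHandler): RequestHandler {
  return (req, res, next) => {
    if (req.ip && allowlist.has(req.ip)) {
      next(); // trusted client IPs skip rate limiting entirely
      return;
    }
    limiter(req, res, next);
  };
}
```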
Related Work
- Gap Analysis: Section 8.1 "OAuth Authorize Rate Limiting"
- Rate Limiting Architecture: rate-limiting.md
- OAuth Architecture: oauth.md
- Related Enhancements:
  - DCR Rate Limiting - Similar pattern
  - MCP Rate Limiting - Protocol protection
Implementation Date
- Completed: 2026-01-06
- Actual Effort: 2.5 hours
- Test Coverage: 2 rate limiting tests (per-IP and global limits)
- Validation: All tests passing (735/736, 1 skipped)
References
- `src/routes/oauth-authorize.ts` - OAuth Authorize Route Implementation
- `src/config/rate-limit.ts` - Rate Limiting Configuration
- `src/middleware/distributed-rate-limit.ts` - Distributed Rate Limiter Middleware
- RFC 6749 - OAuth 2.0 Authorization Framework