API Gateway vs Load Balancer vs Reverse Proxy: What's the Difference?
TL;DR: These three terms are often used interchangeably because they overlap. A Reverse Proxy hides the server. A Load Balancer distributes traffic. An API Gateway manages APIs (Auth, Rate Limiting). Think of them as a hierarchy: an API Gateway is a Load Balancer, which is a Reverse Proxy, but with superpowers.
1. The "No-Jargon" Analogy
Imagine a High-End Hotel.
- Reverse Proxy (The Bodyguard): You want to talk to the CEO staying in the penthouse. You don't go to his room. You talk to the Bodyguard in the lobby. He takes your message up. You never see the CEO's room number.
- Load Balancer (The Reception Desk): There are 5 check-in counters. The manager stands at the front and points you to "Counter 3" because it's empty. He ensures no single clerk is overwhelmed.
- API Gateway (The Concierge): You walk in. He checks your ID (Auth). He sees you are a VIP (Rate Limiting). He translates your request ("I want food") into a specific order for the kitchen (Routing & Transformation).
2. Level 1: The Reverse Proxy
This is the foundational concept.
- Definition: A server that sits in front of one or more web servers, intercepting requests from clients.
- Key Job: Anonymity & Security. The client never knows the IP address of the actual application server.
- Features:
- SSL Termination: Handles HTTPS encryption/decryption so the app server doesn't have to.
- Caching: Stores static files (images, CSS) to serve them faster.
- Compression: Gzips responses.
- Example Tool: Nginx, Apache.
```mermaid
graph LR
    Client -->|Request| Proxy[Reverse Proxy]
    Proxy -->|Forward| Server["App Server (Hidden IP)"]
    Server -->|Response| Proxy
    Proxy -->|Response| Client
```
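To make the idea concrete, here is a toy reverse proxy in pure Python: the client talks only to the proxy, which forwards the request to a "hidden" backend. This is a minimal sketch, not production code; both servers run in-process and the OS picks the ports (port 0), so the addresses are illustrative.

```python
# Toy reverse proxy: the client never learns the backend's address.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class AppHandler(BaseHTTPRequestHandler):
    """The hidden application server."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello from the hidden app server")
    def log_message(self, *args):  # silence request logging
        pass

def make_proxy_handler(backend_url):
    class ProxyHandler(BaseHTTPRequestHandler):
        """Forwards every GET to the backend and relays the response."""
        def do_GET(self):
            with urllib.request.urlopen(backend_url + self.path) as resp:
                body = resp.read()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body)
        def log_message(self, *args):
            pass
    return ProxyHandler

# Start the hidden backend, then the proxy pointing at it.
app = HTTPServer(("127.0.0.1", 0), AppHandler)
backend_url = f"http://127.0.0.1:{app.server_address[1]}"
proxy = HTTPServer(("127.0.0.1", 0), make_proxy_handler(backend_url))
for srv in (app, proxy):
    threading.Thread(target=srv.serve_forever, daemon=True).start()

# The client only ever uses the proxy's address.
with urllib.request.urlopen(f"http://127.0.0.1:{proxy.server_address[1]}/") as r:
    print(r.read().decode())  # hello from the hidden app server
```

A real proxy (Nginx) would also terminate SSL, cache, and compress here; this sketch only shows the forwarding.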
3. Level 2: The Load Balancer
A Load Balancer is a specific type of Reverse Proxy focused on Availability and Scale.
- Definition: A device that distributes network traffic across a cluster of servers.
- Key Job: Distribution. Ensures no single server crashes under load.
- Features:
- Health Checks: Pings servers ("Are you alive?"). If Server A is dead, it stops sending traffic there.
- Algorithms:
- Round Robin: 1, 2, 3, 1, 2, 3...
- Least Connections: Send to the server with the fewest active users.
- L4 vs L7:
- L4 (Transport): Routes based on IP/Port (Fast, dumb).
- L7 (Application): Routes based on URL/Headers (Smarter, slower).
- Example Tool: AWS ALB, HAProxy.
```mermaid
graph LR
    Client --> LB[Load Balancer]
    LB --> Server1
    LB -.-x Server2["Server2 (Dead: Health Check Failed)"]
    LB --> Server3
    style Server2 fill:#ffcccc,stroke:#333,stroke-width:2px,stroke-dasharray: 5 5
```
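The two algorithms and the health-check behavior above fit in a few lines. This is a sketch with hypothetical server names; a real load balancer tracks connections and health asynchronously.

```python
# Round Robin and Least Connections, with dead servers skipped.
import itertools

class LoadBalancer:
    def __init__(self, servers):
        self.servers = list(servers)
        self.healthy = set(self.servers)            # updated by health checks
        self.active = {s: 0 for s in self.servers}  # open connections per server
        self._rr = itertools.cycle(self.servers)

    def mark_down(self, server):
        """A failed health check removes the server from rotation."""
        self.healthy.discard(server)

    def round_robin(self):
        # 1, 2, 3, 1, 2, 3... skipping servers that failed their health check.
        for _ in range(len(self.servers)):
            s = next(self._rr)
            if s in self.healthy:
                return s
        raise RuntimeError("no healthy servers")

    def least_connections(self):
        # Pick the healthy server with the fewest active connections.
        return min(self.healthy, key=lambda s: self.active[s])

lb = LoadBalancer(["server1", "server2", "server3"])
lb.mark_down("server2")                      # health check failed
print([lb.round_robin() for _ in range(4)])  # ['server1', 'server3', 'server1', 'server3']
lb.active["server1"] = 5
print(lb.least_connections())                # server3 (fewest active connections)
```

Note that this is L7-style logic in miniature; an L4 balancer would make the same choice per TCP connection without ever parsing the request.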
4. Level 3: The API Gateway
An API Gateway is a Load Balancer that understands Business Logic. It is designed specifically for Microservices.
- Definition: A single entry point for all API calls that handles cross-cutting concerns.
- Key Job: Management. It unifies 50 microservices into one clean API for the frontend. (See our post on Backend for Frontend for a related pattern).
- Features:
- Authentication/Authorization: Checks JWT tokens. "Is this user allowed to see this?"
- Rate Limiting: "User X can only make 100 requests per minute."
- Routing: `/api/users` -> User Service, `/api/cart` -> Cart Service.
- Protocol Translation: Converts REST (Frontend) to gRPC (Backend).
- Example Tool: Kong, Zuul, Spring Cloud Gateway, AWS API Gateway.
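Two of these concerns, prefix routing and per-user rate limiting, can be sketched in plain Python. The service names, limits, and user IDs are illustrative; real gateways like Kong implement these as configurable plugins.

```python
# Gateway sketch: prefix routing + a fixed-window rate limiter.
import time

ROUTES = {"/api/users": "user-service", "/api/cart": "cart-service"}

def route(path):
    """Map a request path to a backend service by longest-prefix-style match."""
    for prefix, service in ROUTES.items():
        if path.startswith(prefix):
            return service
    return None  # no matching service -> 404

class RateLimiter:
    """Fixed-window counter: at most `limit` requests per user per `window` seconds."""
    def __init__(self, limit=100, window=60.0):
        self.limit, self.window = limit, window
        self.counters = {}  # user -> (window_start, count)

    def allow(self, user, now=None):
        now = time.monotonic() if now is None else now
        start, count = self.counters.get(user, (now, 0))
        if now - start >= self.window:      # window expired: reset the counter
            start, count = now, 0
        if count >= self.limit:
            return False                    # would return HTTP 429 Too Many Requests
        self.counters[user] = (start, count + 1)
        return True

limiter = RateLimiter(limit=3, window=60)
print(route("/api/cart/items"))                               # cart-service
print([limiter.allow("user-x", now=0.0) for _ in range(4)])   # [True, True, True, False]
```

Production limiters usually prefer token buckets or sliding windows over this fixed window, which allows bursts at window boundaries.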
Deep Dive: The Network Order (Who Comes First?)
In a complex enterprise system, you often use all three. But in what order?
The Standard Enterprise Flow:
- Client (Internet)
- Public Load Balancer (L4/L7): The entry point. Handles SSL termination and DDoS protection.
- Reverse Proxy (Nginx/Ingress): Often sits here to route traffic.
  - Static Request (`/index.html`): Serves directly from cache/disk.
  - Dynamic Request (`/api/...`): Forwards to the API Gateway.
- API Gateway: Handles Auth, Rate Limiting, and Business Routing.
- Internal Load Balancer: Distributes traffic among the instances of that specific service.
- Microservice: The actual code.
```mermaid
graph TD
    Client[Client App] -->|HTTPS| PublicLB["Public Load Balancer (AWS ALB)"]
    PublicLB -->|HTTP| Nginx["Reverse Proxy (Nginx)"]
    Nginx -->|/static| CDN[Static Files]
    Nginx -->|/api| Gateway["API Gateway (Kong)"]
    Gateway -->|/users| UserLB[Internal LB]
    Gateway -->|/orders| OrderLB[Internal LB]
    UserLB --> User1[User Service 1]
    UserLB --> User2[User Service 2]
    OrderLB --> Order1[Order Service 1]
    OrderLB --> Order2[Order Service 2]
```
Why this order?
- Public LB first: It's hardware/cloud-optimized to handle massive traffic spikes.
- Reverse Proxy second: It offloads static content so the expensive API Gateway doesn't waste CPU on images.
- Gateway third: It secures the actual business logic.
Real-World Scenario: Web Request vs. API Request
How does the flow differ for a static website vs. a dynamic API?
Scenario A: Loading the Homepage (Web Request)
- User: Types `www.myshop.com`.
- Flow:
  - Public Load Balancer: Receives the request.
  - Reverse Proxy (Nginx): Sees the request is for `index.html` (a static file).
  - Action: Serves the file directly from its cache or a CDN.
- Result: The request never touches the API Gateway or Microservices. It's fast and cheap.
Scenario B: Clicking "Checkout" (API Request)
- User: Clicks "Buy Now".
- Flow:
  - Public Load Balancer: Receives the request.
  - Reverse Proxy: Sees the request is for `/api/checkout`. Forwards it.
  - API Gateway:
    - Checks Auth Token: "Is the user logged in?" (Yes).
    - Checks Rate Limit: "Is the user spamming?" (No).
    - Routes to Order Service.
  - Internal Load Balancer: Picks Order Service Instance #4.
  - Microservice: Processes the payment.
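The fork between the two scenarios is a single decision at the edge. Here is that decision as one hypothetical function (the cache contents and return values are illustrative):

```python
# The edge decision from Scenarios A and B: static paths are answered from
# the reverse proxy's cache; /api paths continue down the chain.
STATIC_CACHE = {"/index.html": "<html>homepage</html>"}

def handle_at_edge(path):
    if path in STATIC_CACHE:
        # Scenario A: served immediately; the gateway never sees this request.
        return ("reverse-proxy", STATIC_CACHE[path])
    if path.startswith("/api/"):
        # Scenario B: auth and rate limiting happen at the gateway,
        # then an internal load balancer picks a service instance.
        return ("api-gateway", f"forwarded {path} downstream")
    return ("reverse-proxy", "404 Not Found")

print(handle_at_edge("/index.html"))    # answered by the reverse proxy
print(handle_at_edge("/api/checkout"))  # handed off to the API gateway
```

This is why offloading static content matters: every path served from the cache is CPU the gateway and microservices never spend.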
Deep Dive: When to Use What?
The lines are blurry. Nginx can do all three. So why buy specialized tools?
| Feature | Reverse Proxy (Nginx) | Load Balancer (AWS ALB) | API Gateway (Kong) |
| --- | --- | --- | --- |
| Hides Server IP | ✅ Yes | ✅ Yes | ✅ Yes |
| Distributes Traffic | ⚠️ Basic | ✅ Advanced | ✅ Yes |
| Health Checks | ⚠️ Basic | ✅ Advanced | ✅ Yes |
| Auth / Rate Limit | ❌ No (requires scripts) | ❌ No | ✅ Native / Plugins |
| Analytics | ❌ Logs only | ⚠️ Basic Metrics | ✅ Detailed API Usage |
| Cost | Free | $$ | $$$ |
The Evolution Path:
- Startup: Just use Nginx (Reverse Proxy) on a single server.
- Growth: Add a second server. Use AWS ALB (Load Balancer) to split traffic.
- Scale: You have 20 microservices. Authentication logic is duplicated everywhere. Introduce an API Gateway to centralize Auth and Rate Limiting.
Summary & Key Takeaways
- Reverse Proxy: Protects the server. (Security).
- Load Balancer: Protects the system from crashing. (Availability).
- API Gateway: Protects the business logic and simplifies the client experience. (Management).
- Rule of Thumb: Don't use an API Gateway if you only have one monolithic server. It adds latency and complexity.
Practice Quiz: Test Your Design Skills
Scenario 1: You have a monolithic application running on 3 servers. You need to distribute traffic evenly. You do not need authentication or rate limiting at the edge. What should you use?
- A) API Gateway
- B) Load Balancer (L4/L7)
- C) Simple Reverse Proxy
Scenario 2: Your microservices use gRPC for speed, but your React frontend only understands JSON/REST. Where should the translation happen?
- A) In the React App.
- B) In the Database.
- C) In the API Gateway.
Scenario 3: Why might you put a Load Balancer behind an API Gateway?
- A) You wouldn't; that's redundant.
- B) The API Gateway handles the "Business Routing" (User Service vs Cart Service), while the Load Balancer handles the "Instance Distribution" (User Service Instance 1 vs Instance 2).
- C) To double the security.
(Answers: 1-B, 2-C, 3-B)

Written by
Abstract Algorithms
@abstractalgorithms
