1. Simple Meaning
A Load Balancer is like a traffic police officer 🚦 for your servers.
It stands in front of multiple backend servers and splits incoming requests among them so that no single server gets overloaded.
2. Real-Life Example
Imagine you have a restaurant chain with 3 kitchens 🍽️:
- Kitchen 1
- Kitchen 2
- Kitchen 3
If all customers go to Kitchen 1, it will be crowded and slow.
Instead, a receptionist (load balancer) sends:
- First customer → Kitchen 1
- Second customer → Kitchen 2
- Third customer → Kitchen 3
- Fourth customer → Kitchen 1 again (and so on…)
This way:
- All kitchens work equally
- Customers get food faster
- No single kitchen is overloaded
3. Why Load Balancing is Needed
Without load balancing:
- One server can crash from too much traffic.
- Other servers remain idle.
- Users get slow responses or timeouts.
With load balancing:
- Traffic is distributed evenly.
- If one server fails, others take over.
- Faster response times.
4. Nginx as a Load Balancer
Nginx can sit in front of multiple servers and send requests to them based on different strategies.
4.1 Basic Load Balancing Example
You have 3 Node.js servers:
- server1.example.com
- server2.example.com
- server3.example.com
Nginx config:
# Define backend servers
upstream backend_servers {
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}

# Use them in a reverse proxy
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://backend_servers;
    }
}
Here, Nginx sends requests in Round Robin style:
1st request → server1
2nd request → server2
3rd request → server3
4th request → server1 again…
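A small optional extension to the location block above (a common companion setting, not part of the original example): by default the backend only sees Nginx as the client, so the original host and client IP are usually forwarded with proxy_set_header. A minimal sketch, reusing the backend_servers upstream defined above:

    location / {
        proxy_pass http://backend_servers;

        # Forward the original request details to the backend
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }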
5. Load Balancing Methods in Nginx
Nginx supports multiple strategies:
- Round Robin (Default) → Sends requests one by one to each server in turn.
- Least Connections → Sends each request to the server with the fewest active connections.

  upstream backend_servers {
      least_conn;
      server server1.example.com;
      server server2.example.com;
  }

- IP Hash → The same client always goes to the same server (a sketch for temporarily removing a server without breaking this mapping follows this list).

  upstream backend_servers {
      ip_hash;
      server server1.example.com;
      server server2.example.com;
  }

- Weighted Round Robin → Give some servers more traffic if they are more powerful.

  upstream backend_servers {
      server server1.example.com weight=3;
      server server2.example.com weight=1;
  }
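One practical note on IP Hash (standard Nginx behaviour, not covered in the original post): if a server has to be taken out temporarily, it can be marked with the down parameter so the hashing of client addresses to the remaining servers is preserved. A minimal sketch:

    upstream backend_servers {
        ip_hash;
        server server1.example.com;
        server server2.example.com down;   # temporarily removed; hash mapping is preserved
    }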
6. Extra Benefits of Load Balancing
- High Availability → If one server is down, traffic is sent to the others.
- Scalability → Add more servers as traffic grows.
- Failover → Automatic backup servers (a config sketch follows this list).
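To make failover concrete, here is a minimal sketch (an illustration, not part of the original post) using standard Nginx server parameters: max_fails and fail_timeout decide when a server is treated as unavailable, and backup marks a server that only receives traffic when the regular servers are down. The hostname backup1.example.com is made up for this example:

    upstream backend_servers {
        server server1.example.com max_fails=3 fail_timeout=30s;
        server server2.example.com max_fails=3 fail_timeout=30s;
        server backup1.example.com backup;   # hypothetical host, used only when the others are unavailable
    }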
7. Example: API with Load Balancer
You have:
- API Server 1 → localhost:3001
- API Server 2 → localhost:3002
Nginx Config:
upstream api_servers {
    server localhost:3001;
    server localhost:3002;
}

server {
    listen 80;
    server_name api.example.com;

    location / {
        proxy_pass http://api_servers;
    }
}
✅ Traffic gets distributed between the two API servers.
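Since no balancing method is specified, Nginx uses the default Round Robin here, so repeated requests to api.example.com alternate between ports 3001 and 3002. After editing the config, it can be checked and applied with nginx -t followed by nginx -s reload.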
8. Simple Flow
Client → Nginx Load Balancer → Multiple Backend Servers → Response