The Vultr Load Balancer is a fully managed solution that distributes traffic across multiple application servers. With a Vultr Load Balancer, you can enable horizontal scaling and increase the reliability of your applications in seconds by setting a few parameters through the customer portal. You don't need to worry about the underlying load balancer server operating system, configuration files, or system management tasks. We handle all the details so you can focus on your application.
Assume you have an e-commerce store with a single web server.
As your store becomes more popular, you need to scale up your site to manage the traffic. You could use a more powerful web server, but a better solution might be to use multiple servers and load balancer tools. If you add two more servers and a Vultr Load Balancer, your network looks like this.
Using a load balancer has several advantages.
You can scale your application up and deploy more web servers as the traffic grows.
As necessary, you can scale your application down by removing web servers if the traffic drops.
The Load Balancer detects failed web servers and stops routing traffic to them, improving your application's availability.
Vultr Load Balancers support custom health checks, multiple load balancing algorithms, sticky sessions, proxy protocol, SSL certificates, firewalls, private networks, and more. Vultr Load Balancers work with all our server products, including Bare Metal.
Load balancers are effective for applications that can scale with multiple parallel instances. They distribute the load but don't address file synchronization or database consistency between your application instances.
To deploy a new Vultr Load Balancer, navigate to the Add Load Balancer page in the customer portal.
Choose a location. Your load balancer and all instances attached to that load balancer must be in the same location.
Choose a Load Balancer Configuration.
Enter a label of your choice for this load balancer.
Choose an algorithm. The default, Roundrobin, sends requests to each server in turn, regardless of load. Leastconn sends each request to the server with the fewest active connections.
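The difference between the two algorithms can be sketched in a few lines. This is an illustrative simulation only, not Vultr's internal implementation:

```python
# Illustrative sketch of the two selection strategies.
# Round robin cycles through servers in order; least connections
# picks the server with the fewest active connections.
import itertools

servers = ["web1", "web2", "web3"]

# Round robin: each new request goes to the next server in turn.
rr = itertools.cycle(servers)
picks = [next(rr) for _ in range(5)]
print(picks)  # ['web1', 'web2', 'web3', 'web1', 'web2']

# Least connections: pick the server with the fewest open connections.
# The connection counts here are made-up example values.
active = {"web1": 12, "web2": 3, "web3": 7}

def least_conn(conns):
    """Return the server name with the fewest active connections."""
    return min(conns, key=conns.get)

print(least_conn(active))  # web2
```

Roundrobin works well when requests are roughly uniform; Leastconn tends to balance better when some requests hold connections open much longer than others.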
If you redirect all traffic from HTTP to HTTPS, you must use an HTTPS rule and SSL certificate.
If you enable Proxy protocol, you must also configure your backend nodes to accept Proxy protocol.
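If you are adapting a backend to accept Proxy protocol, the load balancer prepends a short text header to each connection so your application can see the original client address. The following is a minimal sketch of parsing the version 1 text header as defined by the HAProxy PROXY protocol specification; the `parse_proxy_v1` helper is a hypothetical name, not part of any Vultr tooling:

```python
# Sketch: parse a PROXY protocol v1 header, which a load balancer
# prepends to the connection so backends can recover the client IP.
# Format: "PROXY TCP4 <src_ip> <dst_ip> <src_port> <dst_port>\r\n"
def parse_proxy_v1(line: bytes) -> dict:
    """Parse a PROXY protocol v1 header line into its fields."""
    text = line.decode("ascii").rstrip("\r\n")
    parts = text.split(" ")
    if parts[0] != "PROXY" or len(parts) != 6:
        raise ValueError("not a valid PROXY v1 header")
    proto, src_ip, dst_ip, src_port, dst_port = parts[1:]
    return {
        "protocol": proto,
        "client_ip": src_ip,
        "client_port": int(src_port),
        "server_ip": dst_ip,
        "server_port": int(dst_port),
    }

header = b"PROXY TCP4 192.0.2.10 203.0.113.5 56324 443\r\n"
print(parse_proxy_v1(header)["client_ip"])  # 192.0.2.10
```

Note that this sketch only handles the common TCP4/TCP6 form; the full specification also allows a "PROXY UNKNOWN" header with no addresses. In practice, most servers (NGINX, HAProxy, and others) have built-in Proxy protocol support, so you would enable that rather than parse the header yourself.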
Enter the number of nodes for this load balancer.
Create at least one forwarding rule. Do not use port 22 or 65300-65310 because the Load Balancer uses these internally.
HTTP2 defaults to Off. To enable it, you must add at least one HTTPS forwarding rule (HTTPS -> HTTPS).
VPC Network defaults to Public. If you prefer sending traffic to your instances via their attached VPC, choose that here.
Firewall rules are optional.
Health checks allow the load balancer to determine if an instance is ready to receive traffic.
When you have completed the form, click the Add Load Balancer button to deploy.
After the load balancer deploys, navigate to the Load Balancer section, click the three-dot More menu, and click Manage.
On the Manage Load Balancer page, click the Add Instance button, then select an available instance from your location.
The Vultr Load Balancer has an integrated firewall. You can learn more in our article How to Use the Vultr Load Balancer Firewall.
The Vultr Firewall can use a Load Balancer as an IP source. We explain more in How to Use the Vultr Firewall with a Vultr Load Balancer.
Explore an advanced scenario with private networking and multiple firewalls in How to Configure a Vultr Load Balancer with Private Networking, where you'll create an advanced configuration like this.
This guide is an overview of Load Balancer concepts. If you need more information about a specific feature, the Vultr Load Balancer Feature Reference has detailed configuration information.
Nodes allow you to scale your load balancer to handle more traffic. With more nodes, you can handle more concurrent connections and requests per second.
Load balancers support up to 15,000 simultaneous connections per node.
We allow up to 99 nodes per load balancer. We only allow odd numbers of nodes in a load balancer. This allows for automatic failover in the event of a node failure.
No. A load balancer can only direct traffic to server instances in the same location as the load balancer itself; Load Balancers and attached instances must be deployed in the same location.
If using HTTP or HTTPS protocol, ensure the port and URL path are correct. The health check expects an HTTP 200 OK status code; any other response code is considered unhealthy.
If using TCP protocol, test an open port on the attached node.
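To see the 200-only rule in action, here is a minimal sketch that probes a local test server. The `is_healthy` helper is a hypothetical stand-in for the load balancer's check, not Vultr's code; it treats anything other than HTTP 200 as unhealthy:

```python
# Sketch of HTTP health-check semantics: a node is healthy only
# when its health URL answers with HTTP 200. (Hypothetical helper,
# not the load balancer's actual implementation.)
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Only /health returns 200; every other path returns 404.
        if self.path == "/health":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def is_healthy(url: str, timeout: float = 2.0) -> bool:
    """Return True only if the node answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, or non-2xx status: unhealthy.
        return False

# Start a throwaway test server on a random free port.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

print(is_healthy(f"http://127.0.0.1:{port}/health"))   # True
print(is_healthy(f"http://127.0.0.1:{port}/missing"))  # False
server.shutdown()
```

A quick way to verify a node by hand is to request the health check path yourself and confirm the status code is exactly 200, not a 301/302 redirect, which would also fail the check.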
Vultr Load Balancers are bandwidth neutral. We only charge for bandwidth on the instances attached to the load balancer.
You can attach instances to and remove them from your Load Balancer in the Vultr customer portal.
Vultr Load Balancers are fully managed, so you do not need to manage the load balancer software yourself.
Vultr Load Balancers support TCP, HTTP, and HTTPS.