10.3. Using a Load Balancer to Expose a Service Running on Multiple Virtual Machines
A load balancer is a dedicated virtual instance running a configured HAProxy service that redirects incoming traffic to a group of member virtual machines.
While a floating IP is a way to expose a single VM to the public, a load balancer is the tool to expose a service running on multiple virtual machines. For example, to expose an HTTP service running on port 8080 on two VMs via a single public IP on port 80, do the following:
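Under the hood, such a setup corresponds to an HAProxy configuration roughly like the following sketch (the back-end addresses 192.168.0.11 and 192.168.0.12 are hypothetical placeholders for the two VMs):

```
# Hedged sketch of the HAProxy configuration behind the example above;
# back-end IP addresses are placeholders, not values from your deployment.
frontend http_in
    bind *:80                            # front-facing LB port
    default_backend web_pool

backend web_pool
    balance roundrobin                   # distribute requests across VMs in turn
    server vm1 192.168.0.11:8080 check   # service port on the first VM
    server vm2 192.168.0.12:8080 check   # service port on the second VM
```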
On the Load balancers screen, click + Create load balancer.
In the Create load balancer window, do the following:
Specify a name and, optionally, a description.
Enabling High availability deploys two load balancer instances in active-backup mode. If high availability is disabled, a single load balancer instance is still protected by the platform's default high availability mechanism: if the hardware node fails, the load balancer VM is restarted on another hardware node.
In the Network settings section, select the network in which you have your service’s VMs.
Select the Use a floating IP address checkbox if you need to expose the service to the public, and then choose to use an available floating IP address or create a new one.
In the Balancing pools section, click Add to create a balancing pool to forward traffic from the load balancer to virtual machines.
In the Create balancing pools window that opens, do the following:
In the Forwarding rule section:
Select the protocol your service uses: HTTP/HTTPS, TCP, or UDP.
Specify the LB port, the front-facing port that clients will use to connect from outside.
Enter the back-end port, the service port on your virtual machines.
In the Balancing settings section, select the balancing algorithm that determines how traffic is distributed among the back-end virtual machines:
Source IP. Directs requests from the same external client (as long as its IP address does not change) to the same back-end host.
Round-robin. Distributes each packet or session (for session-level protocols) across the back-end hosts in turn.
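In HAProxy terms, these two algorithms map to the `balance` directive; a hedged sketch (backend names are illustrative):

```
# Round-robin: rotate new connections across the back-end hosts in turn
backend web_pool_rr
    balance roundrobin

# Source IP: hash the client address so the same client
# (with an unchanged IP) always lands on the same back-end host
backend web_pool_srcip
    balance source
```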
Turn on the Sticky session toggle for session-level protocols, such as HTTP/HTTPS, to send all packets of the same session to the same back-end host.
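For HTTP/HTTPS, sticky sessions are typically implemented in HAProxy with an inserted cookie; a minimal sketch, assuming a hypothetical cookie name SRVID and placeholder back-end addresses:

```
backend web_pool
    balance roundrobin
    # Insert a cookie so subsequent requests of the same session
    # return to the host that served the first request
    cookie SRVID insert indirect nocache
    server vm1 192.168.0.11:8080 check cookie vm1
    server vm2 192.168.0.12:8080 check cookie vm2
```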
Once created, a load balancer exposes your service to the public.