Load Balancer

Improve the performance of your services as you grow.

Increased availability

Load Balancer is the easiest way to build a resilient platform thanks to a highly available architecture. All backend servers are monitored to ensure that traffic is distributed among healthy resources.

Peak load management

Avoid dips in performance by adding as many backend servers as necessary. Increase your processing capacity in a few clicks from the Scaleway console or configure automatic scaling using the API.

Scale your business

As your business grows, you need more resources to succeed. With Load Balancer, you can easily scale your business by adding new backend servers to improve your quality of service without any downtime.

Available zones:
Paris: PAR 1, PAR 2
Amsterdam: AMS 1, AMS 2, AMS 3
Warsaw: WAW 1, WAW 2, WAW 3

Load Balancer use cases

Handle peaks in e-commerce activity

Use Load Balancer to distribute workloads across multiple servers during traffic peaks to your website, ensuring continued availability and preventing servers from being overloaded.

Under the hood

  • Bandwidth: Up to 4 Gbit/s

  • Multi-cloud compatibility: On LB-GP-L & LB-GP-XL offers

  • Health checks: HTTP(S), MySQL, PgSQL, LDAP, REDIS, TCP

  • Balancing algorithms: Round-robin, sticky, least connection, first healthy

  • Backend servers: Unlimited - Public or within a Private Network

  • Redundancy: High availability

  • Traffic encryption: SSL/TLS Bridging, Passthrough, and Offloading

  • HTTPS: Let’s Encrypt & custom SSL certificates

Key features

    Health checks

    Whether you are distributing your workload between web servers, databases, or other TCP services, you can easily set up health checks to ensure the availability of your backend servers. You can even monitor their availability in real time. If one of them fails to respond, its traffic is automatically redirected until the problem is solved.
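The idea behind a health check can be sketched in a few lines of Python. This is an illustration of the concept only, not Scaleway's implementation; the backend addresses and the `/health` endpoint are hypothetical.

```python
import urllib.request

# Hypothetical backend pool mapping server address -> health-check URL.
BACKENDS = {
    "10.0.0.10": "http://10.0.0.10/health",
    "10.0.0.11": "http://10.0.0.11/health",
}

def check_backend(url: str, timeout: float = 2.0) -> bool:
    """Return True if the backend answers its health check with HTTP 2xx."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except OSError:
        # Connection refused, timeout, DNS failure... -> unhealthy
        return False

def healthy_backends(backends: dict) -> list:
    """Keep only backends that pass their health check; traffic is
    routed to these until the failed ones recover."""
    return [addr for addr, url in backends.items() if check_backend(url)]
```

A managed Load Balancer runs checks like this continuously and redirects traffic away from any backend that stops responding.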

    Balancing rules/Proxy

    Use our Load Balancer to regulate traffic according to your use cases. Round-robin, sticky connections, least connection or first healthy rules are good examples of what is possible.
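To illustrate how two of these algorithms differ, here is a minimal Python sketch of round-robin and least-connection selection. The backend names are placeholders; a real load balancer tracks connections at the network level.

```python
import itertools

class RoundRobinBalancer:
    """Cycle through backends in order, one request each."""
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)

    def pick(self):
        return next(self._cycle)

class LeastConnectionBalancer:
    """Send each request to the backend with the fewest active connections."""
    def __init__(self, backends):
        self.active = {b: 0 for b in backends}

    def pick(self):
        backend = min(self.active, key=self.active.get)
        self.active[backend] += 1  # caller calls release() when done
        return backend

    def release(self, backend):
        self.active[backend] -= 1
```

Round-robin spreads requests evenly regardless of load, while least-connection favors backends that are currently less busy.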

    Unlimited backends

    Add as many backend servers as you want to our Load Balancer and scale your infrastructure on the fly, without any limits. Distribute your traffic across multiple platforms with the multi-cloud offer, or inside a VPC on a Private Network.

    ACL permissions

    Filter the IP addresses that are allowed to reach your servers. Block unwanted visitors to keep them from consuming your bandwidth, thus increasing security.
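Conceptually, an ACL allow-list works like this minimal Python sketch. The networks shown are documentation-range examples, not real addresses:

```python
import ipaddress

# Hypothetical allow-list: only these networks may reach the backends.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # example office range
    ipaddress.ip_network("198.51.100.7/32"),  # example monitoring host
]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client's IP falls inside an allowed network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)
```

Requests from addresses outside the allow-list are dropped before they ever reach a backend server.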

    Multi-cloud connections

    Some offers allow you to distribute your traffic between different platforms, or any on-premise server or Instance. This allows you to build a more robust infrastructure and avoid depending on a single platform.


    Expose containers and pods
    Use our Load Balancer to expose your containers and pods to the internet, so they have a common DNS and IP address, and to balance workloads.

    Bandwidth of up to 4 Gbit/s

    With a sizable bandwidth offer, there’s no use case we don’t support. And, as we do not charge for egress, you will be billed a fixed price with no surprises.

    Adapt the protocol

    You can configure your Load Balancer’s backend and choose the protocol (HTTP, HTTPS, HTTP/2, HTTP/3, or TCP) used to send and receive data.

    Traffic encryption

    Improve network speed by passing SSL/TLS-encrypted data through your Load Balancer without decrypting it, accelerating backend request processing as well as communication between servers and end users.

    You can also use Load Balancer as a bridge to decrypt incoming encrypted traffic at the frontend and re-encrypt traffic before forwarding it to backend servers, thus ensuring total end-to-end security.


    No egress fees

    Bandwidth options: 200 Mbit/s, 500 Mbit/s, 1 Gbit/s, 4 Gbit/s

    Start in minutes

    Get started with tutorials

    • First steps with Scaleway Load Balancer
    • Setting up a Load Balancer for WordPress
    • How to use the Proxy protocol v2 with Load Balancer

    Frequently asked questions

    What is a Load Balancer?

    Load Balancers are highly available, fully managed Instances that distribute workloads among your servers. They enable application scaling while ensuring continuous availability. They are commonly used to improve the performance and reliability of websites, applications, databases, and other services by distributing workloads across multiple servers.

    What happens if one of my backend servers fails?

    The Load Balancer monitors the availability of your backend servers, detects when a server fails, and rebalances the load across the remaining servers, keeping your applications highly available for users.

    What is multi-cloud?

    Multi-cloud is an environment in which multiple cloud providers are used simultaneously. With our multi-cloud offers, you can add backend servers beyond Instances, Elastic Metal, and Dedibox servers.
    These can be services from other cloud platforms such as Amazon Web Services, DigitalOcean, Google Cloud, Microsoft Azure, or OVHcloud, but also on-premise servers hosted in a third-party datacenter.

    Which protocols does Load Balancer support?

    All TCP-based protocols are supported, including database protocols, HTTP, LDAP, IMAP, and so on. You can also specify HTTP to benefit from support and features exclusive to this protocol.

    Can I restrict access to my Load Balancer?

    Yes, you can restrict the use of a TCP port or HTTP URL via ACLs.

    Does Load Balancer support IPv6?

    Each Load Balancer provides external connectivity via an IPv4 address. IPv6 is not yet supported for external connections, but it can be used to communicate between the Load Balancer and your backend servers.

    Do my backend servers need public IP addresses?

    No, it’s not required. You can use private Scaleway IPs on your backend servers if they are hosted in the same Availability Zone (AZ) as the Load Balancer.