Appendix C: Nginx load balancer example
Each worker node handles all application requests, including application authentication. If the cluster is behind a firewall, or to provide a more seamless application access experience, an additional load balancer can be deployed in front of the cluster. There are many ways to do this; the following is a very simple example.
Create the Nginx configuration file
Create a basic Nginx configuration file for a three-worker Kubernetes cluster running ICE Server. Add your own customizations as needed per the Nginx documentation.
error_log stderr notice;
worker_processes auto;
worker_rlimit_nofile 130048;
worker_shutdown_timeout 10s;

events {
    multi_accept on;
    use epoll;
    worker_connections 16384;
}

stream {
    upstream worker_80 {
        least_conn;
        server IP___OF___WORKER___ONE:80;
        server IP___OF___WORKER___TWO:80;
        server IP___OF___WORKER___THREE:80;
    }

    upstream worker_443 {
        least_conn;
        server IP___OF___WORKER___ONE:443;
        server IP___OF___WORKER___TWO:443;
        server IP___OF___WORKER___THREE:443;
    }

    upstream worker_7443 {
        least_conn;
        server IP___OF___WORKER___ONE:7443;
        server IP___OF___WORKER___TWO:7443;
        server IP___OF___WORKER___THREE:7443;
    }

    upstream worker_8443 {
        least_conn;
        server IP___OF___WORKER___ONE:8443;
        server IP___OF___WORKER___TWO:8443;
        server IP___OF___WORKER___THREE:8443;
    }

    # proxy_protocol prepends a PROXY protocol header to each connection,
    # preserving the client IP; the receiving ingress on the workers must
    # be configured to accept the PROXY protocol on these ports.
    server {
        listen 80;
        proxy_pass worker_80;
        proxy_protocol on;
    }

    server {
        listen 443;
        proxy_pass worker_443;
        proxy_protocol on;
    }

    server {
        listen 7443;
        proxy_pass worker_7443;
    }

    server {
        listen 8443;
        proxy_pass worker_8443;
    }
}
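After starting Nginx with this configuration, you can confirm that the load balancer is accepting TCP connections on each forwarded port. The following is a minimal sketch in Python; the hostname is a placeholder for your load balancer address, and the port list matches the server blocks above.

```python
import socket

# Placeholder: replace with the address of your load balancer.
LB_HOST = "loadbalancer.example.com"
PORTS = [80, 443, 7443, 8443]

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS:
        state = "open" if port_open(LB_HOST, port) else "closed"
        print(f"{LB_HOST}:{port} is {state}")
```

This only verifies that the stream proxy is listening; end-to-end checks (for example, an HTTPS request against port 443) should go through the application itself.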