HAProxy configuration for load balancing - HTTP

I'm trying to configure an HAProxy load balancer at layer 7. It worked, but after I reinstalled HAProxy the backend servers started failing the health check. HAProxy says they are down. How do I fix it?
I got this:
I reinstalled HAProxy again and it's always the same thing.
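For reference, a minimal layer-7 HAProxy backend with HTTP health checks might look like the sketch below; the server names, addresses, and check path are placeholders, not values from the question.

    backend web_servers
        mode http
        balance roundrobin
        # Health check: send an HTTP GET to / and expect a 2xx/3xx reply.
        # If the app only answers on a specific path, point httpchk at that path.
        option httpchk GET /
        server web1 192.168.1.11:80 check
        server web2 192.168.1.12:80 check

If the reinstall changed the backend addresses, ports, or firewall rules that the check depends on, HAProxy will mark the servers DOWN even when they answer fine if you hit them directly.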

Related

Hosting FastAPI on a vast.ai GPU instance

How do I allow HTTP traffic on a vast.ai instance? I'd like to host GPU-related code using FastAPI + NGINX, but I am not seeing the NGINX homepage after configuration. I'm not getting a bad gateway error; what I'm getting is "This site can't be reached".
After configuration, I expected to see the NGINX homepage. This works on AWS, but when setting up an instance on AWS you get the option to "Allow HTTP/HTTPS traffic"; on vast.ai, I do not see that option.
OK, so SSH-ing into the instance and running the FastAPI app didn't work.
I rented another instance with Jupyter Notebook enabled.
So Jupyter + ngrok + uvicorn works. Since the vast.ai instance IP isn't directly accessible, ngrok does the trick by providing a unique public URL.
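For anyone reproducing this workaround, a minimal version of the uvicorn + ngrok combination looks roughly like the commands below; main:app is a placeholder for whatever FastAPI module and app object you actually have.

    # Start the FastAPI app with uvicorn on a local port.
    uvicorn main:app --host 0.0.0.0 --port 8000

    # In another terminal (or Jupyter cell), tunnel that port through ngrok.
    # ngrok prints a public URL that forwards to port 8000 on the instance.
    ngrok http 8000

Requests to the ngrok URL then reach the app even though the instance itself has no directly reachable HTTP port.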

Having an issue with Cloudflare and Nginx Proxy Manager where it does not connect to my local server (Error 522/523)

I'm trying for the first time to connect my local server (Synology) through NGINX and Cloudflare so I can access it through my own domain name. I have the proxy host set up pointing to my local IP address and port, and I have SSL encryption using Let's Encrypt. The site gives me either a timed-out or unreachable error; however, one time the site somehow took me to ASUS AiCloud, which runs through my ASUS AC68U router, even though I was not pointing NGINX at it.
Using the Cloudflare Diagnostic Center, it says the request failed because the web server did not respond.
I'm not sure whether my router is blocking Cloudflare or if there is some other issue going on; I would appreciate any help with the matter!
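Since 522/523 means Cloudflare could not reach the origin, one way to narrow it down is to check whether the server answers at all before Cloudflare is involved. A rough sketch, with a placeholder LAN address and port, would be:

    # From a machine on the same LAN: does the Synology service answer directly?
    curl -I http://192.168.1.50:8080

    # From outside your network (e.g. a phone hotspot): is the forwarded port reachable?
    curl -I http://<your-public-ip>:8080

If the first check works but the second does not, the problem is likely port forwarding or the router rather than Cloudflare or Nginx Proxy Manager.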

Where should SSL be installed

I have got a setup like this
Load balancer
Machine 1 - haproxy load balancer
Machine 2 - haproxy load balancer
Web servers
Machine 1 - nginx with app
Machine 2 - nginx with app
Now, where should I set up the SSL certificate? On the load balancers, on the web servers, or on both?
What is the correct way of doing it?
The "correct way" to do this depends on your setup. If your load balancers are on the same machines as your webservers, it doesn't matter which you choose to put the cert on. If they are on different servers, encryption depends on how important security is for these particular web apps. If you put the certs on the load balancers you will have unencrypted traffic visible to anyone in your network (as it goes from load balancer to server). If you put certs on your nginx server you will have encryption all the way through to the local server, but you will have to change your haproxy a little to have it route encrypted traffic properly. You also will not be able to route off the url path. You can also put certs on both to be able to route off the url path, but that is a little more to manage (two certs vs one). Overall it's probably best to put the cert on nginx server, assuming your don't need to do any routing in the load balancer off of the url. Also definitely do your own research.

Setting up SSL on AWS EC2

I'm trying to set up SSL on my WordPress site.
I've got an EC2 instance running WordPress on nginx and Ubuntu, with the database on RDS.
I've launched an application load balancer with listeners on ports 80 and 443 and attached the SSL certificate which I got via ACM. I've set my targets to point to the EC2 instance I am using.
At this point the how-to guides and information stop. Apparently that's all there is to it and it should now all be working. However, it's not: I'm getting connection refused errors when I add https to my site's URL.
When I put my URL into https://www.sslchecker.com/sslchecker I'm told that no certificates are found.
So clearly I need to do something more to get this working - can anyone point me to the next step?
Using the ELB and ACM is the way to go here. It sounds like you might be using the wrong type of ELB, though: you mentioned an Application Load Balancer, but you should use a Classic Load Balancer. Also make sure your security groups are set up correctly to allow your ELB to talk to the EC2 instance.
You didn't mention Route 53, but I assume you have the DNS entry set up to point at the ELB as well.
Share more and I will help more. Good luck.
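For the security group part, a sketch of allowing the load balancer's security group to reach the instance on port 80 with the AWS CLI could look like this; sg-aaaa1111 and sg-bbbb2222 are placeholder IDs for the instance's and the load balancer's security groups.

    # Allow inbound HTTP from the load balancer's security group
    # to the security group attached to the EC2 instance.
    aws ec2 authorize-security-group-ingress \
        --group-id sg-aaaa1111 \
        --protocol tcp \
        --port 80 \
        --source-group sg-bbbb2222

The same idea applies if the load balancer forwards 443 to the instance instead of 80; just change the port to match the target group.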

GitLab not working w/ Nginx

I am trying to get GitLab set up with my current installation of Nginx, but I keep getting an Error 502. I have included my configuration files and am not sure what I am doing wrong. I followed the "Using a non-bundled web-server" steps on https://gitlab.com/gitlab-org/omnibus-gitlab/blob/master/doc/settings/nginx.md
/etc/nginx/conf.d/gitlab-omnibus-nginx.conf
http://pastebin.com/bQ8eCiNh
/etc/gitlab/gitlab.rb
http://pastebin.com/Lw5tjwXy
HTTP 502 means "The server was acting as a gateway or proxy and received an invalid response from the upstream server." So there are two possibilities here.
1. Your GitLab server is not actually working or is returning an invalid response. After starting the GitLab server, run sudo netstat -plnt, make sure it is listening on a port, and note the port. Then connect directly to that port in your browser (or from the CLI on the server if necessary) and confirm that GitLab works fine without a proxy in front of it. If GitLab is listening on a socket and not a port, there are also tools to test HTTP servers through socket connections (see the sketch below).
2. Nginx is not configured correctly to connect to GitLab. In this case, check your Nginx error log to see if there is any more detail besides the "502" error.
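To make both checks concrete, something like the commands below could work; the socket path shown is the default used by the omnibus docs and may differ on your install.

    # Is gitlab-workhorse (GitLab's HTTP frontend) listening, and where?
    sudo netstat -plnt

    # If it listens on a unix socket, curl can talk to it directly:
    curl --unix-socket /var/opt/gitlab/gitlab-workhorse/socket http://localhost/

    # For the second possibility, look at nginx's own error log:
    sudo tail -n 50 /var/log/nginx/error.log

If the curl through the socket returns a GitLab page but nginx still serves a 502, the problem is on the nginx side (usually the upstream/proxy_pass target or socket permissions).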
