Installing nginx breaks HTTP requests for OpenCPU

I'm trying to install the cache server for OpenCPU (I need to enable caching) on an Ubuntu 16.04 EC2 instance. A dependency of opencpu-cache is the latest version of nginx (I can't install the cache server without it).
After I had already installed OpenCPU and verified that it was working, I installed nginx and then opencpu-cache. After installation, however, I can no longer make HTTP or HTTPS requests to the server. Entering both the public IP address and public DNS from the AWS console into my web browser fails to yield a landing page for the server, whereas it was working fine before I installed nginx.
My security rules on AWS are set up correctly (i.e. they allow the right ports for HTTP and HTTPS), all my packages on the server are up to date, and SSH sessions still work fine. I just can't figure out what the issue is.

Never mind, it turns out that sudo service opencpu-cache restart did the trick. Props to Jeroen above.
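For anyone else who hits this, a minimal sanity-check sequence (assuming the stock opencpu-cache setup, where nginx serves the API under /ocpu):

    # restart the caching proxy and make sure nginx itself came up
    sudo service opencpu-cache restart
    sudo service nginx status
    # any 2xx/3xx here means nginx answers locally, so a remaining
    # failure from the browser is at the network/security-group level
    curl -I http://localhost/ocpu/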

Related

Running Next.js in production on HTTPS

I cannot figure out how to make my Next.js project run on HTTPS.
I am using Next 12.3 and deploying on AWS EC2. I have my SSL certificates and all ports are open. What do I need to do to run it over HTTPS?
P.S.
All the answers I've found are about running on HTTPS during development.
Some people even claimed that it is not natively supported by Next. Is this true?
If you set up nginx, this becomes extremely easy.
You can handle the SSL part in nginx and run your Next.js server normally, and you will have a server running on HTTPS.
See Configuring HTTPS servers for setting up Nginx.
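As a rough sketch, the server block could look something like this (the domain, certificate paths, and the port-3000 upstream are assumptions; adjust them to your deployment):

    # write a reverse-proxy config that terminates TLS in nginx
    # (file name and cert paths are illustrative)
    sudo tee /etc/nginx/sites-available/nextjs.conf > /dev/null <<'EOF'
    server {
        listen 443 ssl;
        server_name example.com;
        ssl_certificate     /etc/ssl/certs/example.crt;
        ssl_certificate_key /etc/ssl/private/example.key;
        location / {
            proxy_pass http://127.0.0.1:3000;  # Next.js default port
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-Proto https;
        }
    }
    EOF
    sudo ln -s /etc/nginx/sites-available/nextjs.conf /etc/nginx/sites-enabled/
    sudo nginx -t && sudo systemctl reload nginx

Next.js keeps serving plain HTTP on port 3000; nginx does the HTTPS handshake and proxies requests to it.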

Using Apigility on a remote server

I have successfully installed Apigility on a remote CentOS server. It tells me to go to http://localhost:8888 to access the admin panel. This server does not have a GUI installed, so I can't remote in and use a web browser. Is there a workaround to access the Apigility interface remotely, possibly restricting access to my IP address? If not, do I have to install it on my local machine and then deploy my work to the remote server?
You could add a .htaccess file to set a password on it.
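A sketch of that approach, assuming Apache with AllowOverride AuthConfig enabled (the file paths here are illustrative):

    # create a credentials file, then a .htaccess guarding the admin docroot
    sudo htpasswd -c /etc/httpd/.apigility-htpasswd admin
    sudo tee /path/to/apigility/public/.htaccess > /dev/null <<'EOF'
    AuthType Basic
    AuthName "Apigility admin"
    AuthUserFile /etc/httpd/.apigility-htpasswd
    Require valid-user
    EOF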
If you're deploying to AWS, you should be able to configure your Security Group to only allow requests to your installation from your own IP address (see the sketch below).
If you want to develop your application right now, I would recommend a local installation in a Docker container or similar to make your changes. Once you go live, you shouldn't be changing anything through the admin interface anyway.
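For example, with the AWS CLI (the group ID and source IP are placeholders):

    # allow the Apigility admin port only from one /32 source address
    aws ec2 authorize-security-group-ingress \
        --group-id sg-0123456789abcdef0 \
        --protocol tcp --port 8888 \
        --cidr 203.0.113.5/32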
It's a CentOS server administered over the terminal, if I'm right. The best way to do this on a CentOS server from the terminal is to open port 8888 to the public and access the server from another system at serverIP:8888, using a tool like firewalld on the CentOS server (you will have to install firewalld): https://www.rootusers.com/how-to-open-a-port-in-centos-7-with-firewalld/
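Roughly, the firewalld steps look like this (assuming a stock CentOS 7 install):

    # install and start firewalld, then open the Apigility admin port
    sudo yum install -y firewalld
    sudo systemctl enable --now firewalld
    sudo firewall-cmd --permanent --add-port=8888/tcp  # persists across reboots
    sudo firewall-cmd --reload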

Odoo ERP not working on the Public IP cloud platform

I'm working on an Ubuntu Server 15.04 droplet on DigitalOcean. I've installed Odoo ERP from GitHub along with all the required libraries, packages, PostgreSQL, etc., and have followed all the steps to set it up on my DO IP. The current situation is that when I access it at http://188.166.125.13:8069/ I get an internal server error. As far as I can tell from the terminal, after executing ./openerp-server --addons-path=addons from inside the Odoo directory everything seems to be working fine, except that it appears to bind locally to 0.0.0.0 instead of my IP. Any clue how to launch the application on the public IP and not just locally?
One more question: is there a way to export the droplet as a backup so that I can reload it on any other server?
First, check your local IP configuration:
sudo vi /etc/network/interfaces
Next, check that your Apache server allows connections from outside. Before anything else, make sure the Apache default page is served to the outside world. Here is a guide for Laravel (the Apache section also works for Odoo ERP):
https://www.dev-metal.com/install-laravel-4-ubuntu-12-04-lts/
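You can also confirm where Odoo is actually listening. Note that 0.0.0.0 means "all interfaces", which is what you want for public access; it's only a problem if the server is bound to 127.0.0.1. A quick check (the xmlrpc_interface option is from OpenERP/Odoo 8; treat the config path as an assumption):

    # confirm the server is bound to 0.0.0.0:8069, not 127.0.0.1:8069
    sudo netstat -tlnp | grep 8069
    # if needed, set this in the config file passed with -c (or ~/.openerp_serverrc):
    #   xmlrpc_interface = 0.0.0.0
    # then restart and test the public address:
    curl -I http://188.166.125.13:8069/web/login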
As of the last time I checked, you can't back up a droplet from DigitalOcean and reload it onto another server.

Meteor Vagrant can curl localhost:3000 but Windows 8.1 cannot open localhost:3000

Good morning,
I'm working on installing Meteor on Windows using the following guide: https://gist.github.com/gabrielhpugliese/5855677
As pointed out in other posts it's a little dated, and I needed to install Meteor separately, for which I used this guide: Unable to install meteorite on Ubuntu VM
Currently, my setup can do the following:
files stay in sync between vagrant and windows
localhost:3000/ is working on the server
What I still need help completing:
when opening localhost:3000/ in my Windows browser, I get the "This webpage is not available" error
I know that the Vagrant VM is correctly serving the app because I opened a new instance of vagrant and curled localhost:3000/
I am actively working in Django and Node and can successfully run apps locally on :8000 and :8080. I tested the Meteor app on those ports but still couldn't connect. I also created a Windows Firewall port exception on 3000, but the results didn't change.
I know that there is a Windows preview currently out, but that is not working for me and I have an issue being tracked on GitHub.
Thank you in advance.
One thing that might be worth mentioning is that it is somewhat possible to use Meteor on Windows.
More details here: https://github.com/meteor/meteor/wiki/Preview-of-Meteor-on-Windows.
As for your Vagrant machine, it sounds like there is a problem with forwarding ports from your localhost machine to the VM's ports.
One possible simple way to get past this is to get your Ubuntu machine's IP address and simply load it up using http://<ip address>:3000.
I'm not sure why the port forwarding isn't working on your machine. If there was an issue, the reason is generally printed when you run vagrant up. A quick check is sketched below.
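A minimal sketch of both checks (the forwarded_port line is standard Vagrantfile syntax; hostname -I runs inside the VM):

    # inside the VM: find the address to try from the Windows browser
    hostname -I
    # on the host: the Vagrantfile should contain a line like
    #   config.vm.network "forwarded_port", guest: 3000, host: 3000
    # after editing it, reload and list the active forwardings
    vagrant reload
    vagrant port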

Is outgoing Azure traffic firewalled?

I have a Windows VM in Azure that I'm using for VS2015 experiments.
Google Drive is unable to contact update servers to finish its own installation (despite Chrome/Omaha working fine).
Apparently, I also can't clone git repos over ssh, even though HTTPS seems to be working.
Disabling the Windows Firewall does not seem to remedy these issues.
Suggestions?
You have to open endpoints on the VM. HTTP and HTTPS are open by default; if you need other ports, you need to identify them and open them as endpoints. My guess is that's what is happening here.
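If it helps, with the classic (ASM) Azure CLI of that era an endpoint could be added roughly like this (the VM name and ports are placeholders):

    # map a public port to the same port on the VM (classic deployment model)
    azure vm endpoint create MyVm 8080 8080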
