Google Cloud VM Instance with WordPress Repeatedly Crashes [closed]

I've got a newly deployed Google Cloud f1-micro VM Instance using the Google Click to Deploy WordPress solution. I've done some basic configuration such as a static IP, DNS setup, and a persistent disk, and I've also imported data from a previous WordPress install.
The issue is that about every 10-15 minutes the server crashes and becomes unresponsive. Error 500 when trying to view any page, and SSH becomes unreachable. The console shows the VM Instance is still running though. I have to Reset the VM Instance, and then everything is restored within a few seconds. CPU usage is almost always below 50%. The site currently has almost zero traffic, just me testing it.
Any idea what may be causing it to freeze/crash so often? I know the f1-micro is not a powerful server, but I've read a lot about running WordPress on it, and it seems like it should be able to handle a low-traffic website.

You can see the system log from the console in the web UI, or use the CLI:
gcloud compute --project={your project id} instances get-serial-port-output {vm name}
An f1-micro has only 0.2 vCPU and 0.60 GB of memory, which is not enough to run both WordPress and MySQL. I think it is running out of memory, and the system keeps killing your processes (MySQL or WordPress). I suggest you upgrade to a bigger VM.
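To confirm the out-of-memory theory before resizing, you could search the serial console output for OOM-killer messages, and add a small swap file on the VM as a stopgap. A minimal sketch (the placeholders are yours to fill in):
gcloud compute instances get-serial-port-output {vm name} --project={your project id} | grep -i "out of memory"
# Stopgap on the VM itself: a 1 GB swap file so the OOM killer fires less often
sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile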

Related

How to publish a website but prevent visitors? [closed]

I would like to publish my ASP.NET Core app to my own domain, which is hosted on shared hosting.
However, after publishing I would like to spend a few hours live-testing it (yeah, I have yet to learn automated testing using Selenium).
While doing so, I want to prevent visitors from knowing that the site is published so that they won't use it; some strangers know that I will be publishing some time soon, and they may have set up automated monitoring.
How can that be achieved?
The best recommendation would be to use a different subdomain, if your application isn't affected by the domain it is accessed on; this allows the older version to still be used by others. But if you don't mind others being completely unable to access the system while you are testing, just whitelist only your own IP on the domain.
You can set up a gateway for your website.
You can set an IP whitelist with your server hosting provider, or use a reverse proxy like Nginx to enforce an IP whitelist or HTTP auth before requests reach your website.
If the domain doesn't matter, you can use a subdomain as Jacob suggests.
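A minimal sketch of the Nginx whitelist idea (the domain, IP, and upstream port are placeholders; ASP.NET Core apps typically listen on a local Kestrel port):
server {
    listen 80;
    server_name example.com;              # placeholder domain
    allow 203.0.113.10;                   # your own public IP (placeholder)
    deny all;                             # everyone else gets 403
    location / {
        proxy_pass http://127.0.0.1:5000; # assumed local Kestrel port
        proxy_set_header Host $host;
    }
}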

Is R Shiny secure by doing so? [closed]

I want to create a Shiny app that I could share within my company. My company laptop doesn't allow Shinyapps.io, so I used my personal laptop to successfully run a Shiny test.
My question is: is there any potential security issue if I use my personal laptop to run the company data through Shiny and share the output with my coworkers?
If security is violated by doing so, what are the other options to make it accessible only to the company?
Anything hosted on Shiny Server will be available to anyone who is able to connect to your laptop. If you want to work around this, you can use NGINX and require people to authenticate on another page before they can gain access to Shiny, which you host locally and connect through a websocket. However, you're likely to get some security detail wrong (because it's incredibly easy to get security wrong), and people will gain access to either the raw data (which is extremely terrifying) or whatever visualizations you create in Shiny (which is still terrifying). If you just use runApp("my-app") and then screenshot visualizations from your computer to send around, you're fine; but if the data goes anywhere on the internet, I wouldn't consider it safe.
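A minimal sketch of that NGINX setup (the hostname is a placeholder; Shiny Server's default port is 3838, and the htpasswd file is one you create yourself):
server {
    listen 80;
    server_name shiny.example.com;              # placeholder hostname
    auth_basic "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;  # create with: htpasswd -c /etc/nginx/.htpasswd youruser
    location / {
        proxy_pass http://127.0.0.1:3838;       # assumed local Shiny Server port
        # Shiny needs the websocket upgrade headers forwarded
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}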
To piggyback off of #ConCave on the issue of data privacy: the main problem with sharing your Shiny apps over shinyapps.io is that the data will need to be hosted on those external servers. If your company happens to have an IT support team that can recreate/host the entire Shiny server on their own servers, you could port your apps onto their server.

AWS EC2 RStudio Server Error Occurred During Transmission [closed]

After over a month, I have managed to piece together how to set up an AWS EC2 server. It has been very hard to upload files, as there are very conservative size limits when using the upload button in RStudio Server. The error message when this is attempted is "Unexpected empty response from server".
I am not unique in this respect, e.g. Trouble Uploading Large Files to RStudio using Louis Aslett's AMI on EC2.
I have managed to use the following commands through PuTTY, and this has allowed me to upload files via either FileZilla or WinSCP.
sudo chown -R ubuntu /home/rstudio
sudo chmod -R 755 /home/rstudio
Once I use these commands and log out, I can no longer access RStudio on the instance in future logins. I can log back in to my instance via my browser, but I get the error message:
Error Occurred During Transmission
Everything is fine other than that, once I use PuTTY, I lose browser access to my instance.
I think this is because the command changes ownership or something similar. Should I be using a different command?
If I don't use these commands, I cannot connect between FileZilla/WinSCP and the instance.
If anyone is thinking of posting a comment that this should be closed as a hardware issue: I don't have a problem with hardware. I am interested in the correct commands.
Thank you :)
OK, so eventually I realised what was going on here. The default home directory size on AWS is only about 8-10 GB, regardless of the size of your instance. As this was trying to upload into home, there was not enough room. An experienced Linux user would not have fallen into this trap, but hopefully any other Windows users new to this who come across this problem will see this. If you upload onto a different drive on the instance, the problem is solved. As the Louis Aslett RStudio AMI keeps the home directory within this 8-10 GB space, you will have to set your working directory outside it; this is not intuitively apparent from the RStudio Server interface. Whilst this is an advanced forum and this is a rookie error, I am hoping no one deletes this question, as I spent months on this and I think someone else will too.
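To confirm that disk space really is the problem, a quick check over SSH (paths assume the default AMI layout):
df -h                      # how full the root volume is
sudo du -sh /home/rstudio  # how much of it the RStudio home directory uses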
Don't change the rights of /home/rstudio unless you know what you are doing; this may cause unexpected issues (and it actually does cause issues in your case). Instead, copy the files with FileZilla or WinSCP to a temporary location (say, /tmp), then SSH to your instance with PuTTY and move the file to the RStudio directory with sudo (e.g. sudo mv /tmp/myfile /home/rstudio).
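Spelled out, a minimal sketch of that workflow (the key, host, and file names are placeholders; scp stands in for FileZilla/WinSCP):
scp -i mykey.pem bigfile.csv ubuntu@{ec2-host}:/tmp/    # upload to /tmp, which the ubuntu user can write to
ssh -i mykey.pem ubuntu@{ec2-host}
sudo mv /tmp/bigfile.csv /home/rstudio/
sudo chown rstudio:rstudio /home/rstudio/bigfile.csv    # hand ownership to the rstudio user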

Using Amazon EC2 or Linode for hosting a Debian server with R regularly extracting information from the web [closed]

I have hacked my NAS-01g into a Debian server and use it to regularly download stock quotes and earthquake information from the web. I was on a trip last week and turned off my server at home, but when I came back, I could no longer gain access to it. Since reconfiguring the server is very time-consuming, I am thinking of migrating my existing server to the cloud.
I have a few requirements here:
server on 24/7
use cron to call R hourly to extract data from somewhere, say Yahoo Finance (see the sketch after this list)
(optional) back up and encrypt my Gmail account
(optional) host a Django server (I am learning to use it now)
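For the cron requirement, a minimal sketch (the script and log paths are placeholders), added via crontab -e:
# run at minute 0 of every hour; check the Rscript path with: which Rscript
0 * * * * /usr/bin/Rscript /home/user/fetch_quotes.R >> /home/user/fetch_quotes.log 2>&1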
I am thinking of using Amazon EC2 or Linode. I have tried Amazon EC2 a bit, but the pricing scheme seems very complicated to me, and I want the server to be as cheap as possible, as this is not really mission-critical work. I wonder if Linode is simpler for a non-sysadmin like me.
Hope my question won't be considered off-topic here.
Thanks in advance.
This belongs on Server Fault, but I think Linode is a good candidate. They have monthly plans that don't cost a fortune, and you can get a Debian install on it (we just set one up for work and selected Debian, so I'm 100% sure).
Amazon's pricing scheme is more complicated, but also more transparent.
They bill separately for machine hours (EC2 hours), data transfer in/out, and storage.
You can also add Elastic IPs (static IPs) and Load Balancers (LBs).
And you can choose between reserved instances (1-year/3-year term) and on-demand instances (more expensive, but you can terminate instances whenever you want).
They have their own monthly calculator; you can check it here: http://calculator.s3.amazonaws.com/calc5.html

Is it possible to get notified by e-mail when a site or app pool is down (using IIS7)? [closed]

I am trying to find out if I can get notified when a site is down, or when a service running under WAS is no longer running.
I don't want to code a monitoring tool; I am sure there must be something out there...
I'm using this:
http://tools.pingdom.com/
Besides the cool online tools, they offer a subscription for monitoring your site.
I found a couple more, though I haven't used them. These seem totally free, while the Pingdom tools are only free for one site.
http://www.uptimerobot.com
http://ezinedesigner.com
I also had this need, so I created an open source app called Pinger. You can monitor unlimited URLs at intervals of your choosing. The docs have instructions for getting it running on Heroku quickly:
https://github.com/austinthecoder/pinger
I personally use Content Site Monitor. It has a really simple and cool web interface that allows you to view your site's uptime statistics on a desktop or mobile screen. It's easy to configure your monitoring parameters as well.
It doesn't just ping your server to make sure it's alive; it also allows you to specify certain content/keywords that you want to monitor. It will send you an alert email if the content/keywords are missing from your site or if your site goes down.
Best of all, it's free to monitor up to 3 sites!
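If you'd rather not depend on a hosted service, a minimal DIY fallback (run from any Unix box with cron and a working mail command, not the IIS host itself; the URL and address are placeholders) is a cron entry that curls the site every 5 minutes and mails you on failure:
*/5 * * * * curl -fsS --max-time 10 https://example.com/ > /dev/null || echo "example.com is down" | mail -s "Site down" admin@example.com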
