What do you use to monitor the uptime/performance of your websites, specifically those based on a PHP/MySQL platform like WordPress?
I'm looking for something that alerts me if the site is down or performing too slowly, and has some useful (not voluminous!) charts showing me any potential problems and what to do about them.
Thanks!
Along with the usual Nagios, we use Pingdom. It comes with lots of default checks.
For example, it reports how fast your website responds, and since they run tests from different locations you get a nice graph of how accessible your website was. To put the numbers in context, add a reference check (e.g. Google) and see how you compare.
Aside from HTTP, you can also check other services (mail, database, etc.). If they are not reachable from the outside, you can always create a script that outputs a standard "OK", have Pingdom check that URL, and get an alert if the output changes.
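A minimal sketch of such a check script in Python, assuming a local MySQL server and the third-party PyMySQL package (pip install pymysql); the credentials are placeholders. Point Pingdom at this URL and alert when the body stops saying "OK":

```python
# Minimal health-check endpoint: Pingdom polls this URL and alerts
# when the response body changes. Assumes a local MySQL server and
# the PyMySQL package; credentials below are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

import pymysql


def services_ok():
    """Return True if the internal services we care about respond."""
    try:
        conn = pymysql.connect(host="127.0.0.1", user="monitor",
                               password="secret", database="wordpress",
                               connect_timeout=3)
        conn.ping()
        conn.close()
        return True
    except pymysql.MySQLError:
        return False


class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"OK" if services_ok() else b"FAIL"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```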
I should add that Pingdom is not a free service, but we've been using them for 10 months now and they haven't given us any trouble. :)
Try looking at Zabbix: http://www.zabbix.com/
Web performance monitoring
Web availability monitoring
Support for POST and GET methods
Try out Insping as well; it offers:
performance monitoring
availability monitoring
e-mail and SMS alerts
I have a project in which I have implemented hundreds of Scrapy spiders.
Now I have run into the following problems:
from time to time websites change their DOM/API, so a spider stops working or not all the info is collected
websites become unavailable or move to another domain, so the spider stops working.
Since there are a lot of spiders, it is not easy to monitor the status of each one.
Is there any framework that provides the ability to monitor Scrapy spiders?
status of running spider(s)
showing when spider(s) stop working, etc.
I have looked into Scrapinghub/Zyte, but I'm not sure it is a good fit for our purpose because we need something that can run locally.
Scrapydweb is basically just like Scrapinghub, but you run it locally.
It can alert you when spiders fail via email/Slack, if I remember correctly. It's a bit less user-friendly than Scrapinghub since you have to manage servers and so on, but overall I think it was a good platform when I used it.
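If you only need basic "spider stopped working" alerts and want to stay fully local, Scrapy's signals API is enough for a small extension. A minimal sketch; MONITOR_MIN_ITEMS is a made-up setting name and notify() is a placeholder for your own email/Slack call:

```python
# Minimal Scrapy extension: flags spiders that finish abnormally or
# scrape suspiciously few items (a common symptom of DOM changes).
from scrapy import signals


class SpiderHealthMonitor:
    def __init__(self, crawler, min_items=1):
        self.crawler = crawler
        self.min_items = min_items
        crawler.signals.connect(self.spider_closed,
                                signal=signals.spider_closed)

    @classmethod
    def from_crawler(cls, crawler):
        # MONITOR_MIN_ITEMS is a hypothetical setting name.
        return cls(crawler, crawler.settings.getint("MONITOR_MIN_ITEMS", 1))

    def spider_closed(self, spider, reason):
        scraped = self.crawler.stats.get_value("item_scraped_count", 0)
        if reason != "finished":
            self.notify(f"{spider.name} stopped abnormally: {reason}")
        elif scraped < self.min_items:
            self.notify(f"{spider.name} finished but scraped only {scraped} items")

    def notify(self, message):
        # Replace with smtplib / a Slack webhook / whatever you use.
        print("ALERT:", message)
```

Enable it via the EXTENSIONS setting in settings.py (the myproject.extensions module path below is hypothetical): EXTENSIONS = {"myproject.extensions.SpiderHealthMonitor": 500}.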
Airflow is a very handy platform for scheduling and monitoring.
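For example, a DAG can kick off a crawl on a schedule and lean on Airflow's built-in retries and failure emails. A minimal sketch assuming Airflow 2.x, a configured SMTP connection, and that scrapy crawl works on the worker; the paths, spider name, and address are placeholders:

```python
# Minimal Airflow 2.x DAG: run a spider daily and email on failure.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="scrapy_monitoring",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={
        "email": ["alerts@example.com"],  # placeholder address
        "email_on_failure": True,
        "retries": 1,
        "retry_delay": timedelta(minutes=10),
    },
) as dag:
    crawl = BashOperator(
        task_id="crawl_my_spider",
        # A non-zero exit code marks the task failed and triggers the email.
        bash_command="cd /opt/myproject && scrapy crawl my_spider",
    )
```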
We have subscribed to a CDN (content delivery network) for our website.
Is there any way (e.g. an online tool) to see the available server locations?
Is there any way to check whether it is working correctly?
I used http://web-sniffer.net/
Look at the Server response header. Previously mine was Apache/2.2.22; after setting up Cloudflare it was still Apache/2.2.22, but using web-sniffer it showed up as cloudflare-nginx.
Previously I'd been looking at response headers in Chrome dev tools, but web-sniffer gave a different Server response header?!
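The same check is easy to script, e.g. with Python's requests package. A minimal sketch; the URL is a placeholder, the header names are ones Cloudflare commonly sets, and other CDNs use their own:

```python
# Quick CDN check: print the response headers a CDN typically injects.
# Assumes the third-party requests package (pip install requests).
import requests

resp = requests.get("https://example.com/", timeout=10)  # placeholder URL

# Cloudflare typically rewrites Server and adds cf-* headers;
# other CDNs use their own (e.g. x-cache, via).
for name in ("server", "cf-ray", "cf-cache-status", "x-cache", "via"):
    if name in resp.headers:  # lookups are case-insensitive
        print(f"{name}: {resp.headers[name]}")
```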
You can use a tool like https://is-it.online/ to measure page/resource load times from various places on the globe. If the CDN setup is really working, it should consistently show similar response times from all locations; if not, the time will be lower at locations closer to where the site is physically hosted and will increase the farther away you go.
The full guide is here: https://medium.com/@abhijeeta/how-to-check-if-cdn-is-working-for-my-website-9fa96cd6ba32
If you want to check whether the CDN has improved page loading speed for your visitors worldwide, you can use performance testing tools like Pingdom or GTmetrix and select different locations to test the load time. However, for each location you will need to run the test twice, since during the first run the CDN is expected to take longer than usual because it has to fetch the files for that geography before it can serve them from cache.
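You can see that warm-up effect from a single location with a rough Python sketch using the requests package; the URL is a placeholder, and the cache header name varies by CDN:

```python
# Rough cold-vs-warm comparison: the first request may be a cache MISS,
# the second should be faster and a HIT if the CDN is caching.
import time

import requests

url = "https://example.com/static/logo.png"  # placeholder URL

for attempt in ("cold", "warm"):
    start = time.perf_counter()
    resp = requests.get(url, timeout=10)
    elapsed = time.perf_counter() - start
    # Header name varies by CDN: cf-cache-status (Cloudflare), x-cache, etc.
    cache = resp.headers.get("cf-cache-status") or resp.headers.get("x-cache", "n/a")
    print(f"{attempt}: {elapsed:.3f}s, cache={cache}")
```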
If you want to check the number of locations a CDN has, you can get an idea with tools like CDN Latency Benchmark. This gives you the CDN's latency worldwide, plus the list of IPs the CDN resolves to in different countries.
I am looking for a third-party service or tool that can trigger or hit a web page at scheduled times. Does anyone know of any? I currently use the Windows Task Scheduler to hit these pages, but there are gaps in this approach since I don't run my computer 24/7.
There are lots of free website monitoring services out there that check your site's availability by testing whether a given URL responds properly. One that I use is http://mon.itor.us/. Give it the URL of your page and that will do the trick.
One other alternative is Pingdom. They offer a free account for monitoring one website. You can set Pingdom to send an email or SMS if your site goes down, and you can configure the service to hit your page, for example, once every 5 minutes. You can set the check resolution quite freely.
Some other alternatives are MonitorUs (which RichieHindle already mentioned), SiteUptime and HyperSpin.
I created an open source app called Pinger. You can monitor unlimited URLs at intervals of your choosing. The docs have instructions for getting it running on Heroku quickly:
https://github.com/austinthecoder/pinger
The only problem is that, currently, you can only set an interval, not a specific time.
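If you do need specific times rather than intervals, even a small standard-library Python script covers it. A minimal sketch with placeholder URLs and times; it still has to run on an always-on machine, which is the gap the hosted services fill:

```python
# Minimal stand-in for Task Scheduler: hit a list of URLs at specific
# clock times every day, using only the standard library.
import datetime
import time
import urllib.request

# Times (HH:MM, 24-hour) mapped to the URLs to hit; placeholders.
SCHEDULE = {
    "03:00": "https://example.com/cron.php",
    "15:30": "https://example.com/digest.php",
}

def hit(url):
    """Request the URL once and log the outcome."""
    try:
        with urllib.request.urlopen(url, timeout=30) as resp:
            print(datetime.datetime.now(), url, resp.status)
    except OSError as exc:  # URLError/HTTPError are OSError subclasses
        print(datetime.datetime.now(), url, "FAILED:", exc)

last_fired = None
while True:
    now = datetime.datetime.now()
    hhmm = now.strftime("%H:%M")
    stamp = now.strftime("%Y-%m-%d %H:%M")  # dedupe within the minute
    if hhmm in SCHEDULE and stamp != last_fired:
        hit(SCHEDULE[hhmm])
        last_fired = stamp
    time.sleep(20)  # check the clock a few times per minute
```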
I am trying to find out if I can get notified when a site is down, or when a service running under WAS is no longer running.
I don't want to write a monitoring tool myself; I'm sure there must be something out there...
I'm using this:
http://tools.pingdom.com/
Besides the cool online tools, they offer a subscription for monitoring your site.
I found a couple more, though I haven't used them. These seem totally free, while Pingdom is only free for one site.
http://www.uptimerobot.com
http://ezinedesigner.com
I also had this need, so I created an open source app called Pinger. You can monitor unlimited URLs at intervals of your choosing. The docs have instructions for getting it running on Heroku quickly:
https://github.com/austinthecoder/pinger
I personally use Content Site Monitor. It has a really simple and cool web interface that lets you view your site's uptime statistics on a desktop or mobile screen. It's easy to configure your monitoring parameters as well.
It doesn't just ping your server to make sure it's alive: it also lets you specify certain content/keywords that you want to monitor, and it will send you an alert email if the content/keywords are missing from your site or if your site goes down.
Best of all, it’s free to monitor up to 3 sites!
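That kind of keyword check is also easy to replicate yourself from cron. A minimal standard-library Python sketch; the URL, keyword, and SMTP details are placeholders:

```python
# Minimal content/keyword monitor: fetch the page, check that an
# expected phrase is present, and email an alert if it is missing
# or the site is down.
import smtplib
import urllib.request
from email.message import EmailMessage

URL = "https://example.com/"            # placeholder URL
KEYWORD = "Welcome to my site"          # placeholder keyword

def send_alert(subject):
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "monitor@example.com"  # placeholder addresses
    msg["To"] = "you@example.com"
    msg.set_content(f"Check {URL}")
    with smtplib.SMTP("localhost") as smtp:  # placeholder SMTP host
        smtp.send_message(msg)

try:
    with urllib.request.urlopen(URL, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    if KEYWORD not in html:
        send_alert(f"Keyword missing from {URL}")
except OSError:
    send_alert(f"{URL} is down")
```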
What tools would you recommend for analyzing VPS performance? Specifically, I run several WordPress blogs on a VPS and would like to find some tools to help me find issues worth looking into (so I can address them).
I use New Relic to get data on where to focus attention in our Rails applications, and I find it very useful.
I have a feeling there might be issues with memory use. I would love to see memory use over time, so I can tell what my usage actually looks like. I am considering upgrading the VPS, and this would be one useful piece of data.
"top" is always a good place to start in the terminal.
but http://scoutapp.com/ is a really good monitoring app, and easy to implement on vps servers
Try Munin; it's a great monitoring app with a web interface: http://munin-monitoring.org/
cloudstats.me works free for up to three VPS servers.
New Relic has a free server monitoring service that will show you CPU, memory, network I/O, disk I/O, and disk capacity, as well as a process list a la top.
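If all you want first is memory use over time, a few lines of Python can log it for graphing later. A sketch assuming the third-party psutil package (pip install psutil); run it in the background or from cron on the VPS:

```python
# Sample system memory use to a CSV so you can graph it over time.
import csv
import time

import psutil

with open("memory_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        mem = psutil.virtual_memory()
        writer.writerow([int(time.time()), mem.used, mem.available, mem.percent])
        f.flush()
        time.sleep(60)  # one sample per minute
```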