Is there a simple app for pinging a list of websites? [closed]

Basically, I just need a simple app that frequently pings external IP addresses and web addresses to make sure the sites are up. Does anyone know of a good one?
I started to make one myself, but wanted to know whether someone else has already done the work.
It just needs to track multiple external addresses, along with the status codes returned, at potentially different intervals.
I did see the post "How do you monitor the availability of multiple websites", but it seems a little like overkill for what I need. I need a KISS app! Thanks!

Ok, second attempt. What about Website Monitor (seen in this list: Monitor and Check Web Site or Server Uptime and Availability for Free)? Your dog should be able to use it.

I'm not sure if this fits your needs, but http://aremysitesup.com/ may be a simple way to go. The free version supports up to five sites.

This can be done with Cacti, which is a great app. See:
Http Response Time monitoring and Alerting on the Cacti forums
How do you Monitor a https website and graph uptime/latency? on the Cacti forums
Cacti: Using Cacti to monitor web page loading blog post series
Use Cacti to Monitor HTTP Status Codes of Request Responses? here on SO

Unless you are the network admin of those sites, it is a colossal waste of resources, what I call "ping-then-do".

Use the command prompt if you are on a Windows system. Type:
ping (website host name)
and press Enter. It will ping the website and report the time the website took to respond as well as the TTL.
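If you do end up rolling your own instead, a minimal sketch of the kind of KISS checker the question describes might look like the Python below. The site list, the 60-second interval, and the use of the requests library are placeholders/assumptions, not anything from the question itself.

    # Minimal availability checker: fetch each URL on a fixed interval and
    # log the HTTP status code (or the error if the site is unreachable).
    # The URLs and the 60-second interval below are placeholders.
    import time
    import requests  # assumed available; any HTTP client would do

    SITES = [
        "https://example.com",   # placeholder web address
        "http://203.0.113.10",   # an external IP can be checked the same way
    ]
    INTERVAL_SECONDS = 60

    def check(url):
        try:
            response = requests.get(url, timeout=10)
            print(f"{time.strftime('%H:%M:%S')}  {url}  {response.status_code}")
        except requests.RequestException as exc:
            print(f"{time.strftime('%H:%M:%S')}  {url}  DOWN ({exc})")

    if __name__ == "__main__":
        while True:
            for site in SITES:
                check(site)
            time.sleep(INTERVAL_SECONDS)

Different intervals per site, alerting, and history would go on top of this loop; the point is that the core of what the question asks for fits in a couple of dozen lines.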

Related

How to set a mandatory homepage for an open Wifi Network? [closed]

I want to add a start page (an advertising page, for example) to an open Wi-Fi network, shown before users get access to any site or application.
Any solutions? Thank you.
You probably want to look at routers that support 'captive portals' out of the box or routers that can be flashed with 'DD-WRT' firmware, of which there are quite a few around. You could look here for more info.
In terms of showing a specific page, the router needs to track the MAC address or the DHCP-assigned IP address and remember if the user has 'logged in'. A login in your case might be just clicking a button on your advertising page.
As an aside, mobile devices typically probe a specific URL such as www.google.com/generate_204, or look for a specific response when they request something with User-agent: CaptiveNetworkSupport (iOS). When they don't get the response they expect (because your router responds with a different page or blocks access), they correctly assume they are behind a captive portal and simply show the user the page the router returned, which will be your advertising page.
Edit: You could also try NoCatSplash in the DD-WRT firmware, which looks like it would do what you want.
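For what it's worth, you can see that detection mechanism in action by probing the same URL yourself. Below is a rough Python sketch; the generate_204 URL and the expected 204 status come from the behaviour described above, while the use of the requests library and the printed messages are my own assumptions.

    # Probe Google's connectivity-check URL the way Android does: an open
    # network returns HTTP 204 with an empty body; a captive portal usually
    # intercepts the request and returns its own page (200/302) instead.
    import requests  # assumed available

    CHECK_URL = "http://www.google.com/generate_204"

    response = requests.get(CHECK_URL, timeout=5, allow_redirects=False)
    if response.status_code == 204:
        print("Direct internet access: no captive portal in the way.")
    else:
        print(f"Got HTTP {response.status_code}: a captive portal (e.g. your "
              "advertising page) is probably intercepting traffic.")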

Raspberry Pi server w/out port forwarding [closed]

I would like to remote into my Pi from outside my home network. The problem is that my apartment provides me with wireless internet and I can't access the router to enable port-forwarding. Is there any way around this? A dynamic dns service perhaps?
I would like to use VNC, SSH, and/or FTP.
I usually use TeamViewer to remote into a station behind NAT without enabling port forwarding, as in your situation.
You need to create an account in your TeamViewer application and register your target station's TeamViewer with that account.
When you are away, make sure the target station's TeamViewer is always running; you can then reach the station by first logging in to your TeamViewer account. Once logged in, you will see a list of the target stations you have registered. Simply double-click one of them to start the remote session.
VNC or SSH will not work behind NAT without port forwarding, because incoming connections stop at the router instead of reaching your target station.
I was hoping to install TeamViewer on the Raspberry Pi as well, so I sent an email to the TeamViewer team. This was the response:
Hi Drano
Thank you very much for your message.
Teamviewer does not support ARM architecture. I will forward your suggestion to our product management. Such ideas are always welcome, although I can not promise when or if this Feature will be implemented, as the decision is based on public demand. Nevertheless, your feedback is very important to us as we want to continue to develop TeamViewer based on our user's needs and demands. We will be happy to inform you about realization of this feature.
If you have any further questions on our product, please feel free to contact us.
Best regards,
Harun Rashid
-Support Technician-
P.S.: TeamViewer 9 is ready!

Is there a way I can replicate http traffic to a redundant server? [closed]

I will be moving a high-load production system over to new hardware over the next few weeks. In the meantime, however, I would like to validate that the new hardware will handle the expected load. I would really like to stick some kind of 'proxy' in front of the current web server and copy all of its HTTP traffic to the new environment, i.e. run them both in parallel.
Ideally this proxy would also validate that the responses are the same.
I can then monitor the new hardware's stats (CPU, memory, etc.) and see if it looks OK.
What is this kind of proxy called? Anyone have any suggestions? This is for a Windows .NET (ASP.NET) and SQL Server environment.
Thanks all
Varnish comes to mind - https://www.varnish-cache.org/
Edit: I'd actually use nginx (with two years of experience since answering this question); Varnish would be silly to use here. nginx would definitely be the better option.
Have a look at JMeter. It's Java-based but allows you to record user journeys and play them back in bulk for stress testing.
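To make the idea of such a mirroring proxy concrete, here is a toy Python sketch: it forwards each GET to the current production server, returns that response to the client, then replays the request against the new environment and reports mismatches. The backend URLs, the port, the GET-only handling, and the use of the requests library are all assumptions; this is a sketch of the concept, not something to put in front of a real high-load system.

    # Toy mirroring proxy: answer clients from the production backend,
    # replay each request against the new environment, compare responses.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import requests  # assumed available

    PRIMARY = "http://old-server.example.com"  # current production (placeholder)
    MIRROR = "http://new-server.example.com"   # new hardware under test (placeholder)

    class MirrorProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            primary = requests.get(PRIMARY + self.path, timeout=30)

            # Answer the real client from the production response.
            self.send_response(primary.status_code)
            self.send_header("Content-Type",
                             primary.headers.get("Content-Type", "text/html"))
            self.end_headers()
            self.wfile.write(primary.content)

            # Replay against the new environment and compare.
            try:
                mirror = requests.get(MIRROR + self.path, timeout=30)
                if (mirror.status_code, mirror.content) != (primary.status_code,
                                                            primary.content):
                    print(f"MISMATCH {self.path}: "
                          f"{primary.status_code} vs {mirror.status_code}")
            except requests.RequestException as exc:
                print(f"MIRROR ERROR {self.path}: {exc}")

    if __name__ == "__main__":
        HTTPServer(("", 8080), MirrorProxy).serve_forever()

A production-grade version of the same idea is what the nginx and JMeter suggestions above give you without writing code.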

Writing a cache-everything/quick-response HTTP proxy [closed]

Are there any open source HTTP caching proxies I can use to give myself a good starting point?
I want to write a personal HTTP caching proxy to achieve the following purposes
Serve content instantly even if the remote site is slow
Serve content even if the network is down
Allow me to read old content if I'd like to
Why do I want to do this?
The speed of Internet connection in my area is far from spectacular.
I want to cache content even if the HTTP headers tell me not to
I really don't like it when I can't quickly access content that I've read in the past.
I feel powerless when a website removes useful content and I find no way to get it back
The project comprises
A proxy running on the local network (or perhaps on localhost), and
A browser plugin or a desktop program to show content-updated notifications
What's special about the proxy?
The browser initiates an HTTP request
The proxy serves the content first, if it's already in the cache
Then the proxy contacts the remote website and checks whether the content has been updated
If the content has been updated, the proxy sends a notification to the desktop/browser (e.g. to show a little popup or change the color of a plug-in icon) and downloads the content in the background.
Every time the proxy downloads new content, it saves it into the cache
Let me choose to load the updated content or not (if not, stop downloading the new content; if yes, stream the new content to me)
Let me assign rules to always/never load fresh content from certain websites
Automatically set the rules if the proxy finds that (1) I always want to load fresh content from a certain website, or (2) the website's content frequently updates
Note:
Caching everything does not pose a security problem, as I'm the only one with physical access to the proxy, and the proxy is only serving me (from the local network)
I think this is technologically feasible (let me know if you see any architectural problems)
I haven't decided whether I should keep old versions of the webpages. But given that my everyday bandwidth usage is just 1-2 GB, a cheap 1TB hard drive can easily hold two years of data!
Does my plan make sense? Any suggestions/objections/recommendations?
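To show the serve-first-then-revalidate flow described above in miniature, here is a rough Python sketch. The in-memory cache, the notify() and serve_to_browser() stubs, and the use of the requests library are my own assumptions; a real proxy would sit on the HTTP connection itself, persist the cache to disk, and revalidate concurrently rather than sequentially.

    # Sketch of the core behaviour: return the cached copy immediately,
    # then contact the remote site and flag the content as updated if it
    # differs from what was served. Persistence, streaming, old-version
    # history and the browser-plugin notification channel are left out.
    import hashlib
    import requests  # assumed available

    cache = {}  # url -> (body, digest); a real proxy would keep this on disk

    def notify(url):
        # Placeholder for the desktop/browser notification described above.
        print(f"content updated: {url}")

    def serve_to_browser(body):
        # Placeholder: hand the bytes back over the proxied connection.
        print(f"served {len(body)} bytes")

    def fetch(url):
        # 1. Serve instantly from the cache if we have the page at all,
        #    ignoring any Cache-Control headers, as the question requires.
        cached = cache.get(url)
        if cached is not None:
            serve_to_browser(cached[0])

        # 2. Revalidate against the remote site (in the background, ideally).
        try:
            body = requests.get(url, timeout=15).content
        except requests.RequestException:
            return  # network down: the cached copy is all we can offer
        digest = hashlib.sha256(body).hexdigest()
        if cached is None:
            serve_to_browser(body)
        elif digest != cached[1]:
            notify(url)  # let the user decide whether to load the new version
        cache[url] = (body, digest)

    if __name__ == "__main__":
        fetch("http://example.com")  # first call fetches and caches
        fetch("http://example.com")  # second call serves from cache, then revalidates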
Take a look at polipo:
http://www.pps.univ-paris-diderot.fr/~jch/software/polipo/
Source is here:
https://github.com/jech/polipo
It is a caching web proxy implemented in C. It should definitely help you.

Creating a P2P / Decentralized file sharing network [closed]

I was wondering where I could learn more about decentralized sharing and P2P networks. Ideally, I'd like to create something to help students share files with one another over their university's network, so they could share without fear of outside entities.
I'm not trying to build the next Napster here, just wondering if this idea is feasible. Are there any open source P2P networks out there that could be tweaked to do what I want?
Basically you need a server (well, you don't NEED a server, but it would make things much simpler) that stores user IPs, among other things such as file hash lists.
That server can be in any environment you want (which is very comfortable).
Then, each client connects to the server (it should have a DNS name; it can be a free one, I've used no-ip.com once) and first sends basic information (such as its IP and a file hash list), then sends something every now and then (say every 5 minutes or less) to report that it's still reachable.
When a client searches for files/users, it just asks the server.
This is a centralized network, but the file sharing would be done in p2p client-to-client connections.
The reason to do it like this is that you can't know an IP to connect to without some reference.
Just to clear this server thing up:
- Torrents use trackers.
- eMule's ED2K uses lugdunum servers.
- eMule's "true p2p" Kademlia uses known nodes (clients) (most of the time taken from servers like this).
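As a rough illustration of the register/heartbeat/search design described above, here is a minimal in-memory Python sketch. The Tracker class, the method names, and the 5-minute liveness window are my own inventions drawn from that description; a real tracker would expose these operations over HTTP or raw sockets, and the actual file transfer would still happen client-to-client.

    # Minimal in-memory "tracker": clients register their IP plus a list of
    # file hashes, ping periodically to stay listed, and peers are looked up
    # by file hash. Everything network-related is left to the reader.
    import time

    LIVENESS_WINDOW = 5 * 60  # seconds: matches the "every 5 minutes" report

    class Tracker:
        def __init__(self):
            self.peers = {}  # ip -> {"hashes": set of hashes, "last_seen": timestamp}

        def register(self, ip, file_hashes):
            self.peers[ip] = {"hashes": set(file_hashes), "last_seen": time.time()}

        def heartbeat(self, ip):
            if ip in self.peers:
                self.peers[ip]["last_seen"] = time.time()

        def search(self, file_hash):
            # Return only peers that have reported in recently and hold the file.
            now = time.time()
            return [ip for ip, info in self.peers.items()
                    if file_hash in info["hashes"]
                    and now - info["last_seen"] <= LIVENESS_WINDOW]

    # Example: one peer registers two files and another client searches for one.
    tracker = Tracker()
    tracker.register("10.0.0.5", ["abc123", "def456"])
    print(tracker.search("abc123"))  # -> ['10.0.0.5']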
Tribler is what you are looking for!
It's a fully decentralized BitTorrent client from the Delft University of Technology. It's open source and written in Python, so it's also a great starting point to learn from.
Use DC++
What is wrong with Bit-Torrent?
Edit: There is also a pre-built P2P network on Microsoft operating systems that is pretty cool as the basis to build something. http://technet.microsoft.com/en-us/network/bb545868.aspx
