Find out the volume of data sent through my ASP site - asp-classic

I'd like to be able to see which web pages I'm serving on my Classic ASP site and how much data is sent out, in preparation for enabling GZip compression on the server. Running Windows Server 2003.
Is there a tool/utility/script to watch or log traffic and tell me the bytes going out?

Diodeus is right in saying that you need a web log analyzer.
My current webhost uses SmarterStats, which has a large range of customisable reports available and is very good for looking at things like traffic volume, as it visualises it all in the browser for you.
If you are running your own server then you can get a free edition which can be used with just one website - http://www.smartertools.com/smarterstats/free-web-analytics-seo-software.aspx

You need a log analyzer for IIS. Webtrends used to be quite popular. I used it a dog's life ago. Most use Google Analytics these days, but it's a different beast and tracks traffic, not data transfer volume. You really need to look at the server logs for that.
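If you just want the raw numbers without buying an analyzer, the IIS W3C logs are easy to parse yourself. Below is a minimal Python sketch, assuming the sc-bytes field is enabled in the site's logging properties; the log path and filename are hypothetical examples for Windows Server 2003, so adjust them to your server.

    # Sum bytes sent per URL from an IIS W3C extended log.
    # Assumes sc-bytes logging is enabled; the path below is a placeholder.
    from collections import defaultdict

    LOG_FILE = r"C:\WINDOWS\system32\LogFiles\W3SVC1\ex090101.log"

    bytes_per_url = defaultdict(int)
    fields = []

    with open(LOG_FILE) as log:
        for line in log:
            line = line.strip()
            if line.startswith("#Fields:"):
                fields = line.split()[1:]        # column names for data rows
            elif line and not line.startswith("#"):
                row = dict(zip(fields, line.split()))
                # cs-uri-stem is the requested page, sc-bytes the bytes sent
                if "cs-uri-stem" in row and row.get("sc-bytes", "").isdigit():
                    bytes_per_url[row["cs-uri-stem"]] += int(row["sc-bytes"])

    # Print the 20 heaviest pages - prime candidates for GZip.
    for url, total in sorted(bytes_per_url.items(), key=lambda kv: -kv[1])[:20]:
        print(f"{total:>12,} bytes  {url}")

Comparing these totals before and after enabling GZip will show you exactly how much the compression saves.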

Related

How to count incoming requests on IIS?

I need to compute the rate of requests (requests/second) arriving at an IIS web server.
I am pretty sure that IIS maintains this information internally.
I have spent a reasonable amount of time trying to find a way to configure IIS so that it writes the rate to its log file.
Guess what? I was defeated.
So I have decided to get some help.
Does anyone know how to make IIS expose the incoming request rate?
Since there isn't an accepted answer, here is how to count this information natively using IIS:
Open IIS and, on your home server, access the "Logging" option. By default, IIS will log every HTTP request that it receives, along with the time and some data about its origin.
Windows Performance Counters have this info too.
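If you'd rather compute the rate from the logs than read counters, a short script will do, since each log line carries a timestamp. A sketch in Python, assuming the default W3C field order (date and time first) and a hypothetical log path:

    # Requests per second from an IIS W3C log.
    from collections import Counter

    LOG_FILE = r"C:\WINDOWS\system32\LogFiles\W3SVC1\ex090101.log"  # placeholder

    per_second = Counter()
    with open(LOG_FILE) as log:
        for line in log:
            if line.strip() and not line.startswith("#"):
                date, time = line.split()[:2]   # default W3C order: date time ...
                per_second[f"{date} {time}"] += 1

    if per_second:
        peak_ts, peak = per_second.most_common(1)[0]
        avg = sum(per_second.values()) / len(per_second)
        print(f"average {avg:.1f} req/s, peak {peak} req/s at {peak_ts}")

Note the average here is over seconds that saw at least one request; divide by the full time window instead if you want the true overall rate.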

Load testing services to load CDN

Is there a service that can be used to load test a website behind a CDN? We want to ensure our website can still run without problems even under high-volume traffic, for example a DDoS.
I suppose the service should be able to generate a huge number of concurrent connections and a large amount of bandwidth.
If there are any reference sites or reports, please point me to them.
Thanks all.
Check out http://loadimpact.com/ and https://www.blitz.io/
Or you can always use tools like siege and ab.
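If you only need a small-scale smoke test rather than DDoS volumes, even a short script will do. A toy sketch in Python (the target URL, worker count, and request count are placeholders; only run this against a site you own):

    # Tiny ab/siege-style load generator: N workers hammering one URL.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://example.com/"   # placeholder target
    WORKERS = 20
    REQUESTS = 200

    def hit(_):
        start = time.time()
        with urllib.request.urlopen(URL) as resp:
            resp.read()
        return time.time() - start

    start = time.time()
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        latencies = list(pool.map(hit, range(REQUESTS)))
    elapsed = time.time() - start

    print(f"{REQUESTS / elapsed:.1f} req/s, "
          f"avg latency {sum(latencies) * 1000 / len(latencies):.0f} ms")

For anything approaching real DDoS traffic, though, a single machine won't cut it; that is what the distributed services above are for.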

Simulating a remote website locally for testing

I am developing a browser extension. The extension works on external websites we have no control over.
I would like to be able to test the extension. One of the major problems I'm facing is displaying a website 'as-is' locally.
Is it possible to display a website 'as-is' locally?
I want to be able to serve the website exactly as-is locally for testing. This means I want to simulate the exact same HTTP data, including iframe ads, etc.
Is there an easy way to do this?
More info:
I'd like my system to act as closely to the remote website as possible. I'd like to run a fetch command, for example, which would then let me go to the site in my browser (with the internet disconnected) and get exactly the same thing I would otherwise (including content that doesn't come from a single domain: Google ads, etc.).
I don't mind using a virtual machine if this helps.
I figured this would be quite a useful thing for testing, especially when I have a bug I need to reliably reproduce on sites that have many random factors (which ads show, etc.).
As was already mentioned, caching proxies should do the trick for you (BTW, this is the simplest solution). There are quite a lot of different implementations, so you just need to spend some time selecting a proper one (in my experience, Squid is a good solution). Anyway, I would like to highlight two other interesting options:
Option 1: Betamax
Betamax is a tool for mocking external HTTP resources such as web services and REST APIs in your tests. The project was inspired by the VCR library for Ruby. Betamax aims to solve these problems by intercepting HTTP connections initiated by your application and replaying previously recorded responses.
Betamax comes in two flavors. The first is an HTTP and HTTPS proxy that can intercept traffic made in any way that respects Java’s http.proxyHost and http.proxyPort system properties. The second is a simple wrapper for Apache HttpClient.
BTW, Betamax has a very interesting feature for you:
Betamax is a testing tool and not a spec-compliant HTTP proxy. It ignores any and all headers that would normally be used to prevent a proxy caching or storing HTTP traffic.
Option 2: Wireshark and replay proxy
Grab all the traffic you are interested in using Wireshark and replay it. I would say it is not that hard to implement the required replay tool yourself, but you can use an available solution called replayproxy, which:
- parses HTTP streams from .pcap files
- opens a TCP socket on port 3128 and listens as an HTTP proxy, using the extracted HTTP responses as a cache while refusing all requests for unknown URLs.
This approach provides you with full control and a bit-for-bit precise simulation.
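To make the record/replay idea concrete, here is a stripped-down sketch of such a proxy in Python. Unlike replayproxy it records on a cache miss instead of refusing the request, and it handles only plain HTTP GETs with no header replay, so treat it as an illustration of the mechanism rather than a substitute:

    # Minimal record/replay HTTP proxy (plain HTTP GET only).
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    cache = {}  # full URL -> recorded response body

    class ReplayHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # When a browser uses us as a proxy, self.path is the full URL.
            if self.path not in cache:
                with urllib.request.urlopen(self.path) as resp:
                    cache[self.path] = resp.read()      # record
            body = cache[self.path]                     # replay
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    # Same port replayproxy uses; point the browser's HTTP proxy here.
    HTTPServer(("127.0.0.1", 3128), ReplayHandler).serve_forever()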
I don't know if there is an easy way, but there is a way.
You can set up a local webserver, something like IIS, Apache, or minihttpd.
Then you can grab the website contents using wget (it has an option for mirroring). Many browsers also have a "save whole web page" option that will grab everything, like images.
Ads will most likely come from remote sites, so you may have to manually edit those lines in the HTML to either not reference the actual ad-servers, or set up a mock ad yourself (like a banner image).
Then you can navigate your browser to http://localhost to visit your local website, assuming port 80 which is the default.
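As a lighter-weight alternative to IIS or Apache, Python's built-in server can serve the mirrored files directly. A sketch, assuming you mirrored with something like wget --mirror --convert-links --page-requisites http://example.com and so have an example.com directory (binding port 80 may need admin rights; any other port works too):

    # Serve a wget-mirrored site locally (Python 3.7+).
    import functools
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    handler = functools.partial(SimpleHTTPRequestHandler, directory="example.com")
    ThreadingHTTPServer(("127.0.0.1", 80), handler).serve_forever()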
Hope this helps!
I assume you want to serve a remote site that's not under your control. In that case you can use a proxy server and have that server cache every response aggressively. However, this has its limits: first of all, you will have to visit every page you intend to use through this proxy (with a browser, for example); second, you will not be able to emulate form processing.
Alternatively, you could use a spider to download all the content of a certain website. Depending on the spider software, it may even be able to follow JavaScript-built links. You can then use a webserver to serve that content.
The service http://www.json-gen.com provides mocks for HTML, JSON and XML via REST. This way, you can test your frontend separately from the backend.

Hyperlinking to a nonserver machine

My church has management software that we would like several people to be able to access at the same time over the internet. We have a website, but it is hosted by another company. Is it possible to create a hyperlink on our website to access this program on our office computer? The hyperlink would be set up so that it is not visible to those who don't have access to the program. We have tried several remote access programs such as TeamViewer and GoToMyPC, but these give access to the entire computer, and that is not something we want either. If we can't do the hyperlink, is it possible for us to turn this computer into a server and access it that way? Again, the focus is for at least 5 people to be able to use this software at the same time should the need arise.
Our church management software is designed to be run on a network. We have already set up the user IDs and passwords for the group who currently have access to it. The problem is that we only have one office computer, and our whole group can't use it at one time, because they each have access to different parts; e.g. the treasurer can only access the financial module, the clerk can only access the membership roster, and so on. The goal is to find the path of least resistance that will allow as many of these people as possible to access this software at the same time, remotely. I understand the security issues, so to that end I ask whether anyone thinks we should get another computer to make into a server, or turn the one we have into one.
Is there an advantage to hosting our own website on our machine where the management software is already located?
You actually have three different problems:
1. You are trying to use a "nonserver machine" as a server. I assume it doesn't have a static IP address, so a static hyperlink will fail. Since you have a web site, you can set up your Management Software Server (for lack of a better name) to check in with your web site's server, which will hold the latest IP address for your MSS. Your users can then check in with the web server and then connect to your MSS (see the sketch after this list).
2. You are trying to keep most people out, but let some people in. You need an authentication scheme. This usually means a login name and password, and a secure way to transmit them. SSL is probably what you want to look at. You'll need an SSL certificate (you can make your own) and a client program (web browsers work) that can make SSL connections.
3. You are trying to allow your users to do only some things, but not others. You need an authorization scheme. This is provided by the server application, such as Windows Remote Desktop. Without knowing how much granularity you need, it's hard to say exactly what you need.
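For point 1, the check-in can be as simple as a scheduled script on the office machine that reports its current public IP to a page on your hosted site. A hedged Python sketch; the check-in URL, the token, and the what-is-my-IP service are all placeholders you'd replace with your own:

    # Periodically report this machine's public IP to the hosted web site.
    import time
    import urllib.parse
    import urllib.request

    CHECKIN_URL = "https://example-church.org/checkin.php"  # placeholder
    TOKEN = "shared-secret"                                 # placeholder

    while True:
        # Any "what is my IP" endpoint works; this one returns plain text.
        ip = urllib.request.urlopen("https://api.ipify.org").read().decode()
        data = urllib.parse.urlencode({"token": TOKEN, "ip": ip}).encode()
        urllib.request.urlopen(CHECKIN_URL, data)           # POST the address
        time.sleep(300)                                     # every 5 minutes

The page behind checkin.php would verify the token, store the IP, and show it (or redirect to it) only for logged-in users; a dynamic DNS service achieves much the same thing with less work.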
#bdares hit a bunch of your issues right on the head...
And then some... As a church, it sounds like you won't have the kind of funding needed to handle this properly, especially when opening up a channel to the church management software and possibly the accounting. Even if you provide your own SSL certificate, getting hacked is getting easier and easier. If it's a low-budget operation with financial data ready for the taking, I'd hate to see a church (or any other legitimate non-profit organization) get messed up.
There are a lot of security issues to deal with and you can NOT take it lightly.

What tools does your company use to manage application performance of asp.net applications?

I am not talking about application profilers or debuggers, but more specifically about managing applications in the production environment: essentially monitoring, identifying bottlenecks, and deploying fixes.
For monitoring that the application is up and running, we use Nagios.
We also use good old performance monitor for monitoring database connections, memory consumption and CPU usage.
We use IPMonitor to verify uptime, and it has a lot of options for pinging the site for keyword validation, HTTP response validation, and response time. You can also use SNMP to figure out responsiveness of the processor and RAM, and remaining size on hard disks, among many other options. It supports multiple servers and types of servers, not just website or database.
Additionally, we test basic uptime and response speed with AlertSite.
A 3rd party, Keynote, tests our sites to verify that they are navigable like a human would browse. They have scripts to mimic clicks and interactions.
We use Spotlight for SQL server management, and also good old perfmon for the granular problem fixing.
We recently purchased WildMetrix to monitor and troubleshoot performance issues for our ASP.NET applications. It's nice because you can easily aggregate IIS, ASP.NET, and SQL Server information into a single graph or dashboard that allows you to pinpoint possible trouble spots. We currently use it as our primary performance reporting and tracking tool, along with ELMAH for exception tracking.
