Any suggestions for a good automated web load testing tool? - asp.net

What are some good automated tools for load testing (stress testing) web applications, that do not use record and replay of HTTP network packets?
I am aware that there are numerous load testing tools on the market that record and replay HTTP network packets. But these are unsuitable for my purpose, for two reasons:
The HTTP packet format changes very often in our application (e.g. when we optimize an AJAX call). We do not want to adapt all test scripts just because there is a slight change in the HTTP packet format.
Our test team should not need to know any internals of our application to write their test scripts. A tool that replays HTTP packets, however, requires the team to know the format of HTTP requests and responses, so that they can adapt details of the replayed packets (e.g. the user name).
The automated load testing tool I am looking for should be able to let the test team write "black box" test scripts such as:
Invoke web page at URL http://... .
First, enter XXX into text field XXX.
Then, press button XXX.
Wait until response has been received from web server.
Verify that text field XXX now contains the text XXX.
The tool should be able to simulate up to several thousand users, and it should be compatible with web applications using ASP.NET and AJAX.
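To make this concrete, here is roughly how such a "black box" scenario could look when expressed with a browser automation tool (a minimal sketch using Selenium WebDriver for Python; the URL, element IDs and expected text are placeholders, not our real application):
# Minimal sketch of the black-box scenario above; all identifiers are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver = webdriver.Firefox()
try:
    driver.get("http://example.com/form")                              # invoke web page at URL
    driver.find_element(By.ID, "inputField").send_keys("some value")   # enter text into a field
    driver.find_element(By.ID, "submitButton").click()                 # press the button
    WebDriverWait(driver, 10).until(                                   # wait for the server response
        EC.text_to_be_present_in_element_value((By.ID, "resultField"), "expected text")
    )
finally:
    driver.quit()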

I've found JMeter to be pretty helpful. It also has a recording feature, so you don't have to specify each GET/POST manually; you "click" through the use case once and then let JMeter repeat it.
http://jmeter.apache.org/

A license for it can be expensive (if you don't have MSDN), but Visual Studio 2010 Ultimate edition has a great set of load and stress testing tools that do what you describe. You can try it out for free for 90 days here.

TestMaker by PushToTest.com can run recorded scripts such as Selenium as well as many different languages like HTML, Java, Ruby, Groovy, .Net, VB, PHP, etc. It has a common reporting infrastructure and you can create load in your test lab or using cloud testing environments like EC2 for virtual test labs.
They provide free webinars on using open source testing tools on a monthly basis and there is one next Tuesday.
http://www.pushtotest.com

There are a few approaches; I've been in situations, however, where I've had to roll my own load generating utilities.
As far as your test script is concerned it involves:
sending a GET request to http://form entry page (only checking if a 200 response is given)
sending a POST request to http://form submit page with pre-generated key/value pairs for text XXX and performing a regexp check on the response
Unless your web page involves complex AJAX, there is no need to "simulate a button press"; this is taken care of by the POST request.
Given that your test consists of just a 2-step process there should be several automated load packages that could do this.
I've previously used httperf for load testing a large website: it can simulate a session consisting of several requests and can simulate a large number of users (i.e. sessions) simultaneously. For example, if your website generated a session cookie from the home page, you could make that the first request; httperf would then use that cookie for subsequent requests until it had finished the list of requests supplied.
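To illustrate the session idea outside of httperf's own configuration, a minimal Python sketch of a home-grown generator for the 2-step scenario could look like this (the requests library, URLs, form field and success pattern are assumptions, not part of the original answer):
# Each simulated user runs one session: GET the form page, then POST the form.
import re
import threading
import requests
FORM_URL = "http://example.com/form"        # placeholder form entry page
SUBMIT_URL = "http://example.com/submit"    # placeholder form submit page
SUCCESS_PATTERN = re.compile(r"Thank you")  # placeholder regexp check
def run_session():
    s = requests.Session()                  # keeps cookies across requests, like an httperf session
    r = s.get(FORM_URL)
    assert r.status_code == 200             # only check that a 200 response is given
    r = s.post(SUBMIT_URL, data={"field": "some value"})
    assert SUCCESS_PATTERN.search(r.text)   # regexp check on the response
threads = [threading.Thread(target=run_session) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()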

What about http://watin.sourceforge.net/ ?

Related

How to Perform Stress/Load test on a chrome extension using Jmeter?

The title explains it.
I have a chrome extension that I have been working on, which shows relevant documents and data from a DB related to the webpage open in the main tab. Now I have to stress test it by checking how many users it can handle at once after logging in and clicking the search for "All" documents.
I have been trying to find some good tutorials, but all I get is testing using different JMeter extensions for Chrome.
If JMeter can't be used for stress testing an extension, can you share a better alternative for the task?
The title doesn't explain it.
How many users can use a chrome extension concurrently? Only one. Therefore a stress/load test is not applicable; what you can do is profile it.
If you want to add test automation on top of it, i.e. to measure how fast it opens, switches between screens/views, reacts to clicks, etc., you can consider using the WebDriver Sampler (it can be installed using the JMeter Plugins Manager), but again it will only be a simulation of 1 user.
Another story is load testing the backend, for example if the chrome extension connects to some remote server and you need to check how the server handles hundreds or thousands of extension users. In this case you will most probably be able to use JMeter to simulate multiple concurrent users of the chrome extension by replicating their network footprint with the appropriate JMeter Samplers; in the case of the HTTP protocol you can even record the requests using JMeter's HTTP(S) Test Script Recorder and replay them with an increased number of users.
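To sketch what that backend-oriented load test amounts to without the JMeter GUI, here is a minimal Python illustration that replays an assumed login-then-search footprint with an increasing number of concurrent users (the URLs, payloads and credentials are placeholders, not the extension's real API):
# Ramp up concurrent simulated users against the extension's backend.
import time
from concurrent.futures import ThreadPoolExecutor
import requests
BASE = "http://backend.example.com"          # placeholder backend server
def one_user():
    s = requests.Session()
    s.post(BASE + "/login", data={"user": "test", "password": "test"})
    start = time.time()
    r = s.get(BASE + "/search", params={"scope": "all"})
    return r.status_code, time.time() - start
for users in (10, 50, 100, 500):
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: one_user(), range(users)))
    errors = sum(1 for status, _ in results if status != 200)
    slowest = max(latency for _, latency in results)
    print(users, "users:", errors, "errors, slowest", round(slowest, 2), "s")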

use webservice in same project or handle it with code?

This is a theoretical question.
Imagine an ASP.NET website where clicking a button sends mail. Now:
I can send mail async with code
I can send mail using QueueBackgroundWorkItem
I can call a ONEWAY webservice located in same website
I can call a ONEWAY webservice located in ANOTHER website (or another subdomain)
None of the above solutions waits for the mail operation to complete, so they are all fine in that respect.
My question is why I should use a service solution instead of the other solutions. Is there an advantage?
The 4th solution adds additional TCP/IP traffic to use the service, so it's not efficient, right?
If so, using a service under the same website (3rd solution) also generates additional traffic. Is that correct?
I need to understand why people use services under the same website. Is there any reason besides making something available to AJAX calls?
Any information would be great; I really need to get opinions.
Best
The most appropriate architecture will depend on several factors:
the volume of emails that needs to be sent
the need to reuse the email sending capability beyond the use case described
the simplicity of implementation, deployment, and maintenance of the code
Separating out the sending of emails in a service either in the same or another web application will make it available to other applications and from client side code. It also adds some complexity to the code calling the service as it will need to deal with the case when the service is not available and handle errors that may occur when placing the call.
Using a separate web application for the service is useful if the volume of emails sent is really large, as it allows you to offload the work to one or more servers if needed. Given the use case described (a user clicks a button), this seems rather unlikely, unless the web site will have really large traffic. Creating a separate web application adds significant development, deployment, and maintenance work, initially and over time.
Unless the volume of emails to be sent is really large (millions per day) or there is a need to reuse the email capability in other systems, creating the email sending function within the same web application (first two options listed in the question) is almost certainly the best way to go. It will result in the least amount of initial work, is easy to deploy, and (perhaps most importantly) will be the easiest to maintain.
An important concern to pay significant attention to when implementing an email sending function is robustness. Robustness can be achieved with any of the possible architectures and is somewhat different from the concern emphasized by the question. However, it is important to consider the proper course of action if (1) the receiving SMTP server refuses to take the message (e.g. mailbox full, non-existent account, rejection as spam) and (2) an NDR is generated after the message is sent (e.g. rejection as spam). Depending on the kind of email sent, it may be OK to ignore these errors, or some corrective action may be needed (e.g. retry sending, alert the user who originated the email, ...).
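As a language-agnostic illustration of that last point (shown here in Python; the same structure applies to SmtpClient in ASP.NET), a minimal retry sketch with a placeholder host and retry policy might look like this:
# Distinguish permanent rejections from transient failures and retry only the latter.
import smtplib
import time
from email.message import EmailMessage
def send_with_retry(msg: EmailMessage, retries: int = 3) -> bool:
    for attempt in range(retries):
        try:
            with smtplib.SMTP("smtp.example.com") as server:   # placeholder SMTP host
                server.send_message(msg)
            return True
        except smtplib.SMTPRecipientsRefused:
            return False                 # permanent: bad mailbox, do not retry
        except (smtplib.SMTPException, OSError):
            time.sleep(2 ** attempt)     # transient: back off and retry
    return False                         # give up; log or alert for follow-up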

How to do live self-monitoring inside the application

We are applying unit tests and integration tests, and we are practicing test-driven and behaviour-driven development.
We are also monitoring our applications and servers from the outside (with dedicated software in our network).
What is missing is some standard for live monitoring inside the application.
I give an example:
There should be a cron-like process inside the application that regularly checks some structural health of our data structures.
We need to monitor that users have done some regular stuff that does not endanger the health of the applications (there are some actions and inputs that we cannot prevent them from doing).
My question is, what is the correct name for this, so I can research it further in the literature? I did a lot of searching, but I almost always find the xUnit and BDD / integration test stuff that I already have.
So what is this called, and what is the standard in professional application development? I would like to know if there is some standard structure like xUnit, or whether xUnit libraries could even be used for it. I could not even find appropriate tags for this question, so if you read this and know some better tags, please add them and remove the ones that don't fit.
I need this for applications written in Python, Erlang or JavaScript, and these are mostly server-side applications, web applications or daemons.
What we are already doing is that we created an HTTP gateway inside the applications that reports some stuff, and this is monitored by the Nagios infrastructure.
I have no problem rolling my own cron-like, controlled self-health scheme inside the applications, but I am interested in knowing a professional, standardized way of doing it.
I found this article, it already comes close: Link
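For concreteness, the kind of cron-like self-check plus HTTP gateway I have in mind looks roughly like this (a minimal Python sketch; the invariant check, interval and port are placeholders):
# Background thread runs the structural checks; a tiny HTTP endpoint exposes the result (e.g. for Nagios).
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
health = {"ok": True, "detail": "not checked yet"}
def check_invariants():
    # Replace with real structural checks on the data structures (hypothetical example).
    return True, "all invariants hold"
def checker_loop(interval_seconds=300):
    while True:
        ok, detail = check_invariants()
        health["ok"], health["detail"] = ok, detail
        time.sleep(interval_seconds)
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200 if health["ok"] else 500)
        self.end_headers()
        self.wfile.write(health["detail"].encode())
threading.Thread(target=checker_loop, daemon=True).start()
HTTPServer(("", 8080), HealthHandler).serve_forever()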
It looks like you are asking about approaches how to monitor your application. In general, one can distinguish between active monitoring and passive monitoring.
In active monitoring, you create some artificial user load that mimics real user behavior, and monitor your application based on the responses to this artificial, non-existent user (active = you actively cause traffic to your application). Imagine that you have a web application which allows users to get a weather forecast for a specific city. For active monitoring, you would deploy another application that calls your web application with some predefined request ("get weather for Seattle") every N hours. If your application does not respond within the specified time interval, you trigger an alert based on that.
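A minimal sketch of such an active probe, assuming the application exposes an HTTP endpoint (the URL, timeout, interval and alerting hook are placeholders):
# Periodically issue a synthetic request and alert if no timely, successful response arrives.
import time
import urllib.request
URL = "http://app.example.com/weather?city=Seattle"   # placeholder endpoint
def probe(timeout_seconds=5.0):
    try:
        with urllib.request.urlopen(URL, timeout=timeout_seconds) as resp:
            return resp.status == 200
    except Exception:
        return False
while True:
    if not probe():
        print("ALERT: application did not answer in time")  # hook up real alerting here
    time.sleep(3600)  # run the synthetic check every hour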
In passive monitoring, you observe real user behavior over time. You can use log parsing to get the number of (un)successful requests/responses, or inject some code into your application that updates some values in a database whenever a successful or unsuccessful response is returned (passive = you only check other users' traffic). Then you can create graphs and check whether there is a significant deviation in user traffic. For example, if during the same time of day one week ago your application served 1000 requests, and today you get only 200 requests, it may indicate a problem with your software.
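A minimal sketch of the passive, log-parsing variant, assuming a common access-log format and arbitrary example thresholds:
# Count successful vs. failed responses and flag big deviations from a stored baseline.
import re
STATUS_RE = re.compile(r'" (\d{3}) ')   # HTTP status code in an access-log line
BASELINE_REQUESTS = 1000                # e.g. the same hour one week ago
ok = failed = 0
with open("access.log") as log:
    for line in log:
        m = STATUS_RE.search(line)
        if not m:
            continue
        if m.group(1).startswith(("4", "5")):
            failed += 1
        else:
            ok += 1
total = ok + failed
if total < 0.5 * BASELINE_REQUESTS:
    print("ALERT: traffic dropped significantly versus the baseline")
if total and failed > 0.05 * total:
    print("ALERT: error rate above 5%")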

How to Automate Web Analytics testing?

Omniture/SiteCatalyst's code is integrated into the webpages to collect analytics at our firm.
Current process: SiteCatalyst is deployed by pasting HTML code onto each page of the website. This HTML code contains variables and other identifiers that facilitate the data collection process. These variables may be dynamically populated with server or application variables. The code snippet also calls the JavaScript library file, which contains SiteCatalyst-specific JavaScript functions used during metrics collection.
We use add-ons like Charlie, HTTP Post, and the DigitalPulse Debugger to test whether the inserted code has accurate values corresponding to it. This process is time consuming and tedious.
How to Automate this process? Any help would be appreciated!
Example 1: Click here to send a page view
s.pageName="New Page";
s.prop1="some value";
void(s.t());
Example 2:
s=s_gi('myreportsuiteid');
s.linkTrackVars="prop1,eVar1,events";
s.linkTrackEvents="event1";
s.prop1="some value";
s.eVar1="another value";
s.events="event1";
s.tl(this,'o','My Link Name');
There are a few different ways to automate testing; I've been looking into it lately myself. So far I'm looking into Selenium, Zombie.js and PhantomJS. You can search for "headless testing", which basically lets you run code as a browser and test conditions on the page you visit.
Here's a good place to start https://github.com/ariya/phantomjs/wiki/Headless-Testing
Using these platforms, you could easily set up checks to automatically validate that the SiteCatalyst code is firing, page names are correct, click events happen, etc.
Selenium is a more established, enterprise-grade product, whereas the JS frameworks would be more of a development effort.
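For example, a minimal sketch with the Selenium Python bindings along these lines might be (the page URL and expected values are placeholders; it assumes the standard SiteCatalyst s object is present on the page):
# Load a page in a real browser and check the SiteCatalyst variables it set.
from selenium import webdriver
driver = webdriver.Firefox()
try:
    driver.get("http://www.example.com/some-page")
    page_name = driver.execute_script("return window.s && s.pageName;")
    prop1 = driver.execute_script("return window.s && s.prop1;")
    assert page_name == "New Page", "unexpected s.pageName: %r" % page_name
    assert prop1 == "some value", "unexpected s.prop1: %r" % prop1
finally:
    driver.quit()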
We usually do this using a more customizable proxy application called Fiddler, which we use to capture all the traffic sent from our browser.
Fiddler has an internal scripting language that lets you run any type of check on the data passed in the Adobe Analytics call and highlight any bad call in the interface.

How to handle IMAP requests from MSOutlook in ASP.NET page?

Brief: I am tinkering with a personal project that would serve up Task objects to MSOutlook. I would like to create a new HTTP account in MSOutlook which points at my website's *.aspx page. This page would deliver a list of Task items that do not actually reside on a mail server but are instead stored in a XML file or other simple structure.
Question: Are there any guides for handling IMAP requests in ASP.NET? I've found plenty of information on developing a web client but I want something more akin to a server/service though nothing so robust.
Background: My daughter is in high school. She is computer literate but abhors complexity and all nerdiness. She is comfortable with MSOutlook so I would like to run a little website in my house to send homework Tasks to her. If I can set up an HTTP account, the Tasks will be delivered to her without any trouble on her part. Don't get me started on the screen scraping I'm doing to retrieve assignments from her teacher's "websites" (I don't think the term could be applied any more loosely without completely falling off).
I think you'd be better off using/customizing an open source IMAP server; there are several out there. But I am not sure the mail server idea is a good one. You'd be bringing a lot of baggage into this effort.
Why don't you just send your daughter an email, as opposed to putting the assignment on a web page and then trying to get it off of there?
If you must have the pull model (as opposed to a push model), why not put up an asp page with a "Send me the assignment" button. She can go there, click on it, and will receive the content in the email.
