Introducing testers to HTTP and Fiddler?

We really need to get our testers using Fiddler to determine page size and site speed as part of their pre-release testing process. I have sat with some of them to talk about Fiddler, but I'm looking for some easy-to-understand resources for learning it.
More importantly, I'd like to find an easy-to-read resource that explains the role of HTTP in using the web and what the difference between HTTP and the browser is (maybe covering HTTP and the upper layers, but in a way a tester can understand).
Does anyone have any suggestions or resource links?
Our testers are mostly from the point-and-click user-acceptance testing school rather than the more dynamic end of things. All help greatly appreciated.
thanks,
b

A video well worth having them watch is the PDC 2009 session on Fiddler by the author of the product, Eric Lawrence.

If you want your testers to do this, it would be better to set up something like ShowSlow with YSlow: as they work through the site, it will automatically record the data for you. This removes the risk of them forgetting to do this step.
If you want to automate the process you can use Selenium and YSlow. I did a talk last year at the Google Test Automation Conference where I discussed the process of doing this. The talk can be found here
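To give a feel for what that automation can capture, here is a minimal sketch using the Selenium WebDriver C# bindings with Chrome (the URL is hypothetical, and the browser's Navigation and Resource Timing APIs stand in for a full YSlow run):

    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;

    class PageTimingCheck
    {
        static void Main()
        {
            using (IWebDriver driver = new ChromeDriver())
            {
                // Hypothetical page under test.
                driver.Navigate().GoToUrl("https://example.com/page-under-test");
                var js = (IJavaScriptExecutor)driver;

                // Total load time, from the browser's Navigation Timing API.
                long loadMs = Convert.ToInt64(js.ExecuteScript(
                    "return performance.timing.loadEventEnd - performance.timing.navigationStart;"));

                // Approximate bytes transferred, summed from the Resource Timing API.
                long bytes = Convert.ToInt64(js.ExecuteScript(
                    "return performance.getEntriesByType('resource')" +
                    ".reduce(function (s, e) { return s + (e.transferSize || 0); }, 0);"));

                Console.WriteLine("Load time: " + loadMs + " ms, ~" + bytes + " bytes transferred");
            }
        }
    }

Run against each build, this produces the page-size and speed numbers without the testers having to remember a manual step.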

If you're only looking to ensure page size is reasonable, then I don't believe that knowledge of HTTP is necessary. You're better off training them on YSlow or Google Page Speed. These tools provide a higher-level view of why a page is loading slowly and what can be done to mitigate it. All they need to know about HTTP is that bigger pages and more requests = slower loading. That is true for all network protocols, though.

A good resource I started with was Whittaker's "How to Break Web Software: Functional and Security Testing of Web Applications and Web Services". This provides a good introduction to performing web testing "under the covers", so to speak.
While more geared to debugging, this PDC presentation (http://microsoftpdc.com/Sessions/CL25) may prove to be a nice intro to Fiddler for your QA folks.

No matter what tool you are introducing into your organization, it's good to have some basic software roll-out concepts in mind while doing it. There are a lot of resources out there describing software roll-out plans, and although most of them describe how to introduce software in organizations with hundreds or thousands of users, there are things to learn even if we are 'only' talking about a dozen testers or so.
Some ideas that I think might fit you are:
Evangelist
Talk to the testers and try to find one or two people who are more interested and enthusiastic about the Fiddler idea than the rest of your team. Give him/her/them time (paid working hours) to learn more about the tool and to do a presentation about it to the other testers. Make sure it's someone that the other testers know and respect.
Pilot
Do a pilot project (who better to be in charge of this than your evangelist(s)?) as a proof of concept. The pilot project should be limited to a small part of a system, and the tests should be of a nature that you can throw away if it does not work out. The pilot could be time-boxed, and there should be an evaluation with the whole team afterwards. This will give your organization some experience with Fiddler and help you avoid the big beginner mistakes. And it will, hopefully, show the rest of the testing team that Fiddler (or whatever tool you choose) is pretty cool and get them excited about it.
Training
Of course you should dedicate time to doing training properly. Just saying "read this easy-to-read document, then start testing" is probably not going to give much to the testers. Buy books on Fiddler. Let your evangelist run a 2-hour "getting up and running with Fiddler" tutorial.
Incremental Roll out
Instead of going for a big-bang approach where you tell your testers to start using Fiddler to test everything, start with only a limited number of tests for the first release. Then you add some more tests for the next release, while maintaining the first ones. After a few releases you'll have a whole heap of stable, good tests using Fiddler. This way it won't take too much of the testers' time, so they can still do their other testing.
Read More
There are plenty of articles about software roll-out plans on the web that can help you with this.
Hope this helps
/Jonas


When to choose LAMP over ASP.NET?

A friend wants to start a dating website and she wants me to help her. We still haven't discussed what platform it'll be developed on, but I'm thinking she'll suggest LAMP to save a buck (which is already one reason to choose it over ASP.NET). If the dating website does well, it'll potentially hold a large amount of data (I'm not sure if this would be another reason to consider either ASP.NET or LAMP).
Anyway, I ask this from an ASP.NET developer's point of view. I have very little, almost no experience with LAMP, and I don't like it very much either, so if she decides to go with PHP, odds are I won't help her. So what would be some good points to bring up when deciding which platform to develop on?
Please be objective; I don't want this to be argumentative or anything. Try to stick to facts, not opinions alone.
Thanks!
What generally matters in that kind of choice is:
How much time will it require?
How much money will it cost?
Which is often linked to the time ^^
If you have a lot of experience with .NET and none with Linux/Apache/PHP/MySQL, choosing LAMP will mean that you'll need much more time: there's a whole lot of new stuff to learn.
It'll also mean that your code will probably not be as good as it would be with what you know.
In the end, the question is: does a couple of weeks 'cost' more than a few licences?
Only you and she can decide that ;-)
If LAMP makes you queasy, you can try ASP.NET over Mono.
IMO the only good reason to move away from a programming environment that you are already experienced with is the one you already mentioned: cost.
You would use LAMP specifically to build appliances. If you're not building appliances, the software cost for ONE server is marginal, and is not worth the tradeoff for moving to a totally different development environment, IMO.
I think the first question is: Which is the target programming language and environment that you have experience with?
Imagine the site will become a success - how do you scale then? LAMP can scale, and so can WISC (Windows/IIS/SQL Server/C#), but in both scenarios you need people who actually know the environment and who can secure it. If you don't know Linux, MySQL and PHP, how are you going to scale and secure it?
So even though LAMP may be significantly cheaper (The SQL Server license is the heavy part in the WISC stack), after the first hacker attack or downtime, that initial savings may seem marginal compared to the damage.
The other thing is of course the PHP vs. ASP.NET/C# decision. If you don't know PHP, then it's a choice between "not having the application at all" and "having the application on an expensive stack", unless your partner decides to hire someone else to develop it, of course.
Technically, both have their pros and cons, but there are huge websites built on both stacks, so it really boils down to "Which platform can you reliably/comfortably setup and maintain?"
I agree with Pascal. Go with what you feel comfortable with for completing the project, and don't forget that YOUR TIME EQUALS MONEY. You have to put a $$ value on your time. LAMP may be cheaper up front, but if it winds up taking 1,000 extra man-hours, then suddenly it's more expensive.
Also take into account the lost opportunity cost of not being able to bring something to market because you chose a technology you were not familiar with.
At the end, if the plans are for this to be a business that is successful, the cost of using ASP.NET should be negligible or else I would question the seriousness of the effort.
One argument for the Apache/MySQL/PHP stack is that it's available on most major platforms (Windows/Linux/Mac/BSD/...) and most web hosts provide it as well.
You can also find huge amounts of good tutorials, books and other educational material about PHP/MySQL.
Apart from that all tools used in the LAMP stack are free (as in "free speech" and also as in "free beer"). ASP.NET is still a proprietary technology owned by Microsoft. I'm not a huge open source fan, but knowing that your tools will remain free to use in any way you want is quite nice.
Of course, if you have no experience with PHP at all and much experience with ASP.NET, it's easier for you to stick with ASP.NET.
If you're comfortable with Microsoft products, there's nothing to stop you from developing code in .NET and using a free database (however, you may need to find or develop a custom database adapter if you are not using the free versions of SQL Server or Oracle). If you are generating a lot of traffic, you can swap out the data layer of your code and invest in a better-performing database.
Time costs money and if you can develop a better product both from a user and maintenance/performance perspective it will serve you better in the long run.
Some hosting companies include the OS and flexible contracts, so I would look for one that fits from your perspective. The market's pretty competitive for that type of site, and there's no point throwing a lot of money at it until you get some useful metrics for your site, IMO.
The short answer is: it doesn't matter, unless the site is going to do something so amazingly different that one technology is obviously better suited. And I can't think of anything like that off the top of my head.
A big red flag is: if your friend is concerned about the extra $5/month for asp.net hosting instead of LAMP hosting, then you're probably not going to get paid. Ever.
Caveats aside, be realistic: what is the immediate goal? To get something working, or to design something on the scale of plentyoffish.com or facebook.com? [Facebook.com has about 44,000 servers at the moment]
So, what are the chances of your friend's dating web site exploding to the size where scaling is a concern? For most sites, the answer is "very close to zero" - because of the marketing effort required to drive that much traffic.
Now, what is the revenue stream? Is there any expectation that you will get paid to do this? Do you think the site will be profitable? Is the project fully funded?
Friendship is great, but don't let that keep you from asking the appropriate business and client-relationship questions. One sure way to ruin a friendship is to do some work for free and/or without thinking through the full extent of the project. Far too often, you think it is a one-time favor, while they think it is your job!
LAMP is only cheaper until you read the fine print. It's not better or worse technically, just different.
The WebsiteSpark/BizSpark programs will get you all the Microsoft software you need to get started, free for three years. If price is her driving concern, point her to those programs if she's willing to consider the ASP.NET platform.
Hosting will cost a fair amount either way, because for a full-service website you don't want to go shared. You'll need at least one dedicated server to support a dating site. The OS and database will be free either way if you go with one of the *Spark programs I mentioned.
As a small startup company you can get a free 3-year MSDN subscription (well, you have to pay $100 at the end of the 3 years). If you think .Net will be more efficient and this website will make money, seriously consider BizSpark.
Since you are looking at a dating site, check out Markus Frind of plentyoffish.com: he is running the largest dating site on the .NET platform, with ASP.NET and SQL Server.

How to initiate automated testing?

I started as a software engineer at my current company. Over time, I was either the only one willing or able to take responsibility for various systems, and so I was "promoted" to IT Manager. During my time as a software engineer, I would create functional tests for the various software modules I built, and as a result, even today I am able to quickly test the parts of the system that I have worked on. However, there is a very large code base with little to no coverage from the various other developers who have been working here.
Now, as IT Manager, I want to be able to test that all the parts of the system are working, but there is:
A) no budgeted time dedicated to creating code test coverage
and
B) No desire from the "chief software engineer" to start creating testing suites to help me monitor that the software is functioning.
I don't expect the software team to drop everything they are doing and spend 2 weeks creating test suites, but it would be nice if they started expanding the test suite coverage over time so I can confirm that the various parts of the system are working.
So boiling it down, how do I get the software team to start building test suites?
Other caveats:
A) I'm still asked to do software projects in addition to managing our IT dept (a unix engineer, desktop support guy, and related office and production equipment)
B) My unix admin has a really hard time getting production systems up and running with the full code base, and we aren't getting good help from the software team. He can't run any kind of diagnostic to see where the web app is failing on the new installs. The VP of the company keeps telling me to go in and do print_r's in the code to see what is happening. This sucks!!!
First, you need to investigate Test Driven Development so that you are comfortable explaining it in terms that your developers will understand, as well as your management. Since you seem to be developing web applications, and you have technical skills, I suggest that you take the plunge and choose an open source tool for testing web applications, get it installed, and start building tests for anything that you develop yourself.
Twill is an example of the kind of testing tool that you would need.
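Twill itself is Python-based; as a rough illustration of the same kind of scripted functional check in C# (using HttpClient; the URL and expected text are hypothetical):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class FunctionalCheck
    {
        static async Task Main()
        {
            using (var http = new HttpClient())
            {
                // Hypothetical page under test.
                HttpResponseMessage response = await http.GetAsync("https://example.com/login");

                // The page should load successfully...
                if (!response.IsSuccessStatusCode)
                    throw new Exception("Login page returned " + (int)response.StatusCode);

                // ...and contain the text we expect to see.
                string body = await response.Content.ReadAsStringAsync();
                if (!body.Contains("Sign in"))
                    throw new Exception("Login page is missing the 'Sign in' text.");

                Console.WriteLine("Login page check passed.");
            }
        }
    }

A handful of checks like this, run on every build, is enough to start demonstrating the value to the rest of the team.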
Then, as manager, you need to entice developers to follow your example, and reward them for doing so. And punish them when they don't use the testing framework and it leads to preventable problems. As soon as you get one such incident, you should be able to get your boss on board and pick up some momentum.
Overall, remember that the objective is to do less work to get a good result. Cutting corners is a way of doing less work, but leads to the risk of bad, or spectacularly bad results. Keep management informed of the risk levels and potential costs at risk.
Don't just force people to do testing for testing's sake. It has to help them be more productive so choose the first projects for it carefully.
That's a good question. If there were one correct answer to it, many more software projects would be successful and deliver high quality.
I don't think it is a good idea to make such a change top-down. It has to be driven by the developers themselves. So training in the TDD direction would be good, but that is a long-term investment which takes time.
If you want a faster solution, you should consider functional, acceptance and system tests. With these tests you exercise pretty much the whole application through all layers. If you are developing web applications, you should consider using Selenium to automate your tests. It is easy to create tests with it (Selenium IDE).
But using only such tests (not unit tests) doesn't give you the advantages that come from TDD.
Automating your tests is crucial.
Do you have a Test or QA team?
I would first check whether they have test cases that they use to qualify the build. If not, you will have to develop test cases covering the core functionality of your product.
The next step would be automating those test cases.
If the application was developed without any troubleshooting tools or debugging features, it will be tough until these are added as requirements for the next release.
My 2 cents.
I'll have to disagree with michaelkebe: these changes need support from the executive level, in addition to a few key developers, in order to fully succeed.
Without that support, you'll just be some developers who look like they are 'wasting time on writing tests for stuff that already works.'
There needs to be a clear vision, and it needs to be repeated loudly and often.
I'm not necessarily advocating for Agile here, but it often clicks for business owners.
If you can sell them on that, the things that you're excited about (delivering software fast, easy maintenance, automated testing, etc.) will fall into place.

Who writes the automated UI tests? Developers or Testers?

We're in the initial stages of a large project, and have decided that some form of automated UI testing is likely going to be useful for us, but have not yet sorted out exactly how this is going to work...
The primary goal is to automate a basic install and run-through of the app, so if a developer causes a major breakage (e.g. app won't install, network won't connect, window won't display, etc.) the testers don't have to waste their time (and get annoyed) installing and configuring a broken build.
A secondary goal is to help testers when dealing with repetitive tasks.
My question is: Who should create these kinds of tests? The implicit assumption in our team has been that the testers will do it, but everything I've read on the net always seems to imply that the developers will create them, as a kind of 'extended unit test'.
Some thoughts:
The developers seem to be in a much better position to do this, given that they know control IDs, classes, etc., and have a much better picture of how the app works
The testers have the advantage of NOT knowing how the app is working, and hence can produce tests which may be much more useful
I've written some initial scripts using IronRuby and White. This has worked really well, and is powerful enough to do literally anything, but you need to be able to write code to write the UI tests (a C# sketch follows at the end of this question)
All of the automated UI test tools we've tried (TestComplete, etc) seem to be incredibly complex and fragile, and while the testers can use them, it takes them about 100 times longer and they're constantly running into "accidental complexity" caused by the UI test tools.
Our testers can't code, and while they're plenty smart, all I got were funny looks when I suggested that testers could potentially write simple Ruby scripts (even though said scripts are about 100x easier to read and write than the mangled mess of buttons and datagrids that seems to be the standard for automated UI test tools).
I'd really appreciate any feedback from others who have tried UI automation in a team of both developers and testers. Who did what, and did it work well? Thanks in advance!
Edit: The application in question is a C# WPF "rich client" application which connects to a server using WCF
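For context, here is roughly the shape of those scripts, sketched in C# against the TestStack.White incarnation of the White library (the application path, window title and automation IDs are hypothetical):

    using TestStack.White;
    using TestStack.White.UIItems;
    using TestStack.White.UIItems.Finders;
    using TestStack.White.UIItems.WindowItems;

    class InstallSmokeTest
    {
        static void Main()
        {
            // Launch the freshly installed build (path is hypothetical).
            Application app = Application.Launch(@"C:\builds\latest\OurApp.exe");
            try
            {
                Window window = app.GetWindow("Our App");

                // Drive controls by their automation IDs.
                var server = window.Get<TextBox>(SearchCriteria.ByAutomationId("ServerAddress"));
                server.Text = "test-server-01";

                var connect = window.Get<Button>(SearchCriteria.ByAutomationId("ConnectButton"));
                connect.Click();

                // Crude breakage check: the status label should report a connection.
                var status = window.Get<Label>(SearchCriteria.ByAutomationId("StatusLabel"));
                System.Console.WriteLine("Status after connect: " + status.Text);
            }
            finally
            {
                app.Close();
            }
        }
    }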
Ideally it should really be QA who end up writing the tests. The problem with a programmatic solution is the learning curve involved in getting the QA people up to speed with the tool. Developers can certainly help with this learning curve by mentoring, but it still takes time and is a drag on development.
The alternative is to use a simple GUI tool which fronts a language (and data scripts) and enables QA to build scripts visually, delving into the finer details of the language only when really necessary - development can get involved here too.
The most successful attempts I've seen have definitely been with the latter, but setting this up is the hard part. Selenium has worked well for simple web applications and simple threads through the application. JMeter (for scripted web conversations with web services) has also worked well. Another option is an in-house-built test harness: a simple tool on top of a scripting language (Groovy, Python, Ruby) that allows QA to feed test data into the application either via a GUI or via data files. The data files can be simple properties files, or in more complex cases structured data files (something like YAML or even Excel). That way they can build basic smoke tests to start, and later expand that into various scenario-driven tests.
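As a sketch of that data-file idea (the file name and pipe-delimited format here are just assumptions), such a harness can start very small:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class DataDrivenSmokeTests
    {
        // Each line of the data file: <url>|<text the page must contain>
        static async Task Main()
        {
            var http = new HttpClient();
            foreach (string line in File.ReadAllLines("smoke-tests.txt"))
            {
                // Skip blanks and comment lines.
                if (string.IsNullOrWhiteSpace(line) || line.StartsWith("#")) continue;

                string[] parts = line.Split('|');
                string url = parts[0].Trim(), expected = parts[1].Trim();

                string body = await http.GetStringAsync(url);
                string verdict = body.Contains(expected) ? "PASS" : "FAIL";
                Console.WriteLine(verdict + "  " + url + " (expected \"" + expected + "\")");
            }
        }
    }

QA then owns the data file and can add scenarios without touching any code.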
Finally... I think rich client apps are way more difficult to test in this way, but it depends on the nature of the language and the tools available to you...
In my experience, testers who can code will switch jobs for a pay raise as developers.
I agree with you on the automated UI testing tools. Every place I've worked that was rich enough to afford WinRunner or LoadRunner couldn't afford the staff to actually use it. The prices may have changed, but back then, these were in the high 5-digit to low 6-digit price tags (think of the price of a starter home). The products were hard to use, and were usually kept uninstalled in a locked cabinet because everyone was afraid of getting in trouble for breaking them.
I worked over 7 years as an application developer before I finally switched to testing and test automation. Testing is much more challenging than coding, and any automation developer who wants to succeed should master testing skills.
Some time ago I put my thoughts on skill matrices in a couple of blog posts.
If you're interested in discussing further:
http://automation-beyond.com/2009/05/28/qa-automation-skill-matrices/
Thanks.
I think having the developers write the tests will be of the most use. That way, you can get "breakage checking" throughout your dev cycle, not just at the end. If you do nightly automated builds, you can catch and fix bugs when they're small, before they grow into huge, mean, man-eating bugs.
What about the testers proposing the tests, and the developers actually writing them?
I believe at first it largely depends on the tools you use.
Our company currently uses Selenium (We're a Java shop).
The Selenium IDE (which records actions in Firefox) works OK, but developers need to manually correct mistakes it makes against our webapps, so it's not really appropriate for QA to write tests with.
One thing I tried in the past (with some success) was to write library functions as wrappers for Selenium functions. They read as plain English:
selenium.clickButton("Button Text")
...but behind the scenes it checks for proper layout and tags on the button, that it has an ID, etc.
Unfortunately this required a lot of setup to allow easy writing of tests.
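To make that concrete, here's a sketch of the wrapper idea against the current Selenium WebDriver C# bindings rather than the original Selenium RC API (the specific house-style checks are illustrative):

    using System;
    using OpenQA.Selenium;

    static class SeleniumWrappers
    {
        // Click a button by its visible text, verifying basic markup rules first.
        public static void ClickButton(this IWebDriver driver, string buttonText)
        {
            IWebElement button = driver.FindElement(
                By.XPath("//button[normalize-space(text())='" + buttonText + "']"));

            // House-style checks: every button must have an id and be visible.
            string id = button.GetAttribute("id");
            if (string.IsNullOrEmpty(id))
                throw new InvalidOperationException("Button '" + buttonText + "' has no id attribute.");
            if (!button.Displayed)
                throw new InvalidOperationException("Button '" + buttonText + "' is not visible.");

            button.Click();
        }
    }

A test then reads as driver.ClickButton("Button Text"), keeping the plain-English intent while the plumbing and markup checks live in one place.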
I recently became aware of a tool called Twist (from Thoughtworks, built on the Eclipse engine), which is a wrapper for Selenium, allowing plain English style tests to be written. I am hoping to be able to supply this to the testers, who can write simple assertions in plain English!
It automatically creates stubs for new assertions too, so the testers could write the tests, and pass them to developers if they need new code.
I've found the most reasonable option is to have enough specs that the QA folks can stub out the tests: basically figure out what they want to test at each 'screen' or on each component, and stub those out. The stubs should be named such that they're very descriptive as to what they're testing. This also offers a way to crystallize functional requirements. In fact, doing the requirements in this fashion is particularly easy, and helps non-technical people really work through the muddy waters of their own thought process.
The stubs can be filled in via a combination of QA/dev people. This allows you to CHEAPLY train QA people as to how to write tests, and they typically slurp it up as it furthers their job security.
I think it depends mostly on the skill level of your test team, the tools available, and the team culture with respect to how developers and testers interact with each other. My current situation is that we have a relatively technical test team. All testers are expected to have development skills. In our case, testers write the UI automation. If your test team doesn't have those skills, they will not be set up for success; in that case, it may be best for developers to write your UI automation.
Other factors to consider:
What other testing tasks are on the testers' plate?
Who are your customers and what are their expectations related to quality?
What is the skill level of the development team and what is their willingness to take on test automation work?
-Ron

Does participating in a System Verification Test team help you become a better programmer?

I have been developing applications for 9 years now - mainly in Java. Now I am being asked to participate in the SVT team for the next release. Overall this means installing complex system setups and running specific user scenarios on them, as well as doing long runs and load runs.
Overall I am positive about it, as I will learn something new. But I am also afraid of losing some grip and knowledge of programming, because I won't be doing much of it.
I know that programming in side projects, such as helping with open source projects, would be one alternative, but finding the time for that on top of family life and a full-time job is not that easy.
What do you think: does doing concrete testing work help you become a better software engineer?
Thanks in advance,
Michael
Testing isn't separate from programming.
You can still program automated systems so you get regression testing. From unit tests to really complex automated systems, the best I know is Selenium, which generates code you can use to build testing scripts in most languages.
There are other tools for non-web apps. But I personally believe that testing is quite far from "stopping coding", unless you're just doing user point-of-view testing.
You can also do error injection, which will have you writing small singletons to inject into the memory of your application.
So you can code while testing ;) and learn new stuff too.
Having been in a testing team, I think it really helps, because you'll learn to exploit code easily, which will pay off when you build your own API or app at a later date.
I would say it depends on your skill and temperament. Programming knowledge will serve you well while testing. At the same time, I know that it needs a different approach and mindset and is on a completely different career track. You can always keep up your programming skills by writing code for a project you like (even if you have to make one up).

What are you using for Distributed Caching in web farms running ASP.NET?

I am curious as to what others are using in this situation. I know a couple of the options that are out there like a memcached port or ScaleOutSoftware. The memcached ports don't seem to be actively worked on (correct me if I'm wrong). ScaleOutSoftware is too expensive for me (I don't doubt it is worth it). This is not to say that I don't want to hear about people using memcached or ScaleOutSoftware. I'm just stating what I "know" at this point.
So my question is basically this: for those of you ACTIVELY using distributed caching, what are you using, are you happy with it, and what should I look out for?
I am moving to two servers very soon...both will be at the same location. I use caching fairly heavily (but carefully) to reduce the load on my database server.
Edit: I downloaded Scaleout Software's solution. I've coded for it and it seems to work real well. I just have to decide if my wallet will part with the cash for it. :) Anyone have experiences good or bad with ScaleoutSoftware?
Edit Again: It's been a little while since I asked this. Any more thoughts on it? We ended up buying the solution from ScaleOutSoftware and have been happy with it, but I'm curious what others are doing.
Microsoft has a product pending, code-named Velocity. It's still in CTP and is moving slowly, but it looks like it will be pretty good. We'll be beating it up in the near future to see how it handles what we want it to do (> 2 million reads/writes per hour). Will post back with results.
There is a 100% native .NET, well-documented, open source (LGPL) project called Shared Cache. It looks like it is not yet mentioned on SO, but it's promising and should be able to do what most people expect from a distributed cache. It even supports different strategies like distributed or replicated caching, etc.
I will update this post with more details as soon as I've had a chance to try it on a real project.
We're currently using an incredibly simple cache that I wrote in a couple of hours, based on re-hosting the ASP.NET cache in a Windows Service (more info and source code here). I won't pretend it's anywhere near as optimised as something like Memcached but we were just looking for something simple and free until Velocity came along, and it's held up extremely well even under fairly heavy load.
It comes down to our personal preference for core components - i.e. ones that affect whether the site is available or not - that they are either (a) supported by a vendor with a history of rapid and high quality support, or (b) written by us so that if something goes wrong we can fix it quickly. Open source is all well and good, and indeed we do use some OSS, but if your site is offline then unfortunately newsgroups et al don't have a 1 hour SLA, and just because it's OSS doesn't mean you have the necessary understanding or ability to fix it yourself.
We are using the memcached port for Windows and we are very pleased with it. The enyim.com memcached client API is great and easy to work with. It's also open source, which is a big advantage, if you ask me.
We are now using this setup in a production web-app and it has helped a lot in improving its performance.
There's a great .NET wrapper/port found here on Codeplex. Awesomesauce!
We use memcached with the enyim library in a production environment (www.funda.nl). It works fine and we're very pleased with it, but we did notice a substantial rise in CPU usage on the clients, presumably due to the serialization/deserialization going on. We do around 1,000 reads per second.
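For anyone evaluating it, basic usage of the enyim client looks roughly like this (a minimal sketch; the client reads its server list from app.config, and the key and value here are hypothetical):

    using System;
    using Enyim.Caching;
    using Enyim.Caching.Memcached;

    class CacheExample
    {
        static void Main()
        {
            // Reads server addresses from the enyim configuration section of app.config.
            using (var client = new MemcachedClient())
            {
                // Store a value with an expiry, then read it back.
                client.Store(StoreMode.Set, "user:42:profile", "serialized profile data",
                    TimeSpan.FromMinutes(10));

                var profile = client.Get<string>("user:42:profile");
                Console.WriteLine(profile ?? "cache miss");
            }
        }
    }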
One tried and tested product, used by hundreds of customers worldwide, is NCache. It's a feature-rich product that lets you store session state in a redundant and highly available manner, lets you share data within the enterprise, and bridges WAN communication, essentially acting as a data fabric. Lastly, it lets you build an elastic caching tier so that when your application scales, you can add servers to the cache and actually boost performance further.
