We have been working on an incremental project for about 4 or 5 years, using the technologies listed at the end.
The project has been growing, and now I feel our methodology is not effective enough. Until now, every programmer who has worked on the project has had to learn the entire layer structure and the technologies surrounding it, and every new feature is assigned to a single person.
So we are slipping on delivery times, it's really hard to train someone and make them productive, and people on the team feel overwhelmed. I don't think it is a matter of money and resources. A debate is on, and I really feel we should work in pairs and in layers, becoming specialized in certain areas and working as a team. However, some argue that we can't work in layers, because a person might not be able to finish his part, since he won't be able to test it until the other member is done with his layer. Right now we are only 3 programmers.
So if you think these suggestions make sense, what I need is some concise, practical references on how we can turn this into a more positive team dynamic and how to work in layers with these technologies. I need practical solutions and arguments so we can steer the ship in the right direction. Can anyone point us in the right direction? It would be deeply appreciated. Thank you in advance!
Technologies:
Backend: Java + Spring + Hibernate + MySQL
Frontend HTML: JSTL + HTML
Frontend Flex: Flex SDK 3.5 + BlazeDS, Cairngorm, third-party libraries and sources.
Development OS: Mac or Windows
Development tools: Trac for management, SVN repository
Production environment: Debian or CentOS Linux, Tomcat 5.5
Tools: IntelliJ and Flash Builder
This is a pretty open-ended question with no real "right" answer, I think. One thing that can help enable working independently on different layers is to first design a contract/interface between the layers. Then you can work on both layers independently: on one side working to fulfill the contract/interface, and on the other side working to build on the data/functionality the contract provides. You can start out with a mock implementation of the contract/interface on one side, and a mock consumer of the data/functionality on the other. This works within your Java/Spring/Hibernate/MySQL backend as well as across the backend and frontend. You're still going to have times when you need to actually integrate your layers and test that integration, which will create dependencies between the completion of work in different layers.
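For illustration, here is a minimal sketch of such a contract plus a mock, in Java. All the names (Customer, CustomerService) are hypothetical stand-ins for whatever your layers actually exchange:

import java.util.Collections;
import java.util.List;

// Hypothetical domain class, for illustration only.
class Customer {
    final long id;
    final String name;
    Customer(long id, String name) { this.id = id; this.name = name; }
}

// The contract both layers agree on before splitting the work.
interface CustomerService {
    Customer findById(long id);
    List<Customer> search(String nameFragment);
}

// The frontend developer codes against this mock until the real
// Spring/Hibernate implementation is ready; the backend developer
// implements the same interface independently and swaps it in later.
class MockCustomerService implements CustomerService {
    public Customer findById(long id) {
        return new Customer(id, "Test Customer " + id);
    }
    public List<Customer> search(String nameFragment) {
        return Collections.singletonList(new Customer(1L, nameFragment + " Example"));
    }
}

The point of the mock is purely to unblock the other layer; it can be as dumb as the one above and still let both people make progress in parallel.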
I started as a software engineer at the company I'm currently at. Over time, I was the only one either willing or able to take responsibility for various systems, and so I was "promoted" to IT Manager. During my time as a software engineer, I would create functional tests for the various software modules I built, and as a result, even today I am able to quickly test the parts of the system that I have worked on. However, there is a large code base with little to no coverage from the various other developers who have worked here.
Now, as IT Manager, I want to be able to test that all the parts of the system are working, but there is:
A) no budgeted time dedicated to creating code test coverage
and
B) No desire from the "chief software engineer" to start creating testing suites to help me monitor that the software is functioning.
I don't expect the software team to drop everything they are doing and spend 2 weeks creating test suites, but it would be nice if they started expanding the test suite coverage over time so I can confirm that the various parts of the system are working.
So boiling it down, how do I get the software team to start building test suites?
Other caveats:
A) I'm still asked to do software projects in addition to managing our IT department (a Unix engineer, a desktop support guy, and related office and production equipment)
B) My Unix admin has a really hard time getting production systems up and running with the full code base, and we aren't getting good help from the software team. He can't run any kind of diagnostic to see where the web app is failing on the new installs. The VP of the company keeps telling me to go in and do print_r's in the code to see what is happening. This sucks!!!
First, you need to investigate Test Driven Development so that you are comfortable explaining it in terms that your developers will understand, as well as your management. Since you seem to be developing web applications, and you have technical skills, I suggest that you take the plunge and choose an open source tool for testing web applications, get it installed, and start building tests for anything that you develop yourself.
Twill is an example of the kind of testing tool that you would need.
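Twill itself is Python-based, but the language of the test doesn't have to match the language of the app, since a functional test only drives it over HTTP. As an illustration only, a minimal first test in Java with HtmlUnit; the URL and expected title are hypothetical:

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

// A first functional test: load the home page and check its title.
public class HomePageTest {
    public static void main(String[] args) throws Exception {
        WebClient client = new WebClient();
        HtmlPage page = client.getPage("http://localhost:8080/");
        if (!"My Application".equals(page.getTitleText())) {
            throw new AssertionError("Unexpected title: " + page.getTitleText());
        }
        System.out.println("Home page OK");
    }
}

Something this small is enough to start with; the value comes from running it on every build, not from its sophistication.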
Then, as manager, you need to entice developers to follow your example, and reward them for doing so. And punish them when not using the testing framework leads to preventable problems. As soon as you get one such incident, you should be able to get your boss on board and pick up some momentum.
Overall, remember that the objective is to do less work to get a good result. Cutting corners is a way of doing less work, but it carries the risk of bad, or spectacularly bad, results. Keep management informed of the risk levels and the costs at stake.
Don't just force people to do testing for testing's sake. It has to help them be more productive so choose the first projects for it carefully.
That's a good question. And if there were one correct answer to it, many more software projects would be successful and deliver high quality.
I don't think it is a good idea to make such a change top-down. It has to be driven by the developers themselves. So training in the TDD direction would be good, but that is a long-term investment that takes time.
If you want a faster solution, you should consider functional, acceptance, and system tests. With these tests you exercise pretty much the whole application through all layers. If you are developing web applications, you should consider using Selenium to automate your tests. It is easy to create tests with it (via the Selenium IDE).
But using only such tests (and no unit tests) doesn't give you the advantages that come from TDD.
Automating your tests is crucial.
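To make the Selenium suggestion concrete, here is a minimal sketch of a system test using the Selenium (RC-era) Java client. The URL, field locators, and expected text are hypothetical:

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

// A system test that goes through every layer: browser, server code, database.
public class LoginSystemTest {
    public static void main(String[] args) {
        Selenium selenium =
            new DefaultSelenium("localhost", 4444, "*firefox", "http://localhost:8080/");
        selenium.start();
        selenium.open("/login");
        selenium.type("username", "demo");
        selenium.type("password", "secret");
        selenium.click("loginButton");
        selenium.waitForPageToLoad("30000");
        if (!selenium.isTextPresent("Welcome")) {
            throw new AssertionError("Login flow broken: no 'Welcome' text");
        }
        selenium.stop();
    }
}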
Do you have a Test or QA team?
I would first check whether they have test cases that they use to qualify the build. If not, you will have to develop test cases covering the core functionality of your product.
The next step would be automating those test cases.
If the application was developed without any troubleshooting tools or debugging features, it will be tough going until these are added as requirements for the next release.
My 2 cents.
I'll have to disagree with michaelkebe: these changes need support from the executive level, in addition to a few key developers, in order to fully succeed.
Without that support, you'll just be some developers who look like they are 'wasting time on writing tests for stuff that already works.'
There needs to be a clear vision, and it needs to be repeated loudly and often.
I'm not necessarily advocating for Agile here, but it often clicks for business owners.
If you can sell them on that, the things that you're excited about (delivering software fast, easy maintenance, automated testing, etc.) will fall into place.
We're in the initial stages of a large project, and have decided that some form of automated UI testing is likely going to be useful for us, but have not yet sorted out exactly how this is going to work...
The primary goal is to automate a basic install and run-through of the app, so if a developer causes a major breakage (e.g., the app won't install, the network won't connect, a window won't display), the testers don't have to waste their time (and get annoyed) installing and configuring a broken build.
A secondary goal is to help testers when dealing with repetitive tasks.
My question is: Who should create these kinds of tests? The implicit assumption in our team has been that the testers will do it, but everything I've read on the net always seems to imply that the developers will create them, as a kind of 'extended unit test'.
Some thoughts:
The developers seem to be in a much better position to do this, given that they know control IDs, classes, etc., and have a much better picture of how the app works
The testers have the advantage of NOT knowing how the app is working, and hence can produce tests which may be much more useful
I've written some initial scripts using IronRuby and White. This has worked really well, and is powerful enough to do literally anything, but then you need to be able to write code to write the UI tests
All of the automated UI test tools we've tried (TestComplete, etc) seem to be incredibly complex and fragile, and while the testers can use them, it takes them about 100 times longer and they're constantly running into "accidental complexity" caused by the UI test tools.
Our testers can't code, and while they're plenty smart, all I got were funny looks when I suggested that testers could potentially write simple Ruby scripts (even though said scripts are about 100x easier to read and write than the mangled mess of buttons and datagrids that seems to be the standard for automated UI test tools).
I'd really appreciate any feedback from others who have tried UI automation in a team of both developers and testers. Who did what, and did it work well? Thanks in advance!
Edit: The application in question is a C# WPF "rich client" application which connects to a server using WCF
Ideally it should be QA who end up writing the tests. The problem with a programmatic solution is the learning curve involved in getting the QA people up to speed with the tool. Developers can certainly help with this learning curve by mentoring, but it still takes time and is a drag on development.
The alternative is to use a simple GUI tool which is backed by a language (and data scripts) and enables QA to build scripts visually, delving into the finer details of the language only when really necessary; development can get involved here too.
The most successful attempts I've seen have definitely been with the latter, but setting this up is the hard part. Selenium has worked well for simple web applications and simple threads through the application. JMeter (for scripted web conversations for web services) has also worked well. Another option is an in-house test harness: a simple tool on top of a scripting language (Groovy, Python, Ruby) that allows QA to put test data into the application either via a GUI or via data files. The data files can be simple properties files or, in more complex cases, structured data files (something like YAML or even Excel). That way they can build the basic smoke tests to start, and later expand that into various scenario-driven tests.
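As a minimal sketch of that harness idea, in Java; the file name and property keys are hypothetical:

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// QA edits the data file, never the code.
public class SmokeTestHarness {
    public static void main(String[] args) throws IOException {
        Properties data = new Properties();
        data.load(new FileInputStream("smoke-test.properties"));

        String baseUrl  = data.getProperty("baseUrl", "http://localhost:8080/");
        String username = data.getProperty("login.username", "demo");

        // Hand the data to whatever driver you use (Selenium, HtmlUnit, ...).
        System.out.printf("Running smoke tests against %s as %s%n", baseUrl, username);
    }
}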
Finally... I think rich client apps are way more difficult to test in this way, but it depends on the nature of the language and the tools available to you...
In my experience, testers who can code will switch jobs for a pay raise as developers.
I agree with you on the automated UI testing tools. Every place I've worked that was rich enough to afford WinRunner or LoadRunner couldn't afford the staff to actually use it. The prices may have changed, but back then these carried high five-digit to low six-digit price tags (think the price of a starter home). The products were hard to use, and were usually kept uninstalled in a locked cabinet because everyone was afraid of getting in trouble for breaking them.
I worked over 7 years as an application developer before I finally switched to testing and test automation. Testing is much more challenging than coding, and any automation developer who wants to succeed should master testing skills.
Some time ago I put my thoughts on skill matrices in a couple of blog posts.
If you're interested in discussing:
http://automation-beyond.com/2009/05/28/qa-automation-skill-matrices/
Thanks.
I think having the developers write the tests will be of the most use. That way, you can get "breakage checking" throughout your dev cycle, not just at the end. If you do nightly automated builds, you can catch and fix bugs when they're small, before they grow into huge, mean, man-eating bugs.
What about the testers proposing the tests, and the developers actually writing them?
I believe at first it largely depends on the tools you use.
Our company currently uses Selenium (We're a Java shop).
The Selenium IDE (which records actions in Firefox) works OK, but developers need to manually correct mistakes it makes against our webapps, so it's not really appropriate for QA to write tests with.
One thing I tried in the past (with some success) was to write library functions as wrappers for Selenium functions. They read as plain English:
selenium.clickButton("Button Text")
...but behind the scenes they check for proper layout and tags on the button, that it has an id, etc.
Unfortunately this required a lot of setup to allow easy writing of tests.
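For illustration, a minimal Java sketch of that wrapper idea; the locator strategy and checks are hypothetical, but the Selenium RC calls are real:

import com.thoughtworks.selenium.Selenium;

// Wraps raw Selenium calls behind a plain-English API for test writers.
public class FriendlySelenium {
    private final Selenium selenium;

    public FriendlySelenium(Selenium selenium) {
        this.selenium = selenium;
    }

    // Reads as plain English in tests: clickButton("Button Text")
    public void clickButton(String buttonText) {
        String locator = "//button[text()='" + buttonText + "']";
        if (!selenium.isElementPresent(locator)) {
            throw new AssertionError("No button with text: " + buttonText);
        }
        // Behind the scenes, also verify the button is tagged with an id.
        String id = selenium.getAttribute(locator + "@id");
        if (id == null || id.length() == 0) {
            throw new AssertionError("Button '" + buttonText + "' has no id attribute");
        }
        selenium.click(locator);
        selenium.waitForPageToLoad("30000");
    }
}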
I recently became aware of a tool called Twist (from ThoughtWorks, built on the Eclipse engine), which is a wrapper for Selenium that allows plain-English-style tests to be written. I am hoping to be able to give this to the testers, who can then write simple assertions in plain English!
It automatically creates stubs for new assertions too, so the testers could write the tests, and pass them to developers if they need new code.
I've found the most reasonable option is to have enough specs that the QA folks can stub out the tests: basically figure out what they want to test at each 'screen' or on each component, and stub those out. The stubs should be named so that they're very descriptive about what they're testing. This also offers a way to crystallize functional requirements. In fact, doing the requirements in this fashion is particularly easy, and helps non-technical people really work through the muddy waters of their own thought process.
The stubs can be filled in via a combination of QA/dev people. This allows you to CHEAPLY train QA people as to how to write tests, and they typically slurp it up as it furthers their job security.
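As a minimal sketch of what such descriptively named stubs might look like (JUnit 4; the screen and behaviors are hypothetical):

import org.junit.Ignore;
import org.junit.Test;

// Stubs written from the specs; the names double as functional requirements.
public class LoginScreenTest {

    @Test @Ignore("stub - to be filled in by QA/dev")
    public void rejectsEmptyPassword() { }

    @Test @Ignore("stub - to be filled in by QA/dev")
    public void locksAccountAfterThreeFailedAttempts() { }

    @Test @Ignore("stub - to be filled in by QA/dev")
    public void redirectsToDashboardOnSuccess() { }
}

Because ignored stubs show up in every test run, the list of pending requirements stays visible until someone fills them in.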
I think it depends mostly on the skill level of your test team, the tools available, and the team culture with respect to how developers and testers interact. My current situation is that we have a relatively technical test team; all testers are expected to have development skills. In our case, testers write the UI automation. If your test team doesn't have those skills, they will not be set up for success, and in that case it may be best for developers to write your UI automation.
Other factors to consider:
What other testing tasks are on the testers' plate?
Who are your customers and what are their expectations related to quality?
What is the skill level of the development team and what is their willingness to take on test automation work?
-Ron
I have been developing applications for 9 years now, mainly in Java. Now I am asked to participate in the SVT team for the next release. Overall this means installing complex system setups and running specific user scenarios on them, as well as doing long runs and load runs.
Overall I am positive about it, as I will learn something new. But I am also afraid of losing some grip and knowledge of programming, because I won't be doing much of it.
I know that programming in side projects, such as helping with open source projects, is one alternative, but finding the time on top of family life and a full-time job is not that easy.
What do you think: does doing concrete testing work help you become a better software engineer?
Thanks in advance,
Michael
Testing isn't separate from programming.
You can still program automated systems so you can have regression testing. From unit tests to really complex automated systems, the best I know is Selenium, which generates code you can use to build testing scripts in most languages.
There are other tools for non-web apps. But I personally believe that testing is quite far from "stopping coding", unless you're just doing user point-of-view testing.
You can also do error injection, which will have you writing small singletons to inject into the memory of your application.
So you can code while testing ;) and learn new stuff too.
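A minimal sketch of that error-injection idea in Java. PaymentGateway and OrderService are hypothetical stand-ins for whatever dependency and caller you want to exercise under failure:

// A test double that always fails is injected to drive the error path.
interface PaymentGateway {
    void charge(String account, long cents);
}

class OrderService {
    private final PaymentGateway gateway;
    OrderService(PaymentGateway gateway) { this.gateway = gateway; }

    boolean placeOrder(String account, long cents) {
        try {
            gateway.charge(account, cents);
            return true;
        } catch (RuntimeException e) {
            return false; // should degrade gracefully, not crash
        }
    }
}

public class FaultInjectionTest {
    public static void main(String[] args) {
        PaymentGateway failing = new PaymentGateway() {
            public void charge(String account, long cents) {
                throw new RuntimeException("injected fault");
            }
        };
        if (new OrderService(failing).placeOrder("acct-1", 100)) {
            throw new AssertionError("Order should fail when the gateway is down");
        }
        System.out.println("Error path handled gracefully");
    }
}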
Having been on a testing team, I think it really helps, because you'll learn to exploit code easily, and that will show when you build your own API or app at a later date.
I would say it depends on your skill and temperament. Programming knowledge will serve you well while testing. At the same time, I know that it requires a different approach and mindset and is on a completely different career track. You can always keep up your programming skills by writing code for a project you like (even if you have to make one up).
I am tasked with improving the performance of a particular page of the website that has an extremely high response time, as reported by Google Analytics.
A few Google searches reveal a product that came with VS2003 called ACT (Application Center Test) that did load testing. This doesn't seem to be distributed any longer.
I'd like to be able to get a baseline test of this page before I try to optimize it, so I can see what my changes are doing.
Profiling applications such as dotTrace from JetBrains may play into it, and I have already isolated some operations within the page that are taking a while, using trace.
What are the best practices and tools surrounding performance and load testing? I'm mainly looking to be able to see results not how to accomplish them.
Here is an article showing how to profile using the VSTS profiler:
If broken it is, fix it you should
Also, apart from all the tools, why not try enabling the "Health Monitoring" feature of ASP.NET?
It provides some good information for analysis. It emits essential information related to process, memory, disk usage, counters, etc. Health Monitoring together with VSTS load testing gives you a good platform for analysis.
Check out the link below:
How to configure HealthMonitoring?
Also, for a reference checklist, have a look at the following rules/tips from Yahoo:
High performance website rules/tips
HttpWatch is also a good tool for identifying specific performance issues.
HttpWatch - Link
Also have a look at some of the tips here:
10 ASP.NET Performance and Scalability Secrets
Take a look at the ANTS Profiler from Red Gate. I use a whole slew of the Red Gate products and am very satisfied!
There are a lot of different paths you can go down. Assuming an MS environment, you can leverage some of the Team System tools, such as MS Team Test, to record tests and perform load testing against your site. These can be set to run as part of an automated build process.
A list of tools is located at: http://www.softwareqatest.com/qatweb1.html#LOAD
Now, you might start off simple. In this case, install two Firefox plugins: Firebug and YSlow for Firebug. These will give stats and point out issues such as page size, the number of requests made to get the page, etc. They will also make recommendations on some things to fix.
Further, you can use unit tests to execute a lot of the code-behind to see what functions are hurting you.
Based on what I see here, you can do all sorts of testing if you have a full MS dev system with TFS and Visual Studio Team Edition.
I recently had a nice .NET bug that was running rampant. This tool sorta helped, but in your case, I could see it working nicely:
http://www.jetbrains.com/profiler/
Most of the time we've used WCAT from Microsoft. If your searches were bringing up ACT, then this is probably the tool you want to grab if you are looking for requests per second and the like. Mike Volodarsky has a good post pointing the way on how to get started with it.
We use it quite a bit internally for testing our network infrastructure or new web applications, and it is incredibly flexible once you get going with it. In every demo Microsoft has done for us with new web tech, they seem to bust out WCAT to show off the improvements.
It's command line driven so it's kinda old school, but if you want power and customization it can't be beat. Especially for free.
We also use dotTrace on our own applications when trying to track down performance issues, and the Red Gate tools are nice as well. I'd definitely recommend a combination of the two. They both give you some pretty solid numbers to track down which part of your app is the slowdown, and I can't imagine life without dotTrace.
Visual Studio Test Edition (2008 or 2010) comes with a very good load testing component for ASP.NET apps.
It allows you to gather all the perfmon stats for a server (from basics like CPU and disk waits to garbage collection and SQL locks).
Create a load test for the page and run it, storing the stats in a database for the baseline. Subsequent runs can then be compared.
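If you don't have Test Edition, the baseline idea itself is simple enough to sketch in a few lines. A minimal illustration in plain Java (the URL, thread count, and request count are hypothetical): hit the page concurrently, record the average response time, and rerun after each optimization to compare against the baseline.

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// A crude load driver: N threads each make M requests and time them.
public class PageLoadBaseline {
    public static void main(String[] args) throws Exception {
        final String pageUrl = "http://localhost/slow-page.aspx";
        final int threads = 10;
        final int requestsPerThread = 50;
        final AtomicLong totalMillis = new AtomicLong();

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(new Runnable() {
                public void run() {
                    for (int i = 0; i < requestsPerThread; i++) {
                        try {
                            long start = System.currentTimeMillis();
                            HttpURLConnection conn =
                                (HttpURLConnection) new URL(pageUrl).openConnection();
                            conn.getResponseCode();      // forces the request
                            conn.getInputStream().close();
                            totalMillis.addAndGet(System.currentTimeMillis() - start);
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
        System.out.println("Average response time: "
            + totalMillis.get() / (threads * requestsPerThread) + " ms");
    }
}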