I have a developer on my staff who chronically overshoots deadlines and estimates. On several projects, during the last week or two, every day I hear "It should be done by the end of the day." This developer does good work.
I have already spoken to him about these problems. He seems genuinely frustrated, and at a loss about what to do to correct them.
My questions are:
What kinds of punishments for passing a deadline are effective?
What ways can I coerce this employee to police his actions (time estimates, etc.) himself?
UPDATE:
Based on the responses, here's what I have figured out.
Punishment is a bad idea.
It is natural for an employee to be unable to fix estimating problems without intervention.
Don't set deadlines unless there are real company consequences (e.g. a lost contract) for missing them.
Utilize available methods (Agile, Joel's checklist) to help the developer estimate better.
Thanks for the links and information. Also thanks for updating my thinking.
I don't think the problem is that he is missing these deadlines.
I think he has a real problem in estimating the amount of time it will take to complete a task.
Have him start keeping a journal of what he says a task will take and how long it actually took him to complete the task. Eventually, this journal will become a sort of guide for him to create better estimates. Once he becomes better at estimating, he shouldn't feel as rushed or harried.
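To make the journal concrete, here is a minimal sketch (Python, with made-up tasks and numbers) of the kind of record he could keep and the correction factor that falls out of it:

    # Minimal sketch of an estimate journal: record estimated vs. actual hours
    # per task, then derive a personal correction factor for future estimates.
    # Task names and numbers are hypothetical.
    journal = [
        {"task": "Parse import file", "estimated_h": 4, "actual_h": 7},
        {"task": "Add settings dialog", "estimated_h": 6, "actual_h": 9},
        {"task": "Fix pagination bug", "estimated_h": 2, "actual_h": 5},
    ]

    ratios = [entry["actual_h"] / entry["estimated_h"] for entry in journal]
    correction = sum(ratios) / len(ratios)  # average overrun factor

    raw_estimate_h = 8
    print(f"Correction factor: {correction:.2f}")
    print(f"Adjusted estimate: {raw_estimate_h * correction:.1f} hours")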
There is an interesting article by Joel Spolsky: Evidence Based Scheduling
1) Break ‘er down
When I see a schedule measured in days, or even weeks, I know it’s not going to work. You have to break your schedule into very small tasks that can be measured in hours. Nothing longer than 16 hours.
This forces you to actually figure out what you are going to do. Write subroutine foo. Create this dialog box. Parse the Fizzbott file. Individual development tasks are easy to estimate, because you’ve written subroutines, created dialogs, and parsed files before.
If you are sloppy, and pick big three-week tasks (e.g., “Implement Ajax photo editor”), then you haven’t thought about what you are going to do. In detail. Step by step. And when you haven’t thought about what you’re going to do, you can’t know how long it will take.
Setting a 16-hour maximum forces you to design the damn feature. If you have a hand-wavy three week feature called “Ajax photo editor” without a detailed design, I’m sorry to be the one to break it to you but you are officially doomed. You never thought about the steps it’s going to take and you’re sure to be forgetting a lot of them.
The main point is that he (and you) should learn from his mistakes, and take them into account on the next estimation.
Also, if you are a developer yourself, I would do a regular code review at the end of the day to get better insight into his development process.
And, of course, smaller iterations and more granularity with tasks. Set the maximum task duration to 1 day. That's the rule we have.
If your first question is
what kind of punishments to be considering
I think you're on a loser straight off. If you feel he does good work, you may have to look at the deadlines/estimates and see if they were realistic in the first place. Who set them? If the developer in question was not involved, that may be part of the problem.
I agree with #OTisler that pair programming, and possibly a regular end-of-day progress review with you, can help him through... although if the deadlines/estimates were unrealistic to begin with, that's not where your problem lies.
Closer monitoring on a few specific tasks should highlight where any issues lie.
What kinds of punishments for passing a deadline are effective?
None. If you anger him, he won't do the work, or he'll find another job. You should help him figure out why his estimates are off. There is a book by Steve McConnell about making estimates (Software Estimation: Demystifying the Black Art); I would start there.
What ways can I coerce this employee to police his actions (time estimates, etc.) himself?
By helping him find the right way to make estimates.
First, make sure you are crystal clear in your requirements.
I hate to say it, but in my experience, blown deadlines are just as often a matter of unclear requirements or weak specifications on the part of a supervisor. First thing to do is to make sure the problem isn't either originating with, or exacerbated by, you.
Also, make sure your requirements are realistic, as well as his estimates.
Make sure that your own expectations aren't pushing him to make unrealistic estimates in order to meet unrealistic requirements.
Remember, you do the requirements, but the developer ALWAYS does the estimates, and should not be swayed with "can we do this any faster" unless you are also specifying functionality to be dropped.
Then, make sure he is tracking his time/tasks accurately, so you can get a good view of what is going on with the project.
This process will show any lack of proper time/task tracking, which may end up being the first step to improvement. If you can't see after the project how long a particular item took, that is probably the cause of the problem right there - not enough definition in the estimate, or missing "dependency" tasks that are discovered mid-project, but never estimated.
You HAVE to know how much time was spent doing what, accurately, before you can find out where the creep was, or what can be done about it.
Then, see where his estimates are failing and figure out why. Go over an estimate of a blown project, make that into a project itself - a problem to be solved.
Once you've determined that his estimates are indeed the source of the problem, go over an estimate that went over with him, and perhaps another developer, and figure out why.
This will help you figure out what the cause of the problem is. A solid understanding of the problem will likely be the actual solution.
Lastly, if you actually reach a point where you have to try punishment or coercion, it's time to fire him and start over.
Punishment and Coercion are appropriate responses to willful wrongdoing in certain situations.
However, if this developer is actively trying to do a good job, then you would only worsen the situation by generating negative attitude and frustration.
If the problem can't be solved, and you are sure the problem is with him, and not you, then it's time to fire him and get a developer who can meet deadlines. Great work doesn't mean much when your costs are blown up and profit goes out the window.
Okay, this is fairly common--developers being optimistic. It's the job of management to deal with it. If anyone should be punished, it's the manager (you?).
I'm glad you at least asked. It looks like you got some good answers here; I hope they help and that you find a way to actually implement some that work.
When I was young, my first good manager dealt with it this way:
First of all, he had me come up with an itemized list--breaking tasks down to hours, and estimating each one with a very liberal estimate--no period should be less than 4 hours regardless of how small the task was.
Then he looked at them and told me to double all my estimates. (Developers, especially younger developers, don't think about the fact that you are only productive for about half the day, if you're lucky--and half of that is spent on things you didn't expect to have to do.)
Then, before creating his schedule, he doubled all my estimates again (without telling me).
He turned them in this way regardless of schedule requirements from above. A good manager should realize that saying it needs to be done in 2 days doesn't make it possible.
As I got better at estimating we both noticed and adjusted accordingly.
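For illustration, a minimal sketch of that padding scheme, with hypothetical raw estimates in hours:

    # Minimal sketch of the padding scheme described above, with made-up
    # numbers: floor each item at 4 hours, the developer doubles once, and
    # the manager quietly doubles again before scheduling.
    raw_estimates_h = [1, 3, 6, 10]

    floored = [max(est, 4) for est in raw_estimates_h]
    developer_view = [est * 2 for est in floored]        # what I handed in
    schedule_view = [est * 2 for est in developer_view]  # what went on the plan

    print(developer_view)  # [8, 8, 12, 20]
    print(schedule_view)   # [16, 16, 24, 40]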
A manager's job isn't just to deliver a project, it's to build a team. More often than not that's going to require training of some sort. This is also the reason that an engineering manager who is not an engineer is unacceptable: they can't really help with this kind of thing.
Failure of a project or schedule is VIRTUALLY NEVER the fault of the developer (except in a few chronic cases where he isn't really fixable or of any worth and needs to be fired). The manager has made bad decisions either in hiring the developer, trusting him, managing him or staffing the project.
And really, what is fault anyway? I suppose if the manager isn't very good at making the project happen, he's going to need someone to point at... If HIS manager is any good, he'll ask why it got this far, what you did to fix it, etc.
Hiring a manager is hiring someone to solve the problems. To make the developers productive. If he can't make them productive, he isn't the right person.
To your questions:
If you choose to punish people for missing deadlines you will not get good results. They will be demotivated and feel belittled. If you keep pushing people to meet deadlines the quality of work will suffer and you will end up with a lot of time spent bug fixing afterwards.
To improve his time estimates you could try using Joel Spolsky's evidence based scheduling which has a nice feedback loop to improve the resulting estimates.
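For anyone who hasn't read it: the core of evidence based scheduling is, roughly, a Monte Carlo simulation over the developer's own history of estimate-to-actual ratios. A minimal sketch of the idea, with a made-up history:

    import random

    # Hypothetical history of (estimated_h, actual_h) pairs for this developer.
    history = [(4, 6), (8, 8), (2, 5), (16, 20), (6, 7)]
    velocities = [est / act for est, act in history]  # estimate/actual ratios

    remaining_estimates_h = [8, 16, 4, 12]  # hypothetical remaining tasks

    def simulate_total_hours():
        # Divide each estimate by a randomly sampled historical velocity.
        return sum(est / random.choice(velocities) for est in remaining_estimates_h)

    totals = sorted(simulate_total_hours() for _ in range(10_000))
    print("50% confidence:", round(totals[len(totals) // 2]), "hours")
    print("95% confidence:", round(totals[int(len(totals) * 0.95)]), "hours")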
But I have some questions that I think you need to think about.
Is he later than everybody else? If so why - is it because he is an over optimistic estimator or a slow worker? Over optimistic estimates are easy to fix - just multiply all his numbers by a factor as per evidence based scheduling above. If he is a slow worker why? Does he get distracted? Is he very careful to produce very low defect code? Is he over engineering solutions? Is he not re-using code effectively?
Do the deadlines matter, or are they just arbitrary dates based on the estimates for the purposes of reporting progress up the management hierarchy? If the latter you can solve this by tweaking his estimates yourself.
What kinds of punishments for passing a deadline are effective?
You stated the point and missed it. The obvious punishment for passing a deadline is death. If the developer is still alive after passing a deadline the "deadline" obviously was not a real deadline. Do you think it's funny to put developers under pressure using martial language?
Fix your wording.
Motivation
First of all: Read Peopleware
Next. Why do you think punishment will be an effective way to manage people who are supposed to be creative? I think you have to rethink the whole approach to management vs. team.
As I see it, the manager's first, and most important, role is to make sure that the developers can be creative and productive. Not that they are productive. There is a big difference in those small words. To be creative you need a safe environment. By keeping people constantly under pressure from both deadlines and threats of punishment you create the exact opposite of a safe one.
Also, as a manager, you need accurate information on which to base decisions. This also requires a safe environment. If there is a risk for punishment for being honest and outspoken you are guaranteed to get lies and absence of information. A very dangerous base to take decisions from.
Estimates
As others have pointed out, estimates are estimates. In our team we don't do any individual estimates at all; we do estimates as a team. (I'm a bit reluctant to call what we do Scrum, but most of it tries to emulate it, if nothing else.) I think this is a really great way to do estimates: each team member is given a deck of cards consisting of the numbers 0, 1/2, 1, 3, 5, 8, 13, 20, 40, 60, 100, and when estimating a task each developer picks a card (the cards are hidden until everyone has picked one, to avoid influencing estimates) and the average of the selected cards is taken as the estimate.
Notice how the numbers get progressively less accurate. This is by design, because large estimates are by necessity less accurate.
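A minimal sketch of one such estimation round, with hypothetical team members and picks:

    # Everyone picks a card in secret, the cards are revealed together, and
    # the average of the picks becomes the estimate. Names and picks are
    # hypothetical.
    DECK = [0, 0.5, 1, 3, 5, 8, 13, 20, 40, 60, 100]

    picks = {"anna": 5, "ben": 8, "carol": 5, "dave": 13}
    assert all(p in DECK for p in picks.values())

    estimate = sum(picks.values()) / len(picks)
    print(f"Team estimate: {estimate:.1f} ideal man days")  # 7.8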
For our team we have opted to use the unit "ideal man days" for estimates. As far back as any of us can remember, an ideal day hasn't occurred yet, but it is a good basis when you know how to translate calendar days to "ideal man days".
As Scrum prescribes, development is done in sprints of two weeks after which the new version is deployed in the production environment. After each sprint we take the sum of the estimates of the completed tasks and divide that by the planned man days for the sprint. This factor is then the basis of estimating how many "ideal man days" the team can spend in a two week period.
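With made-up numbers, that calculation looks like this:

    # Minimal sketch of the factor described above: sum of the estimates
    # ("ideal man days") of the tasks completed in the sprint, divided by
    # the man days planned for the sprint. Numbers are hypothetical.
    completed_estimates = [3, 1, 5, 2, 1]   # ideal man days of finished tasks
    planned_man_days = 4 * 10               # 4 people, two weeks of 5 days

    factor = sum(completed_estimates) / planned_man_days
    print(f"Factor: {factor:.2f}")  # 0.30

    # Basis for the next two-week sprint: how many ideal man days to plan for.
    print(f"Plan for about {planned_man_days * factor:.0f} ideal man days")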
Actual work items done by an individual developer don't need an estimate. The first approximation is always 1/2 - 1 day to complete. If this estimate turns out to be false you just grab a fellow developer and do it together to get it done. Or you break down the work item in smaller tasks so it can be distributed better.
Set Milestones and try Agile as #OTisler suggested.
I don't think you should punish him. Just get him to understand how to make accurate estimates.
As a team lead I've had my team members tell me that it will be "no problem" to finish X feature by the deadline. Then I usually sit down with them and go over what tasks and sub-tasks I think need to be done in order for the feature to be finished, and how long the developer thinks each will take.
After we do this exercise, and add up all the task and sub-task estimates, it will inevitably take much longer than the developer thinks in their original estimate. I usually only have to do this exercise with them a few times before they start making more accurate estimates.
What amazes me is that you only have one of these guys.
Engineers are horrible at estimating how much time something will take. I bet if you look carefully at your other developers' estimates, you'll find a lot of padding. Sometimes the padding isn't necessary, but the task expands to fill the available time anyway.
The solution to this is to change around how you do estimates - for everyone. Developers may be bad at estimating absolute time, but they're pretty good at relative time. So on Monday, instead of "how long will it take to add a whoosiwhatsit?," ask "what can you get done on the whoosiwhatsit in less than a week?" That becomes their task for the week.
The following Monday you look at how it went. "Well, I got the floogle installed in two days but it turns out it impacted the mcphee...so this week I need to decouple those guys so the whoosiwhatsit files don't get overwritten." Ok, there's their task for the week.
You might think it won't help, because you still don't know when the whoosiwhatsit is going to be ready. That's true. You have two choices here:
If you need a deadline, then you have to force your errant developer to pad his estimates like everyone else. It won't take him long to get the hang of it, and in no time at all he'll be taking "2 weeks" to write something that should have taken a day.
Your other choice is to trade the fictitious estimates for more visibility. In the long run this approach gets you more productive and much happier engineers.
So the developer does good work, but is poor at estimating the amount of time for delivery? I'm not sure you have a punishment situation on your hands just yet.
Maybe going forward for some time, have him walk you through his process for estimating a delivery point. This can be an opportunity to ask him why steps X,Y, and Z take certain amounts of time. He may find himself revising his estimates simply by doing the exercise at what is almost certainly a slower pace.
Ask yourself this: what does your job entail?
If you're just blindly passing estimates from developers (who you know can't give good estimates) up the management line, and not deciding for yourself whether that estimate is achievable, then you're not doing your job.
Try to think in terms of "value-add" (one of my old employers used that term a lot, and I hated it, but it probably works for you in this situation). What value are you adding? If you're just passing stuff in both directions between upper management and the developers, then ultimately you're not earning your money. You could be removed, and nothing would change.
The best manager I ever had was one that looked through a set of requirements given to him by another team, and told them straight out that almost a third of them were bull, and had them removed, before I ever even saw the list. The worst one I ever had made me write all this extra management-type documentation which none of the other managers I'd ever had asked me to do (I really got the impression I was literally doing his job for him), didn't even give me project due dates, and hardly turned up to work. They were both in the same company, bizarrely enough.
90 hours is one common short-project deadline. An easy approach: instead of estimating your own time, measure someone else's. Computer programmers shouldn't make time estimates for their own projects, since evidence shows that estimating one's own time results in larger error than observing and estimating another person's.
How can you stop the dev team from padding their numbers when it comes to creating task times? How is there any motivation for them to do their work if there are no real deadlines and they are just measured against their velocity?
Get the job done by this deadline
vs
Get the job done whenever; we will reduce scope or quality, or increase resources
The foundation of most agile methods is trust. If you don't trust your team, why did you hire them in the first place? If you think they are not up to the task, it is better not to start the project, because it is almost surely doomed to failure - either because you are right and the team is a bunch of inept developers, or because you are wrong but your lack of trust and overcontrolling suffocates the team and actually saps their commitment and enthusiasm.
OTOH if you are sure that you have hired the best guys available, and that they are talented and motivated, the best is to give them a good challenge, let them work and try to remove all obstacles from their way.
In my experience most developers start out enthusiastic and motivated to create products they can be proud of. However, impossible deadlines, unrealistic expectations, too much bureaucracy, overcontrolling management - and last but not least, reduced quality - can quickly kill that motivation.
The point of agile methods is that developers are the right people to know / estimate the cost of a specific feature. If management insists on dictating the resources, scope and time allotted to the project, it almost always results in disaster. If OTOH the developers are given trust and responsibility, they will usually live up to the task. In Scrum, the team together works out the estimates and fights problems / issues coming up during the sprint. In a good project, team members quickly gel with the team and feel personal responsibility for the project. This can go as far as poking laggard members to produce results instead of pulling the team back.
In my experience, developers pad numbers due to uncertainty. Are your product owners clear in their business requirements? Are the stories sufficiently small to estimate against?
Your 2 choices indicate to me that you have a fundamental misunderstanding of Scrum. The promise of Scrum is not getting the same result, just faster. It is the ability to iterate quickly, respond to feedback and alter course. The foundation of Scrum is the self running team. If you don't have teams you trust, Scrum probably isn't for your organization.
As Péter said in his response, team motivation is key and the quickest way to kill that is to undermine them. If the team feels management doesn't support or believe in them they will have no reason to make aggressive estimates and will simply cover their own butts. It's your job as a manager to help them succeed.
Additionally, the agile methodologies promote responsibilities to the peer. You shouldn't have just one person estimating items. Make it a group thing, and make sure (mostly) everyone agrees to the estimate. If you have your whole team colluding against you to pad estimates, you have a lot more problems than you think.
Besides, there's a lot of opportunity to pad estimates and make excuses in traditional waterfall / BDUF (big design up front) effort. I would say that scrum, with the daily scrum meetings, helps to contain this more than it helps to promote it.
I wrote a blog post a while back on estimation anti-patterns. It's a funny read but sadly all of the patterns are things I've either seen done or heard of from colleagues. We've gone through about 3 of them on our current project; I don't think there's any team which manages to avoid all of them entirely!
http://lizkeogh.com/2009/11/30/estimation-anti-patterns/
Also have a look into systems thinking, game theory and perverse incentives. If the devs are padding the estimates it's because the environment they're working in is encouraging them to do so. Changing that environment will help them.
Good answers on this one already. Basically, if you assume developers will cheat on you by reporting hours they didn't in fact work (but spent playing whichever MMORPG is now en vogue), why do you even work with them in the first place? And if you trust them, why do you think they "pad"?
BTW - it is completely normal for teams that are new to Scrum to first overestimate (and have to drop items from Sprint), then - getting thus burned - underestimate to avoid that happening to them again, then as others have pointed out team velocity will level out and people will have a better understanding of how much they can do within a sprint.
One more hint as to what you can do: don't question their hours, and don't try to "manage" them by telling them how much time this or that should take. Rather, ask them how they want to achieve this or that, what solutions they want to use and why. It is quite common for good geeks to over-engineer - if things look way bigger than you expected, there is probably a misunderstanding about what is being built that needs to be cleared up.
It is actually quite impossible to "pad" estimates in scrum. After a few sprints the velocity will average out and the team will "know" how many points it can commit to. There is nothing to pad.
I think you have your statements backwards because
Get the job done whenever; we will reduce scope or quality, or increase resources
is an exact description of Waterfall, not Scrum. In scrum we have deadlines: it's called the end of the sprint. In scrum we NEVER sacrifice quality, because we know that will cost us more in the long run. In scrum we don't add resources, because we know that people gel and form a "team" and upsetting that balance is detrimental to productivity.
Why are you bothering at all with task times? The only time I have seen good developers pad estimates is when they are forced into giving an estimate for an unknown feature. We don't do this in scrum. We know what the conditions of acceptance of a feature are before we commit to it.
One approach to avoid this situation entirely is to estimate in story points, i.e. ask the question "does this story require more or less work compared to that story?".
In my experience developers rarely pad their estimates. The general tendency of developers is actually to underestimate complexity and effort.
From Wikipedia:
During each “sprint”, typically a two to four week period (with the length being decided by the team), the team creates a potentially shippable product increment (for example, working and tested software). The set of features that go into a sprint come from the product “backlog,” which is a prioritized set of high level requirements of work to be done. Which backlog items go into the sprint is determined during the sprint planning meeting. During this meeting, the Product Owner informs the team of the items in the product backlog that he or she wants completed. The team then determines how much of this they can commit to complete during the next sprint. During a sprint, no one is allowed to change the sprint backlog, which means that the requirements are frozen for that sprint. After a sprint is completed, the team demonstrates the use of the software.
I was reading this and two questions immediately popped into my head:
1) If a sprint is only a couple of weeks long, decided in a single meeting, how can you accurately plan what can be achieved? High-level tasks can't be estimated accurately in my experience, and can easily double what seems reasonable. As a developer, I hate being pushed into committing what I can deliver in the next month based on a set of customer requirements. This goes against everything I know about generating reliable estimates rather than having to roughly estimate and then double it!
2) Since the requirements are supposed to be locked and a deliverable product be available at the end, what happens when something does take twice as long? What if this feature is only 1/2 done at the end of the sprint?
The wiki article goes on to talk about Sprint planning, where things are broken down into much smaller tasks for estimation (<1 day) but this is after the Sprint features are already planned and the release agreed, isn't it? Kind of like a salesman promising something without consulting the developers.
BTW:
Although the word is not an acronym, some companies implementing the process have been known to spell it with capital letters as SCRUM. This may be due to one of Ken Schwaber’s early papers, which capitalized SCRUM in the title.
You are supposed to use the velocity to plan the next sprint. The velocity neatly handles the fact that your estimates are wrong, but they are consistently wrong. Also note that stories are supposed to be short, I'd say maximum 2-3 days. Stories that are bigger than that should be broken down into smaller stories.
If one story is not completed as planned, then your velocity goes down and you won't be able to take on as much work in the next iteration.
The wiki article goes on to talk about Sprint planning, where things are broken down into much smaller tasks for estimation (<1 day) but this is after the Sprint features are already planned and the release agreed, isn't it?
Wrong, they are done in the same meeting. The sprint stories are not agreed upon until everyone leaves the sprint planning meeting. Whatever questions you need to ask the PO to enable your commitment to the stories, you ask before or in the SP meeting.
Since the requirements are supposed to be locked and a deliverable product available at the end, what happens when something does take twice as long? What if this feature is only 1/2 done at the end of the sprint
The functional objective of the story is locked, not the implementation details. The details come out in conversation during the sprint. Any details considered too large to be contained in the current sprint scope are put back on the backlog for later prioritization. Remember, you are building incremental products here. It's like peeling an onion. The story must be satisfied and the code must be working at the end of the sprint. That doesn't mean the whole feature is entirely complete and releasable to a user.
If a sprint is only a couple of weeks, decided in a single meeting, how can you accurately plan what can be achieved? High-level tasks can't be estimated accurately in my experience, and can easily double what seems reasonable. As a developer, I hate being pushed into committing what I can deliver in the next month based on a set of customer requirements, this goes against everything I know about generating reliable estimates rather than having to roughly estimate and then double it!
You are correct here, you can't estimate accurately. Scrum embraces this fact and uses velocity, trending, averaging, and gut-feel to get close. If you don't come to grips with forgetting about accurate hour increment measurements you won't ever feel comfortable with scrum.
In answer to #2, if a feature isn't done at the end of the sprint, you don't deliver it. You may be able to deliver part of it, and if you can do so in a useful fashion, do so. But if you can't deliver it this sprint, remove it, and deliver it in the next sprint.
In answer to #1, there are numerous ways to try to improve the accuracy of your estimates. You might use function-point analysis or just a simple exercise where the entire project team takes the list of tasks separately and comes up with their own estimates for each task, then reviews each task and shares their estimates, and discusses the reasons why (for instance) Bob's estimate for this task is 8h and Tina's is 16h. The team figures out who is right (hopefully) or comes to consensus, and uses that as the estimate.
Over time, you'll come to learn which of your estimates tend to be overly optimistic, and which are overly pessimistic, and thereby improve your ability to estimate your own tasks.
The burn-down chart can really help you here. It is an early warning system for the whole project team, to let you all know when one or more people are falling behind. Then the team can reorganize to help make the sprint commitment if necessary, or cancel the sprint due to unforeseen circumstances, and kick off a new sprint with their improved understanding of the problem space.
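A tiny sketch of what the burn-down comparison amounts to, with hypothetical numbers:

    # Compare remaining story points per day against the ideal straight line
    # so the team spots divergence early. All numbers are made up.
    sprint_days = 10
    committed_points = 30
    remaining_by_day = [30, 28, 27, 27, 25, 22]  # points left at each standup

    for day, remaining in enumerate(remaining_by_day):
        ideal = committed_points * (1 - day / sprint_days)
        flag = "  <-- behind" if remaining > ideal + 2 else ""
        print(f"day {day:2d}: remaining {remaining:2d}, ideal {ideal:4.1f}{flag}")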
Finally, you might consider that statistics are on your side. If you overestimate 10% of the tasks, and underestimate 10% of the tasks, you'll probably be OK.
When we did SCRUM in an earlier project, we first agreed on a rough sprint plan including high-level features (stories), then refined the plan by breaking each of these down into groups of concrete tasks of preferably 1 day's max length, estimating each task. After this we often found out that the original plan was overcommitted to some extent (typically because we didn't take into account that developing a story includes unit testing, code review and documentation too), so we adjusted it accordingly. Btw we used "estimation poker": each member chose a card with a number on it (work hours/days) and everyone showed his card on the count of 3. If the numbers differed a lot, we briefly discussed why, and then had a new round until we reached near consensus.
Note also that estimation is very domain and technology dependent. In that project, we understood both fairly well, and we were building a new app from scratch, so our estimations were fairly accurate. In my current project we are working with legacy code, in a domain we don't quite understand yet, so our estimates are often wildly out of range.
As the project rolls on, estimates are gradually getting better (related to the fact that more and more risks and tricky issues are being resolved, and the team's domain expertise grows), so the velocity of the team can actually grow over time.
In answer to #1, I'm not sure I agree with some built-in assumptions in the question. In my experience, Agile (including Scrum) is not about time estimates. The whole idea is to move away from that and instead move to a system where you have a known velocity and specific sprint times. For instance, you release every 2 weeks (a good sprint time) with some new code. You see how many story points (not time units, but story points) you get done over a sprint, and then another, and after you've done a few sprints you know your rough velocity (i.e., how many story points you can do on average per sprint).
The idea is that the customer gets continuous updates to the application as each sprint is finished and can see constant progress. They know which items are scheduled to come in future sprints but they are aware that if something slips (because of an incorrect difficulty rating, ie. the story point estimation, or an outside problem) it will instead come in the next sprint and everything else will be moved out a little.
So it's not about developing the software based on some seemingly arbitrary estimation. Instead, it's about planning what functionality or features you want, assigning difficulty (story points) to those features (relative to the other features), and working through them to determine a velocity. Only then can rough estimates be obtained. Once the average velocity is known, we can make some rough guesses about time frames. However, even these should be considered rough approximations because again, it's not about time, it's about constant feature releases. Clearly, this mindset must exist with EVERYONE on the team, not just the engineers.
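Once the average velocity is known, the arithmetic for a rough time frame is simple; a minimal sketch with hypothetical numbers:

    # Turn story points and observed velocity into a rough time frame.
    # All numbers are made up.
    sprint_length_weeks = 2
    velocities = [18, 22, 20, 24]          # points completed per past sprint
    avg_velocity = sum(velocities) / len(velocities)

    remaining_backlog_points = 130
    sprints_needed = remaining_backlog_points / avg_velocity
    print(f"Roughly {sprints_needed:.1f} sprints "
          f"(~{sprints_needed * sprint_length_weeks:.0f} weeks), give or take")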
Here is a link (wikipedia) that goes into it a bit more.
Anyway, I hope this helps you, good luck!
It's actually quite simple:
You prioritize the tasks, and if you see that you don't have enough time then the low-priority tasks simply get dropped or moved into the next sprint.
The Product Owner decides what he wants and sets the priorities, and you develop following that order. You should have a usable product at the end of the sprint, not the fully-featured product.
In a scrum team, how important is it to complete a single story before moving on?
Our scrum master is fairly dogmatic about bringing a single story to completion before moving on. I can see that development would appear to be more "controlled" in this scenario, plus the scrum master would have a very accurate picture of what team members were working on at any given time... but I am interested in what this really buys us?
Clearly the scrum master wants to minimise divergence of the burndown from reality to avoid a shock come the end of the sprint - but surely if the sprint is two weeks long, the burndown is updated consistently and blockers are communicated at standups - any such divergence will be constrained by the sprint length, and be made visible mid-sprint through the usual channels (i.e. the standup or speaking to the scrum master individually). Any remaining issues can be dealt with in the fortnightly retrospective.
The reason for the question is that I seem to find I work most efficiently by keeping say 2 (or 3, if one is particularly easy) stories in progress at any given time, which I work on as I see fit. This seems to help with the subconscious background thinking that aids completion of a task. It also permits me to better understand the bigger picture if a couple of stories are related.
Our stories usually work out to be one or two days worth of work.
So, is working on a couple stories at a time frowned upon and if so what does one-story-at-a-time buy you?
I think it really is up to the team to decide. I think you hit it in your write-up about the burndown: the most important thing is to meet your sprint commitments consistently. How that happens really should be up to the team if they truly are self-governing. On the team I'm on now, our norm is to work on multiple stories at once; it's the nature of our setup, given that we try to really spread ownership of stories across the team. It may be a different norm for yours if you have shorter stories and more of an individual ownership style.
I personally think one story at a time works well because it keeps you focused on a task. The cost of context switching between multiple stories can be high. This is a personal preference for me, but different people work differently. Though I think your scrum master is correct in his methodology, if you've found very compelling reasons for multiple stories at a time and can demonstrate that it is in fact helping progress, that would be a good case to make.
IMO, there is an underlying question here. Sometimes when working on a story, I'll need something from another department/team, e.g. clarification on a requirement or a graphic for a page, and this means that I won't finish one story before moving on to another story. While you do mention discussing blockers at standup, it can happen that it is up to someone outside to help me finish a story, so there can be multiple ones on my plate. Thus, I can have multiple stories in progress due to blocking on something while still wanting to be productive.
In general, I don't like trying to manage multiple copies of the code base or switch my code a lot, so I prefer doing one story at a time, assuming no blockers. The size of the code base I'm working with is ~1.1 GB of data spread over 82,000+ files so having multiple copies could be more than a little painful I'd imagine.
My personal guess on this is that it is up to the team to set the standard and see that it works for them. If some like one story at a time and others do multiple and all is well, cool. If everyone likes having multiple stories at various points of completion, that can work too.
Don't hog the backlog...
In my experience, when stories are between 1 and 2 days in size, they tend to be implemented by a single developer. If you are working 2 or 3 stories concurrently, that might reduce the amount of things in the backlog that other developers can pick from and jeopardize the sprint.
... but plan for blockage
On the other hand, working 2 or 3 stories at once means that if you are momentarily blocked on one story you can be immediately productive on the other. I find that there's some overhead each time I start a new story. This overhead makes it hard to fill an hour long "gap" in my day with a new story, whereas it's much easier to context switch to a story I've already started.
Bottom line, let the team decide... and then review results during a retrospective. If your stories, tasks and work process support an environment where team members can work 2 (maybe 3) stories at a time without sacrificing productivity or predictability, then your SM should respect that. But at the same time you should honestly review the results during each retrospective and be prepared to change if the SM doesn't think it's working.
I generally would think that the decision on how to work best should be made almost exclusively by the team. The ScrumMaster's role is to help and support the team, not to question the team's way of working during a sprint.
To be fair, sometimes it can be a good idea for the ScrumMaster to point out possible flaws or risks - that would fall in the category "help and support". Being dogmatic about your personal idea of how a sprint should look internally is not how I would want a ScrumMaster to act. It sounds a bit like misunderstanding the role as a manager's role, which it simply is not.
As for how we do it: We almost always work on several stories at once. At the moment we're having a four-person scrum team with three developers and one tester and we nearly always have at least two or three stories going at the same time. In the last sprint we tried to start with all stories early in the sprint to get to the point where we have a basic design and a good idea about what the potential problems could be. Of course we were not working on all stories at the same time after that.
I understand that in terms of risk management you might want to make sure that everything is done for one story before you tackle the next. However, the disadvantage is that when you run into unforeseen problems later on, you might not have enough time to fix them. Usually problems show their ugly faces during the implementation phase, and often enough quite early. So, you basically exchange one risk for another.
Which risk is easier to handle should be up to the team. It's their sprint after all, and while I think it's perfectly fair for the ScrumMaster to mention concerns about the way the sprint is going, he shouldn't force his idea of the best way to work on the team.
In the end, I think it boils down to these two things:
YES, we do work on several stories at once and it has worked out fine so far.
Remember that the ScrumMaster is working for the team, not the other way round.
Please note that I'm mostly talking about the whole team working on several stories at once, not one developer. The problem I'd see there is that you need to make sure that you don't block any stories by keeping them open, so that no-one else can continue the work. Once again, this is a question of circumstance and preference. When it comes to testing, our tester often has a couple of testing tasks for different stories, so that he can easily switch to a different task, if some bug blocks him from continuing testing a feature.
I am the scrum master for a small team of 4 developers. The project we are working on has a lot of tasks that we have never done before and cannot easily estimate in a sprint planning meeting. What is the best way for me to run a sprint with this uncertainty? I am finding it very hard to finish a sprint with a potentially releasable product. I am also finding it hard to plan sprints when there are a lot of unknown-length tasks.
I'm not sure what the term is in Scrum, but in User Story terminology you would do a "spike", which is basically a very short period of research into the topic so that your team will be able to estimate the task at the end of the spike.
Example:
Story: Analyst wants to be able to review financial data in pie charts.
Your team doesn't use any charting tools, so you need to know how long it would take to build something like this. Or perhaps instead, you could invest in third party tooling and integrate a tooling set with your application.
You would do a spike to research these avenues and come up with estimates for them, then decide which route to take.
Are the "tasks" things that someone in the world has done before, or are they just new to your team? I will assume the latter. If this is the case then what you are finding is that you do not have the necessary experience on your team to solve the problem. Thus you will be developing that experience as you go. All this means is that the complexity of your stories is higher. In the first couple of sprints you may score some of the stories as 13, and then later on they become 8s because you then have the knowledge you need.
You don't need to know how to do the stories to estimate them. You just need to take on less of them due to your experience gap.
I like to reserve "Spikes" (yes that is the term used in scrum) for attempting to solve business domain problems that have no known solution. Not for the team to do training.
If you really need to do research to get a good estimate, you could do the research as a task in itself, or set it aside and have it done (by someone) before the sprint planning.
Generally, I think that if you can't get a good estimate, you should either go with a bad estimate (i.e. a wild guess) or you should time-box the task, so you set aside a fixed amount of time for it in a sprint. After that, you will either have a done solution, or you will have a better understanding of it so you can estimate it or break it down into subtasks for the next sprint (or a later sprint).
Do you really mean tasks or are you talking about Product Backlog Items (PBIs)? Actually, I find it hard to believe that a task is not estimable. If they really aren't, they are very likely too big (tasks shouldn't exceed 16h, which is already huge).
If you are talking about PBIs, the situation you are describing is quite surprising and should theoretically not happen. In the worst case, just assign them a high number of story points; this precisely means that there is a lot of uncertainty about them. But, because PBIs that are ready for a sprint shouldn't exceed half of your velocity (or you'll put too much risk on your sprint), the obvious way to solve this situation is to divide such items into smaller chunks, which may include exploration. But the important part is to keep things timeboxed, even (or especially) R&D. Keep in mind that with Scrum, everything is timeboxed.
In other words, to reduce uncertainty, break things down into smaller things (be them items or tasks)!
If the tasks seem un-estimable, I think the best approach would be to break down those tasks into smaller tasks that you can estimate. It might take several iterations, but you will probably come up with a pseudo design while you are at it. Joel mentions this in one of his articles.
Split the unestimatable task into a task to remove the uncertainty, and "the rest". Remove uncertainty with proof-of-concept tests or spike solutions. Either schedule the spikes this sprint and the rest of the work next sprint, or delay the start of the sprint for a week of spiking.
We often don't know enough to break a story down into tasks. We have a period of discovery before we know what the tasks will be. "Spikes" seem tricky to manage. For one, you may not be able to time box the discovery period. Second, I can't effectively plan a sprint without knowing how long a story will take.
Seems like another option is to do the spike in Sprint 1 and the tasks in Sprint 2. The downside is that it seems like the process forces an unnatural breakdown of the work. Why discover this week and then wait a while before starting the work?
We use "contingents" or a specific backlog for such tasks.
The Scrum tool Agilo supports this way of working and includes those issues in its calculations too, e.g. in the burndown. In this way you get good control over the "un-plannable" items.
Are you confusing precision with accuracy?
The idea behind Agile estimation is to come up with a number that is good enough, not a number that is exact. That is why using story points for backlog item estimation is a best practice; it emphasizes effort/complexity instead of duration.
You don't need to know how long each task necessary to implement a backlog item in a sprint will take. What you need to know is, given the work you've previously committed to in this sprint, can you commit to this backlog item? Because we know that we can't know exactly how much time each backlog item will take, we have to make an educated guess.
More important, what does it mean to fail in Scrum? Is not getting every sprint backlog item completed a failure? No... if you got four out of five items done and the fifth one is mostly done, you'll get credit for the four completed items (in terms of velocity for the sprint), and when you finish the remaining tasks for that fifth item in a future sprint you will get full credit for that item. But, would you have gotten any more done if you weren't using Scrum? The only failure in Scrum is failing to learn from your mistakes, to keep doing the same dysfunctional things repeatedly while expecting different results.
So, in your sprint planning meeting, don't spend a lot of time worrying about something you're not going to be able to know. Let the team think about the work, and then let them sign up for the amount of work they feel comfortable they can complete during the sprint. If they undercommit, you can always drag something else into the sprint backlog, or end the sprint early. If they overcommit, then you finish the backlog items you can in priority order, and in the sprint retrospective discuss why the unfinished items couldn't be finished, along with how to prevent having unfinished items in future sprints.
By the way, I know this was probably a poor choice of words on your part, but an effective Scrum Master doesn't run the sprint. The team runs the sprint, and the Scrum Master actively looks for impediments that lower their productivity and interfere with their ability to meet their commitments. Scrum Masters aren't managers, they're a combination of referee, coach, and scorekeeper. They are the Keeper of the Process, they help the team follow the process, they protect the team from outside agents that try to go around the process, and they track progress during the sprint via ensuring the sprint backlog is updated and the sprint burndown chart reflects reality, on a daily basis. In the situation you've described, where the team isn't sure how much work they should sign up for, the Scrum Master should let the team decide as a reflection of respect for team ownership of the commitment. Whatever the decision is, it won't be wrong.
Spikes should be time-boxed. That puts pressure on the team to limit the scope and get a better idea of the cost-benefit the research will entail; i.e. it is useless to carry out 3 days of research for a task which would cost a few bucks.
This is also backed up by Latham's work on Goal Setting Theory where he specifically tackles this issue.
I'll be doing a demo of my code to a slightly non-technical audience, and I need to show them what I've got in my project (about 15K lines of code). I'm trying to convince them that I've spent time on the project and that it's in a good state.
These guys are planning to invest money in this product. Therefore I should convince them that this app is worth the price they are going to pay and justify the time I've spent; secondly, they should see that this is something that takes time and that I know what I'm doing (basically, I need to win their trust).
What metrics can I use other than "lines of code"? (Maybe lines of comments?)
What are the best tools (preferably free) to generate a report from .NET Projects?
UPDATE :
Also a way to provide "project cost - cocomo" would be cool, like this one :
FOUND:
http://www.cms4site.ru/utility.php?utility=cocomoii will help you to calculate an estimated cost for your project.
If they're non-technical, it won't matter. It will be like trying to sell a high-end bike to people who don't know a bike from a car. 15k lines of code won't matter to them any more than 300k lines of code will.
You need to find something other than the actual code to wow them with.
Can you code up some demos and tell them how little time it will take them to build similar applications with your code? Like "If you use my code, you can build this multimedia application in 15 minutes without writing more than a few lines of code." Non-technical people generally love saving time and money.
It probably depends on how "slightly" they are in the non-technical department.
An investor only cares about money. Investors start at the exit and work backwards. Knowing this, pitch your project in terms of the return they will get on their investment.
Key points would include:
Your expertise: Do you know the market you want to sell in to? Are you leveraging your expertise in some way to make the project a reality?
Risk: Using your already existing code base lowers risk in terms of both time and money. They will probably do technical due diligence to validate your claims, so be honest here.
Time to Market: Having a code base in place will reduce their time to market, which may be significant.
Vision: They need to know that there is a future for your product. This is your chance to get them excited!
Investment is about the future, not the past, so understand that you need to achieve what you are promising. The path you trod to get to where you are now may be interesting, but largely irrelevant to the investor. What I'm trying to say is sell the vision, not where you are now or where you've been.
Good luck and hope you get what you need!
It's not clear to me from your question whether you're talking about people who would buy the use of your product or ownership of your product.
In either case, ask yourself these questions:
"What problem(s) does this product solve for my users, from their point of view?"
"What does this product let the users do, that they already want to do, but can't do without it?"
"What does this product let the users do, that they already want to do, but can't do as easily without it?"
Features don't matter. Menus and dialogs don't matter (unless they require explanation, in which case they matter in the negative sense).
If you want numbers that interest a potential buyer of (an instance of) the product, talk in terms of how much time or money the buyer can save by using your product.
If you want numbers that interest a potential buyer of shares in your company or product, talk in terms of the size of the market, how you've analyzed that market's needs, and the ROI of any investment.
I've had success showing potential customers our automated build cycle, in slideshow form. I took them through our "production line" as if it was a factory tour, and showed them the nice colored bars of coverage reports, the upward-trending lines of historical lines of code, and pie charts breaking down lines of code per module.
Then I did the same for everything around the actual build. So there's a requirements pipeline where they are involved, and a test/validation cycle where they are again involved.
It may not mean anything to them, but it shows them you have control over your process, and control over the quality of the delivered end product.
Please note that although people may be non-technical, try to be as honest as possible. As soon as they discover one single tiny lie in your story, you're lost. And chances are that there's that one technical guy in the back who can ask that one question which makes your house of cards fall down.
Happy sales!
"Good code" doesn't matter unless you are demonstrating the medium- and long-term advantages of it: enhanced flexibility and simplicity, which save the customer time/money while adding agility.
I think explaining the more complex aspects of the code and the work that went into it to any audience will help show how much work and effort have gone into a project.
Hours spent coding could be a good metric to give them.
Talk about the features. Explain what you have working or almost working. Go at it from what they are interested in.
Try to show them visuals that they care about if you can. I think a few minutes doodling on a board would be better than showing lines of code.
The only thing that is likely to matter to a buyer (particularly a non-technical one) is functionality. I would concentrate on selling the features. You might consider discussing how you have tested it to verify that it performs as you claim.
I wouldn't use code per se, since a non-techie wouldn't understand it. Boasting about quantity is probably meaningless (how does a non-techie know that a 1 MLOC project is significant?). As for quality, you can present, e.g., maintainability metrics, test coverage, things like that. Feel free to show off your excellent toolchain too (continuous integration and all that), and your mastery of various performance-testing tools. Also, showing things like Workflow Foundation helps - customers like to see how their business processes can be turned directly into code with a diagram notation.
EDIT modified to reflect OP's clarification (in comment here) that these potential buyers are looking to re-sell the software
Re-sellers are going to be looking for three things:
Is anyone going to make something better, cheaper or more quickly?
Is this guy going to be able to use our investment effectively to produce more?
Can we sell what this guy has produced, and will produce?
Points 1 and 2 have been very well addressed in other answers, but point 3 is the hardest for us techie people to prove. It's also extremely important - if you can go to these buyers and hand them 3 killer benefits which they can repeat with more flair and PowerPoint when they're doing their sales calls, you'll be off to a good start :)
The main thing you have to do is to take a step back from your work and look at the:
features: what does it do
advantages: why is it better
benefits: why should the customer care
Features are closest to what you care about as a developer, but are pretty much irrelevant to non-technical buyers. Advantages are an essential step in understanding your competition and the customers' alternatives.
By putting features and advantages together, you can hit the customer with a number of benefits, e.g.:
using my software will save you $0.01 per transaction, or $40,000 p.a.
my software will increase customer retention by 5%
your admins will need 15% less time to deploy changes using my software
These are the things that customers care about: what's going to be good for the company, and good for them.
To be brutally honest: the end customer doesn't care how much effort you put into it (LoC or any other metric), they don't care how well it's written (comments, tests, any other metric), they don't care how hard a problem it is to solve, and they don't care about features.
Their only requirement is that it will save them time / effort / money. You know that how hard you've worked to solve the problem, and solve it well, is key to meeting that requirement, but to them it's secondary. You need to make it perfectly clear why buying your stuff will mean they'll get promoted.
For COCOMO - Project Cost Estimation
I found this website, it's kind of a manual process but it'll do.
http://www.cms4site.ru/utility.php?utility=cocomoii
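If you just want a ballpark figure without the web form, the older basic COCOMO 81 "organic" model fits in a few lines (the linked utility implements the more detailed COCOMO II, which needs extra cost drivers). A rough sketch, using the ~15K lines mentioned in the question and a made-up monthly rate:

    # Basic COCOMO 81, organic mode: effort = 2.4 * KLOC^1.05 person-months,
    # schedule = 2.5 * effort^0.38 months. The monthly rate is hypothetical.
    kloc = 15.0                           # thousands of lines of code

    effort_pm = 2.4 * kloc ** 1.05        # person-months
    schedule_m = 2.5 * effort_pm ** 0.38  # calendar months

    monthly_rate = 8_000                  # assumed cost per person-month
    print(f"Effort:   {effort_pm:.1f} person-months")
    print(f"Schedule: {schedule_m:.1f} months")
    print(f"Cost:     ${effort_pm * monthly_rate:,.0f}")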