Not Everyone Has the Same Definition of "Done"
Years ago I had an employee; let's call him Vanya (not his real name). He was struggling a bit, so I was watching his work closely. Every week we discussed what he needed to get done the next week and what he had done the previous week. I kept a list of the work items he needed to complete and checked them off as he finished each one. One of those items involved a particular DirectShow filter; the entry on the list was to write tests for it. One week he worked on and completed this item. A few months later we became aware of an issue that fundamentally prevented the filter from working. In fact, it had probably never worked. Why, I wondered, didn't Vanya's tests catch it? I went to speak with him. It turned out he had written the tests. They compiled. He had not, however, ever actually run them. They were "done" in his mind, but not in mine. Oh, and what he had written didn't actually work. Shocking, I know.
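To make the gap concrete, here is a minimal C++ sketch of that failure mode. The names are hypothetical (this is not Vanya's actual code), but the shape is the same: everything compiles cleanly, so by one definition the tests are "done," yet nothing ever executes them.

```cpp
// Sketch of the failure mode, with hypothetical names (not the actual
// filter code). It compiles cleanly, so by one definition the tests are
// "done"; but main() never invokes them, so nothing has been tested.
#include <cstdio>

// Hypothetical stand-in for the filter under test. Suppose, like the
// real filter, it has never actually worked.
bool FilterProcessSample(int input, int* output)
{
    (void)input;
    (void)output;
    return false;  // fails on every input
}

// The test exists and compiles. It would catch the bug immediately, if run.
bool TestFilterPassesThrough()
{
    int out = 0;
    return FilterProcessSample(42, &out) && out == 42;
}

int main()
{
    // Note what is missing: no call to TestFilterPassesThrough().
    std::puts("Build succeeded. No tests were run.");
    return 0;
}
```

A build that stops at compilation happily reports success here; the step that actually matters never happens.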
I tell you this story to introduce a problem I've run into many times at many different scales. This story is probably the most egregious example, but it is certainly not isolated. The problem stems from the fact that we rarely define the word "done." It is assumed that everyone shares a definition, but that is rarely true. Is a feature done when it compiles? When it is checked in? When it can run successfully? When it shows up in a particular build? All of these are possible interpretations of the same phrase.
It is important to have a shared idea of where the finish line is. Without one, some will claim victory and others defeat even when talking about the same events. It is not enough to have a shared vision of the product; it is also necessary to agree on the specifics of completion. To establish a shared definition of done, you have to talk about it and flush the latent assumptions out into the open. Before starting a project, have a conversation about what it means to be done. Define in strict terms what completion looks like so that everyone shares the same picture.
For large projects, this shared vision of done can take the form of exit criteria: "We will fix all priority 1 and 2 bugs, survive this many hours of stress, etc." For small projects, or for individual features within a large project, less extensive criteria are needed, but it is still important to agree on what state the work will be in by what dates.
While not strictly necessary, it is also wise to define objective tests for done-ness. For instance, when working on new features, I define "done" as working in the primary scenarios. Bugs in corner cases are acceptable, but if a feature can't be exercised in the main way it was intended to be used, it can't be tested and isn't complete. To ensure that these criteria are met, I often insist on seeing the feature demonstrated. This is a bright line: either the feature can be seen working, or it cannot. If it can't, it isn't done, and more work is needed before moving on to the next feature.
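To make that bright line concrete, here is a hypothetical sketch (the names and scenario are invented for illustration, not a prescribed harness): a smoke test that drives the feature through its primary scenario and reports done or not-done through its exit code, so a check-in gate or a skeptical lead can run it on demand.

```cpp
// Sketch of a bright-line "done" check, with hypothetical names: a smoke
// test that actually executes the primary scenario and reports the result
// through its exit code. Either it passes, or the feature isn't done.
#include <cstdio>
#include <cstdlib>

// Hypothetical feature under test: drive it the way a user would,
// end to end, and return true only if the main scenario succeeds.
bool RunPrimaryScenario()
{
    int result = 0;
    // ... exercise the feature here; this placeholder stands in for it ...
    result = 42;
    return result == 42;
}

int main()
{
    if (!RunPrimaryScenario())
    {
        std::fprintf(stderr, "NOT DONE: primary scenario failed\n");
        return EXIT_FAILURE;  // nonzero exit: the bright line was not crossed
    }
    std::puts("DONE: primary scenario demonstrated");
    return EXIT_SUCCESS;
}
```

Wiring a check like this into the build makes the definition self-enforcing: the feature isn't done until the check goes green.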
Comments
Anonymous
September 05, 2008
Hmm... I just had almost this exact same conversation with a coworker earlier today. Your example and conclusions are right on the money. Well posted.
Anonymous
September 07, 2008
> I often insist on seeing the feature demonstrated.
Revolutionary; having requirements actually function is something I'll bring up with the team.
Anonymous
September 07, 2008
Oh; just noticed the blog site. This may help explain some of my experiences with Windows :(
Anonymous
September 09, 2008
Development methodology is far less relevant than delivery of testing artifacts. Make the delivery of JUnit and structural coverage reports mandatory and the problem in large part goes away. The real problem is that management doesn't want to sacrifice the schedule for developers to produce these artifacts, hence these issues.
Anonymous
September 09, 2008
Well said. No doubt an exact definition of done for each assigned task will do much to reduce unpleasant surprises in managing a project. And I also wholeheartedly agree with you on the fact that it occurs all too often that people have a misunderstanding of what a word means (or two people are using different definitions, or some other scenario than both people having a clear-cut concept of what is meant) - and that the solution is simply to get the word or words defined.

However, there is another important point you've overlooked: Why did Vanya not consider that his software had to actually work? Factually, even if there is no agreement on what exact types of tests must be run to achieve "done" (although I wouldn't discourage this - I agree with you and think it's a good thing), the fact that a feature must work is unavoidable. Why? Having a feature be demonstrable by test cases or other means results in something that can be committed back to the repository and is a "sub"-product. This "sub"-product can then be used as part of the overall project, which can be turned over to stakeholders - the valuable final product. If it doesn't work, it can't be - someone else has to patch it up and make it into a real sub-product.

In any activity (not just software development), you've got a sequence of actions and sub-products which results in an overall product. Each member of the team has to have at least a basic concept of what those actions are, why they are, and how they lead to a real, valuable final product which gets exchanged with stakeholders and results in a paycheck for them. If Vanya had a concept of that, it would never occur to him that writing code that he didn't know worked was enough.

I say this because you're not talking about a bug that was found because of a missing test case, you're talking about an omitted definition of the word "product". Vanya thinks that a product is something that he's written. You think (and are correct that) a product is something that runs as described by whatever design you have and so effectively contributes to the overall project by completing a section of it. Get that sorted out with your people, and you'll find yourself having to do a lot less hand-holding, and with staff who are smarter and more responsible for their jobs.

I recommend looking at (at least) this document for some more detail on this (the section entitled 'Organizing and Hats' is particularly pertinent): http://www.volunteerministers.org/download/booklets/basicsorganizing-en.pdf
Anonymous
September 09, 2008
Not everybody has the same definition of "Yes," either. These issues are exacerbated by cultural, geographical and time zone differences (although they can happen with the fellow in the next cubie, too :-)
Anonymous
September 10, 2008
You've missed a very important part of the whole development experience. Robert Martin talks about a module having three functions. The first is the one you mentioned, the function it performs while executing. But a module must also afford change and reveal its intent. A module that is hard to change is not done. A module that hides its intent is not done.
Anonymous
September 10, 2008
You realize this is the same problem we encounter every day writing code to begin with. The compiler does its thing and blinks back at us, innocently assuring us that our instructions work. Sadly, we have learned that we can't take the compiler's word for it and have to aggressively anticipate everything that we can imagine could go wrong. Or at least I have had some epic failures...

Extremely analogous are our relationships with the people we need to transmit instructions to. You may have been clear. You were left with the impression that the resulting code would work, but at the social level you had not been an effective tester. As with coding, it takes imagination and practice to figure out what could go wrong. Trust needs to be hard to win. You came close to getting it right back then, but I imagine your experience has influenced your relationship with every programmer working with you as your mental catalog of things that can go wrong has lengthened. The good news is that when you trust someone's work now, you are more certain of their trustworthiness.