Motley says: "If I write code to pass tests, I'm cheating!"


Summary


Motley: Tests need to validate code that has already been written - not the other way around.


Maven: Writing tests before you code lets the tests drive your code and provides sounder validation of the requirements.

______________________________


[Context: Motley is thinking through test-driven development before he goes off and tries it]


Motley: I've been thinking about this TDD thing and something is weird - if I write code to pass tests, that's cheating! Tests need to validate code that has already been written.


Maven: Cheating? Hardly. Your tests are written around the requirements, so what's wrong with writing code to satisfy them? You're actually cheating if you write tests after the code: then the tests merely exercise the code as written, and you're an extra step removed from the requirements. Validation that you met the requirements is loose at best.


Motley: But I write my application code to satisfy the requirements. What's the difference?


Maven: The difference is that the tests validate that the requirements are met - they form a logical layer between the requirements and the code. Writing them first forces you to think through the requirements in greater detail before implementing them. Plus, you get all the other benefits of TDD we talked about.
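[To make Maven's point concrete, here is a minimal test-first sketch in C# with NUnit. The Cart class and its tests are hypothetical illustrations, not code from this post: the tests state the requirements first, and the class is written afterward with just enough code to make them pass.]

    using System.Collections.Generic;
    using NUnit.Framework;

    // Requirements stated first, as tests. They fail (indeed, don't even
    // compile) until the Cart class below is written.
    [TestFixture]
    public class CartTests
    {
        // Requirement 1: an empty cart totals zero.
        [Test]
        public void EmptyCart_TotalsZero()
        {
            var cart = new Cart();
            Assert.AreEqual(0m, cart.Total());
        }

        // Requirement 2: the total is the sum of the item prices.
        [Test]
        public void Cart_SumsItemPrices()
        {
            var cart = new Cart();
            cart.Add(2.50m);
            cart.Add(1.25m);
            Assert.AreEqual(3.75m, cart.Total());
        }
    }

    // Written second: just enough code to make the tests pass.
    public class Cart
    {
        private readonly List<decimal> prices = new List<decimal>();

        public void Add(decimal price)
        {
            prices.Add(price);
        }

        public decimal Total()
        {
            decimal sum = 0m;
            foreach (var price in prices)
            {
                sum += price;
            }
            return sum;
        }
    }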


Motley: Well, maybe, but…


Maven: Just do me a favor and try out TDD for a couple of days, okay?


Motley: Fine, fine, fine. If it gets you off my back. Give me a couple of days to play. Now go make yourself useful and clean up the kitchen or something.

______________________________


Maven's Pointer: "Requirements" in TDD are usually expressed at a lower level than you may be used to. A "requirement" is one small nugget of functionality that you want to implement - it may be an entire (simple) method, or a single positive or negative path through the method.
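A quick sketch of that granularity, again in C# with NUnit (the Calculator class is a hypothetical illustration, not code from this post) - each test pins down exactly one path, one positive and one negative, through a single method:

    using System;
    using NUnit.Framework;

    public class CalculatorTests
    {
        // "Requirement" 1: the positive path through Divide.
        [Test]
        public void Divide_ReturnsQuotient()
        {
            Assert.AreEqual(4, Calculator.Divide(8, 2));
        }

        // "Requirement" 2: the negative path - dividing by zero must throw.
        [Test]
        public void Divide_ByZero_Throws()
        {
            Assert.Throws<DivideByZeroException>(() => Calculator.Divide(8, 0));
        }
    }

    public static class Calculator
    {
        public static int Divide(int a, int b)
        {
            // Explicit guard documents the negative-path requirement
            // (integer division would throw this exception anyway).
            if (b == 0)
            {
                throw new DivideByZeroException();
            }
            return a / b;
        }
    }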


Maven's Resources: Test-Driven Development in Microsoft .NET, by James Newkirk and Alexei Vorontsov, Microsoft Press, 2004. ISBN: 0735619484.

Comments

  • Anonymous
    April 16, 2007
    I've found it rare that unit tests can completely encompass the requirements for a program. Programs with complex inputs and outputs would be hard-pressed to test all of their features with unit tests. There's still a need for a human tester to try new scenarios and think of ways the customer would use the product. Unit tests and automation simply validate that a path a tester has taken previously is still valid. Unit tests can help ensure you aren't breaking some core piece of functionality without knowing, but can they really be used as "requirements validators" for mid- to large-sized programs? Wouldn't we get into the habit of "The light's green, ship it!"?

  • Anonymous
    April 16, 2007
    Thanks for the comment, Chris! Unit tests address testing on a white-box (low) level. They do their best to address the requirements as thoroughly as possible, but keep in mind we are talking about very low-level requirements here (e.g. boundary conditions in an API). There are also user scenarios and integration amongst the units that unit tests typically do not address. There is definitely still room for human testers. The test team tests integrated user scenarios (especially from the customer's perspective), stress, and performance, and creates and maintains more exhaustive test automation. With developers writing unit tests, the test team can then focus on the "right" thing and not have to find all the little issues devs should have caught. Unit tests cannot play the role of requirements validators on the system level - only on the unit level. We have to recognize the need for all levels of testing and should never ship a product based solely on passing unit tests.
