JW post on MS Testing - my observations

Check out James's recent post with the eye-catching title “If Microsoft is so good at testing, why does your software suck?”

I have another angle on this, which I saw firsthand with our Whidbey release. It goes to some of the themes I refer to in my "So you want to automate your test cases?" post. We hire developers as testers. Our tester role is "Software Development Engineer in Test," or SDET. Our SDETs are awesome; the testers here at Microsoft are way better than testers I have worked with previously. But one problem I've seen is that because of our huge emphasis on test automation at Microsoft, ad hoc or exploratory testing sometimes suffers. That is, sometimes we're so busy writing automation, automation harnesses, and automation tools that we don't spend enough time in our customers' shoes banging on our features and trying to break them. Almost all of the bugs we find are found through manual testing, not through automated testing.

On my team we brainstorm tests to run in order to break new features we're developing, schedule specific tasks in TFS for testers to beat on features, and hold dedicated bug bashes for our features.

Of course we can always do better; the recent load test regression that got through in SP1 is evidence of that. I wish we had had an automated test for that one. We do now. :) The timing of finding this bug was unfortunate (and we had another like it in RTM): one of our internal customers from the Services Test Labs found it in the SP1 beta just as the door closed on SP1. We need to drive more adoption of our betas so we can catch these problems before RTM. The other thing is that we were pretty aggressive in SP1 about getting in fixes and features we heard about from customers. This regression was caused by one of those customer-reported problems ("Thinktime that's set via plugin code is ignored"). I don't see that changing: service packs are a chance to get value back to you, and we need to take advantage of that. Of course we want to be smart about it and make incremental changes that pack a lot of power but don't introduce a lot of risk.
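
For context, here's a minimal sketch of the kind of plugin code that bug affected, using the Visual Studio web testing API (WebTestPlugin and WebTestRequest.ThinkTime); the class name and the five-second value are made up for illustration:

    using Microsoft.VisualStudio.TestTools.WebTesting;

    // Illustrative web test plugin that overrides the recorded think time
    // on every request. Think time assigned this way is what the SP1
    // regression ignored when the web test ran inside a load test.
    public class SetThinkTimePlugin : WebTestPlugin
    {
        public override void PreRequest(object sender, PreRequestEventArgs e)
        {
            // ThinkTime is in seconds; 5 is an arbitrary example value.
            e.Request.ThinkTime = 5;
        }
    }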

Ed.

Comments

  • Anonymous
    August 19, 2008
    Just a gadfly suggestion, but you SDETs could have a field day going through the trouble tickets that the Connect screeners close as "could not reproduce". They don't seem to try very hard.
