Guidance for Creating Test Plans and Test Suites

Customers who are new to Microsoft Test Manager (MTM) often ask for general guidance on how to start from scratch on a new project. Here is some guidance that we recently published on MSDN that should help answer exactly this question:

When you use Microsoft Test Manager to create test plans and test suites for your team project, there are several approaches that you can take. This topic will focus on two approaches that are based on your development methodology.

One approach is to create one test plan that you use for all milestones and add test suites and tests as you progress. However, if you use this approach, you do not have historical data for your test pass rates for previous milestones. Therefore, it is better to create test plans based on your testing goals for specific iterations or milestones, whichever development methodology you use. By creating test plans for iterations or milestones, you can see when a specific iteration or milestone is complete, based on your testing goals. You can also prepare the test plan for the next iteration or milestone while you complete your testing for the current milestone. By using this approach, you can track your testing progress for each of your test plans and see that the quality of your application is improving.

If you add both manual and automated tests to your test suites, you can view the overall quality based on both of these types of tests for your test suites and test plans.
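
To make that roll-up concrete, here is a minimal sketch of how manual and automated outcomes can combine into one pass rate for a suite or plan. This is illustrative Python only, not the MTM object model; the TestResult type and pass_rate function are made up for this example.

```python
# Illustrative only: manual and automated results rolling up into one
# pass rate for a suite (hypothetical names, not the MTM object model).
from dataclasses import dataclass

@dataclass
class TestResult:
    test_case: str
    kind: str      # "manual" or "automated"
    outcome: str   # "Passed", "Failed", "Blocked", ...

def pass_rate(results):
    """Overall quality across both manual and automated runs."""
    if not results:
        return 0.0
    passed = sum(1 for r in results if r.outcome == "Passed")
    return passed / len(results)

sprint1_results = [
    TestResult("Add one item to the shopping cart", "manual", "Passed"),
    TestResult("Remove an item from the shopping cart", "manual", "Failed"),
    TestResult("Purchase a single item", "automated", "Passed"),
]
print(f"Sprint 1 pass rate: {pass_rate(sprint1_results):.0%}")  # 67%
```

Because each sprint or milestone has its own test plan, the same calculation per plan gives you the historical view described above.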

Use the following sections based on the methodology that you are using to develop and test your application:

Agile Development

If you are using an agile methodology to develop and test your application, you typically create user stories and use sprints and iterations to track the progress of your development and testing tasks. You can use test plans to correlate with each sprint. For example, you might have the following user stories for your Web application:

  1. A user wants to select multiple products from the website and add them to their shopping cart. (Sprint 1)

  2. A user wants to purchase the items in their shopping cart using a credit card. (Sprint 1)

  3. A user wants to save their information when purchasing items to make it quicker to purchase next time. (Sprint 2)

  4. A user wants to sign into their account when purchasing items to retrieve their personal information instead of entering it again. (Sprint 2)

The following steps assume that these are the user stories for the project. You will want to create test cases for these user stories. You might also want test cases that test end-to-end functionality for multiple user stories that can be joined together. For example, you might want to test that a user can select items, add them to their shopping cart, sign in and purchase the items. By following these steps, you will have a set of test plans, as shown in the following illustration:

[Illustration: agile test plans (Sprint 1 Test Plan, Sprint 2 Test Plan, and Master Test Plan)]

Project Set Up

  1. At the start of your project, create the following test plans. (This is based on the number of sprints that you plan to have.)

    1. Sprint 1 Test Plan

      This will be used for testing Sprint 1 user stories.

    2. Sprint 2 Test Plan

      This will be used for testing Sprint 2 user stories and any necessary regression testing from Sprint 1.

    3. Master Test Plan

      This will be used for end-to-end tests that span more than one sprint. It can also be used for performance tests for service level agreements. This test plan does not have to be associated with a specific iteration, because it spans multiple iterations and can only be complete when all milestones are completed.

  2. Determine the test configurations that you need to use to test your user stories. For example, you might want to test that your user stories for your application run on Internet Explorer 8 for configuration 1 and Firefox 3.5 for configuration 2. Then create these test configurations using Microsoft Test Manager. 

  3. Add the test configurations that you need for your user stories to the test plan. By default these will be used for any test suites that you create in the test plan.
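
Here is a small sketch of the structure that this setup produces, modelled as plain data. This is purely illustrative Python, not the MTM API; the dictionary layout is an assumption made for this example.

```python
# Illustrative only: the plans and default configurations created in the
# setup steps above, modelled as plain data (not the MTM API).
test_configurations = ["Internet Explorer 8", "Firefox 3.5"]

test_plans = {
    "Sprint 1 Test Plan": {
        "iteration": "Sprint 1",
        "default_configurations": list(test_configurations),
        "suites": {},   # requirement-based suites are added during the sprint
    },
    "Sprint 2 Test Plan": {
        "iteration": "Sprint 2",
        "default_configurations": list(test_configurations),
        "suites": {},   # will also hold the Regression suite
    },
    "Master Test Plan": {
        "iteration": None,  # spans all sprints and milestones
        "default_configurations": list(test_configurations),
        "suites": {"End to End": []},
    },
}
```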

Sprint 1 Testing

  1. Add user stories 1 and 2 for Sprint 1 to the Sprint 1 Test Plan to create two requirement-based test suites.

  2. Check that the test configurations for the test suites for the user stories are correct. By default, each test suite is set up to use the test configurations for the test plan.

  3. Add the acceptance test cases to these test suites for user stories 1 and 2. For example, you might add the following test cases to the appropriate test suite:

    1. User Story 1: Add one item to the shopping cart

    2. User Story 1: Remove an item from the shopping cart

    3. User Story 2: Purchase a single item from the shopping cart

      These test cases are automatically associated with the user story if they are added to the test suite created from that user story. 

  4. If you create any automated tests, you can add these to the test suites. For example, if you have any unit tests or coded UI tests, you can associate these with test cases and add these to the test suites. You can add these tests whenever they are ready during the sprint.

  5. Add any end-to-end test cases that you know you will want to test as the user stories are completed to the end-to-end test suite in the master test plan.

  6. When a user story is ready for testing during the sprint, set the test suite status for the test plan to In Progress. 

  7. From the Run Tests view, you can select the test points that you want to run. A test point is a pairing of a test case with a test configuration. For example, tester A has a machine set up with Internet Explorer 8 only. Tester A selects all the test points for a user story that must run on Internet Explorer 8 and runs these. Tester B selects all the test points for a user story that must run on Firefox 3.5 and runs these. (A short sketch after this list illustrates test points.)

  8. When all the manual and automated tests for the test suite for that user story have been completed, you can view the testing status for that test suite. In the Test activity, choose the Run Tests view. You can also run reports to see the status. Based on the quality goals that you have for each sprint, you can determine if the sprint testing tasks are complete.

  9. When Sprint 1 is complete, you must determine which tests you need to run as regression tests for the next sprint to make sure that the development for the new user stories does not break the functionality for Sprint 1 user stories.

  10. Create a test suite called Regression in the Sprint 2 test plan. Then add the test cases that you identified for these regression tests to this test suite in the Sprint 2 test plan.
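
Step 7 above describes test points. Here is a short sketch of that pairing, again as illustrative Python rather than anything MTM exposes; the names are made up for this example.

```python
# Illustrative only: a test point is a pairing of a test case with a
# test configuration (hypothetical names, not the MTM API).
from itertools import product

suite_test_cases = [
    "Add one item to the shopping cart",
    "Remove an item from the shopping cart",
    "Purchase a single item from the shopping cart",
]
suite_configurations = ["Internet Explorer 8", "Firefox 3.5"]

# Every test case paired with every configuration assigned to the suite.
test_points = list(product(suite_test_cases, suite_configurations))

# Tester A's machine only has Internet Explorer 8 installed.
tester_a_points = [(case, cfg) for case, cfg in test_points
                   if cfg == "Internet Explorer 8"]
for case, cfg in tester_a_points:
    print(f"Run '{case}' on {cfg}")
```

Every configuration you assign to a suite multiplies the number of test points, which is why checking the configurations per suite (step 2) keeps the run list manageable.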

 

Sprint 2 Testing

  1. Add user stories 3 and 4 for Sprint 2 to the Sprint 2 Test Plan to create two requirement-based test suites.

  2. Add the acceptance test cases to these test suites for user stories 3 and 4. For example, you might add the following test cases:

    1. User Story 3: Create a log in account

    2. User Story 3: Check out without creating a log in account

    3. User Story 4: Sign in to the log in account. (You can add parameters to this test case to sign in with different log ins.)

    4. User Story 4: User forgets password

    5. User Story 4: View orders for account

      You can add the test steps when you create the test cases, or another tester can add the test steps when it is clearer what the steps will be.

  3. If you create any automated tests, you can add these to the test suites. For example, if you have any unit tests or coded UI tests, you can associate these with test cases and add these to the test suites. You can add these tests whenever they are ready during the sprint.

  4. Add any new end-to-end test cases that you know you will want to test as the user stories are completed to the end-to-end test suite in the master test plan.

  5. When a user story is ready for testing during the sprint, change the state of the test suite to In Progress. Then run the manual and automated tests for the test suite for that user story.

  6. You can now view the testing status for each test suite from the Run Tests view in the Test activity. You can also run reports to see the status. Based on the quality goals that you have for each sprint, you can determine if the sprint testing tasks are complete.

  7. Run any performance tests or end-to-end tests that are appropriate for this sprint.

  8. When Sprint 2 is complete, you must determine which tests you need to run as regression tests for the next sprint (if there is one) to make sure that the development for the new user stories does not break the functionality for Sprint 2 user stories.

  9. In the test plan for the next sprint (Sprint 3), copy the test suite called Regression from the Sprint 2 test plan. Then add the test cases that you identified for these regression tests to this test suite in the Sprint 3 test plan. 
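
Steps 8 and 9 are where the regression suite for the next sprint gets built. Which tests qualify is the team's call; the sketch below is one hedged example in illustrative Python (the regression flag is an assumption, not an MTM field) that carries flagged test cases from the Sprint 2 suites into the Sprint 3 plan.

```python
# Illustrative only: assembling the Regression suite that moves forward
# with each sprint. The "regression" flag here is an assumption made for
# this example, not an MTM field.
sprint2_suites = {
    "User Story 3": [
        {"title": "Create a log in account", "regression": True},
        {"title": "Check out without creating a log in account", "regression": False},
    ],
    "User Story 4": [
        {"title": "Sign in to the log in account", "regression": True},
        {"title": "View orders for account", "regression": False},
    ],
}

def build_regression_suite(previous_suites):
    """Collect the test cases flagged to be rerun in the next sprint."""
    return [case["title"]
            for cases in previous_suites.values()
            for case in cases
            if case["regression"]]

sprint3_plan_suites = {"Regression": build_regression_suite(sprint2_suites)}
# The same step repeats at the end of every sprint.
```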

Continue this process for each of your sprints. By using this approach you will create a set of test plans for your sprints. You will also build up a test suite of regression tests that is carried forward to the next test plan. For a key milestone such as Beta 1, you might choose to rerun some or all of the tests from your sprints. You can use the same technique: create a test plan for this milestone, named Beta 1, and then copy test suites to this test plan. This way you can record the testing results separately for this test plan and compare them to the individual sprint test plans.

Other Development Methodologies

If you are not following an agile methodology, your development and testing tasks are likely to be based on features. But you may also use requirements instead of user stories. If you use requirements, you can use the approach in the agile development section and create test plans for a specific milestone instead of a sprint and then add the requirements to your test plan. For example, you might have a Beta 1 test plan with all the requirements for Beta 1 added as test suites. You can then add acceptance test cases and unit tests to these test suites and associate your test cases with the requirements. 

If you are using a more feature-based approach, you might have the following features for your Web application:

  1. Shopping Cart (Alpha)

  2. Log In (Alpha)

  3. Check Out (Beta 1)

  4. View Orders (Beta 1)

The following steps assume that these are the features for the project. It is also assumed that a feature will be associated with a specific area path for the team project. You will want to create test cases for these features. You might also want test cases that test more than one feature specifically. For example, you might want to test that a user can add items to their shopping cart, sign in, and purchase the items. By following these steps, you will have a set of test plans as shown in the following illustration.

[Illustration: milestone test plans (Alpha and Beta 1)]

Project Set Up

  1. At the start of your project, create the following test plans. (This is based on the number of milestones that you plan to have.)

    1. Alpha

      This will be used for testing the features that will be available for Alpha.

    2. Beta 1

      This will be used for testing the features that will be available for Beta, including any changes to the features from Alpha feedback or additions to Alpha features.

  2. Determine the test configurations that you need to test these features. For example, you might want to test that these features for your application run on Internet Explorer 8 for configuration 1 and Firefox 3.5 for configuration 2. Then create these test configurations using Microsoft Test Manager.

  3. Add the test configurations that you need for your features to the test plan. By default these will be used for any test suites that you create in the test plan.

Alpha Testing

  1. Add a test suite for Shopping Cart and a test suite for Log In to the Alpha Test Plan. You can create these as static test suites and then add test cases to these suites. 

    Important

    You might also select an area path when you create test cases based on the area of the product that is tested. Area paths often map to features or a set of features. If you do this, then you can create a query-based test suite based on a query for this area path. Whenever you add a test case to this area path, it will automatically be added to the query-based test suite. This can help with maintenance of your test suites. In this example, you could create a query-based test suite for the area path for feature 1 (Shopping Cart) and another for the area path for feature 2 (Log In) instead of the static test suites. (A short sketch after this list shows how a query-based suite behaves.)

  2. Check that the test configurations for the test suites for each feature are correct. By default, each test suite is set up to use the test configurations for the test plan.

  3. Add the test cases to these test suites for their respective features. For example, you might add the following test cases to the appropriate test suite, or just create test cases with the correct values for the area path if you created a query-based test suite:

    1. Shopping Cart: Add one item to the shopping cart

    2. Shopping Cart: Remove an item from the shopping cart

    3. Log In: Log in to a user account

      Note

      You can add the test steps when you create the test cases, or another tester can add the test steps when it is clearer what the steps will be.

  4. If you create any automated tests, you can add these to the test suites. For example, if you have any unit tests or coded UI tests, you can associate these with test cases and add these to the test suites. If you created a query-based test suite based on the area path, you must ensure that the value for the area path is correct for these test cases. You can add these tests whenever they are ready during Alpha testing.

  5. When a feature is ready for testing during the Alpha phase of the project, set the test suite status for the test plan to In Progress. 

    Note

    You might also want to add an exploratory test case to use to do exploratory testing for a feature. This test case can be created with just one test step that you use to explore this feature and record your actions in case you find a bug.

  6. From the Run Tests view, you can select the test points that you want to run. A test point is a pairing of a test case with a test configuration. For example, tester A has a machine set up with Internet Explorer 8 only. Tester A selects all the test points for a feature that must run on Internet Explorer 8 and runs these. Tester B selects all the test points for a feature that must run on Firefox 3.5 and runs these.

  7. When all the manual and automated tests for the test suite for the feature have been completed, you can view the testing status for that test suite from the Run Tests view in the Test activity. You can also run reports to see the status. Based on the quality goals that you have set for Alpha testing, you can determine if the testing tasks are complete.   
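
The query-based suites described in the note under step 1 behave like a saved filter rather than a fixed list of test cases. Here is a rough sketch in illustrative Python (not the MTM API; the WIQL-style query string and the WebStore area paths are only indicative) of why a new test case with a matching area path shows up in the suite automatically.

```python
# Illustrative only: a query-based suite is a saved filter over test
# cases, so new test cases in the matching area path appear in the suite
# automatically. The query string below is only indicative of the kind
# of query such a suite would store.
QUERY = ("SELECT * FROM WorkItems "
         "WHERE [System.WorkItemType] = 'Test Case' "
         "AND [System.AreaPath] UNDER 'WebStore\\Shopping Cart'")

all_test_cases = [
    {"title": "Add one item to the shopping cart", "area": "WebStore\\Shopping Cart"},
    {"title": "Remove an item from the shopping cart", "area": "WebStore\\Shopping Cart"},
    {"title": "Log in to a user account", "area": "WebStore\\Log In"},
]

def query_based_suite(cases, area_path):
    """Resolve the suite's membership at the time it is viewed."""
    return [c["title"] for c in cases if c["area"].startswith(area_path)]

print(query_based_suite(all_test_cases, "WebStore\\Shopping Cart"))

# A test case created later with the same area path is picked up without
# editing the suite:
all_test_cases.append({"title": "Empty the shopping cart",
                       "area": "WebStore\\Shopping Cart"})
print(query_based_suite(all_test_cases, "WebStore\\Shopping Cart"))
```

That automatic pickup is the maintenance benefit the note describes: testers only have to set the area path correctly on new test cases.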

Beta 1 Testing

  1. Copy the test suites from the Alpha test plan to the Beta 1 test plan. 

  2. If you are using static test suites, add a test suite for Check Out and a test suite for View Orders to the Beta 1 Test Plan. If you are using query-based test suites for the area paths, then any tests that you create for area path 1 or 2 will be automatically added to the test suites that you copied over from the Alpha test plan. (The sketch after this list illustrates how copied static and query-based suites differ.)

  3. Add a test suite called End to End to the Beta 1 Test Plan. You can add test cases to this test suite to test the end-to-end scenarios that include more than one feature.

  4. Add the test cases to these test suites for these new features, or just create the test cases with the correct area path values if you are using query-based test suites. You can also add test cases for changes to functionality for Alpha features or new additions to these features. For example, you might add the following test cases:

    1. Check Out: Check out items from shopping cart

    2. Check Out: Check out without creating a log in account

    3. Log In (additional test case): User forgets password

    4. View Orders: View orders for account

    5. End to End: Add an item, login and checkout

      You can add the test steps when you create the test cases, or another tester can add the test steps when it is clearer what the steps will be.

  5. If you create any automated tests, you can add these to the test suites. For example, if you have any unit tests or coded UI tests, you can associate these with test cases and add these to the test suites. You can add these tests whenever they are ready during Beta 1.

  6. When a feature is ready for testing during Beta 1, change the state of the test suite to In Progress. Then run the manual and automated tests for the test suite for that feature.

    Note

    You might also want to add an exploratory test case to use to do exploratory testing for each new feature for Beta 1. This test case can be created with just one test step that you use to explore this feature and record your actions in case you find a bug.

  7. You can now view the testing status for each test suite from the Run Tests view in the Test activity. You can also run reports to see the status. Based on the quality goals that you have for Beta 1, you can determine if the testing tasks are complete.

  8. Run any end-to-end tests that are required for Beta 1.
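
Step 2 notes that copied static and query-based suites behave differently in the new plan. As a rough illustration (plain Python, not the MTM API; the suite layout is an assumption), a static suite copies the test cases it holds at copy time, while a query-based suite carries its query and keeps picking up new matching test cases.

```python
# Illustrative only: what copying suites from the Alpha plan to the
# Beta 1 plan means for the two suite types (not the MTM API).
def copy_suite(suite):
    if suite["type"] == "static":
        # A static suite copies the test cases it contains right now.
        return {"type": "static", "test_cases": list(suite["test_cases"])}
    # A query-based suite carries its query, so test cases added later
    # for the same area path still show up in the Beta 1 plan.
    return {"type": "query", "query": suite["query"]}

alpha_suites = {
    "Shopping Cart": {"type": "query",
                      "query": "AreaPath UNDER 'WebStore\\Shopping Cart'"},
    "Log In": {"type": "static",
               "test_cases": ["Log in to a user account"]},
}

beta1_suites = {name: copy_suite(s) for name, s in alpha_suites.items()}
# New Beta 1 suites (Check Out, View Orders, End to End) are then added
# alongside the copied ones.
```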

If you have more milestones for your project, you can continue this process for each of your milestones. By using this approach you will create a new test plan for each milestone. You will also build up a test suite of end-to-end tests that are copied over to the test plan for the next milestone. If you do not have sufficient time to run all the tests in a test suite from a previous milestone, you might limit the tests in the test suite that you have copied over. For example, you might limit this to Priority 1 tests only. If you are using query-based test suites, then you can change the query to add in the priority. If you are using static test suites, you can just remove test cases that you do not need to rerun for the milestone.
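
As a final illustration of that trimming step (again just a sketch in Python, with an assumed priority field; the WIQL fragment in the comment is only indicative), you can filter a copied static suite down to Priority 1 tests, or tighten the query of a query-based suite.

```python
# Illustrative only: limiting a copied suite to Priority 1 tests.
# The field name and priority scale are assumptions for this example.

def trim_static_suite(test_cases, max_priority=1):
    """Keep only the high-priority test cases in a copied static suite."""
    return [c for c in test_cases if c["priority"] <= max_priority]

# For a query-based suite, the equivalent is adding a priority clause to
# the suite's query, for example (indicative WIQL only):
#   ... AND [Microsoft.VSTS.Common.Priority] = 1

copied_suite = [
    {"title": "Add an item, login and checkout", "priority": 1},
    {"title": "User forgets password", "priority": 3},
]
print(trim_static_suite(copied_suite))
# [{'title': 'Add an item, login and checkout', 'priority': 1}]
```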
