Integration Testing Principles

As I've written before, integration tests are different from unit tests, but that does not mean that unit testing tools such as Visual Studio Team System can't or shouldn't be used to define integration tests. However, many integration tests will, by their nature, require the presence of external infrastructure, such as a relational database, web services, queues, etc. This may seem to conflict with the ambition that a test suite should be fully automated and driven only by its code.

Basically, it should be possible to run a unit test suite simply by xcopying the unit testing code (and the test target code) to a machine, building the projects, and running the tests. This enables a build server to perform build verification tests in an automated manner, or a new developer to get started with a software project simply by getting the latest source from a source control system, building it, and running the tests. With unit tests, this should usually be the case.

Integration tests may not be able to fully meet the xcopy requirement, since they often rely on external infrastructure, but that doesn't mean you shouldn't adopt a similar set of principles. More specifically, I recommend that integration tests meet the following requirements:

  • Configuration should be minimal. If the integration test requires the presence of an external resource (such as a database, web service, etc.), the software implementing this resource needs to be installed on all machines where the test will run; that is, if you need to test data access logic, SQL Server must be installed; if you need to test queueing, MSMQ must be installed, etc. However, that doesn't mean that you should also require a user to configure a database on SQL Server, a queue on MSMQ, etc. Many products allow you to automate configuration, so this configuration should be part of the initialization and clean-up logic for the test suite. The end result is that you should only require minimal configuration to enable the test suite to run; often, this is equivalent to requiring that the product is installed on the machine, and that the test code has privileges to perform automated configuration.
  • Test cases should be independent. This is a requirement inherited from unit testing in general, but in integration testing it can often be more difficult to achieve. Particularly when you are dealing with a persistent store (such as a database or transacted queue), a test case will often leave the store in a different state than before it executed (e.g. if a test case deletes a row from a database table). A corollary to test case independence is that all test cases should begin in a known state, which means that it is necessary to write test initialization code that ensures that the external resource is in a known state.
  • Tests should be efficient. A less important, but still worthwhile, ambition is that tests should execute as quickly as possible. While test case independence could be achieved by completely unconfiguring the external resource and then reconfiguring it before each test, that is rarely the fastest solution. With a database, for example, you could drop and recreate the database before each test, but it is usually much faster to simply clear out the data from all tables between test cases, as the sketch after this list illustrates.
  • The test suite should clean up after itself. When the test run is finished, it should leave the test machine in the same state as before it started. If it created any databases in SQL Server, it should delete these databases again; if it created any queues in MSMQ, it should remove these queues again, etc.
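
To make these principles concrete, here's a minimal sketch of a helper class that automates SQL Server configuration for a test suite. The IntegrationTestDb database name, the connection strings, and the table names are all hypothetical assumptions; a real suite would execute its own schema script and delete from its own tables.

    using System.Data.SqlClient;

    // Hypothetical helper that automates all SQL Server configuration for
    // the test suite; the database name, connection strings, and table
    // names are assumptions that you would replace with your own.
    internal static class DatabaseHelper
    {
        private const string MasterConnectionString =
            "Server=(local);Database=master;Integrated Security=SSPI;";
        private const string TestConnectionString =
            "Server=(local);Database=IntegrationTestDb;Integrated Security=SSPI;";

        // Minimal configuration: the only manual prerequisite is an
        // installed SQL Server instance; the database itself is created here.
        internal static void CreateDatabase()
        {
            ExecuteNonQuery(MasterConnectionString,
                "CREATE DATABASE IntegrationTestDb");
            // Create tables etc. here, e.g. by executing a schema script.
        }

        // Clean-up: drop the database again so the machine is left in the
        // same state as before the test run.
        internal static void DropDatabase()
        {
            // Close pooled connections so the DROP statement can succeed.
            SqlConnection.ClearAllPools();
            ExecuteNonQuery(MasterConnectionString,
                "DROP DATABASE IntegrationTestDb");
        }

        // Independence and efficiency: deleting all rows returns the store
        // to a known state much faster than dropping and recreating the
        // entire database.
        internal static void DeleteAllRows()
        {
            // Delete from child tables before their parents to satisfy any
            // foreign key constraints; these table names are placeholders.
            ExecuteNonQuery(TestConnectionString, "DELETE FROM OrderLines");
            ExecuteNonQuery(TestConnectionString, "DELETE FROM Orders");
            ExecuteNonQuery(TestConnectionString, "DELETE FROM Customers");
        }

        private static void ExecuteNonQuery(string connectionString, string sql)
        {
            using (SqlConnection connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (SqlCommand command = new SqlCommand(sql, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }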

To perform initialization logic before the first test case is executed, you can use the AssemblyInitialize attribute with Visual Studio Team System, and to clean up after the last test case, you can use the AssemblyCleanup attribute. To reset the state of the external resource before each test case, you can use the TestInitialize attribute.
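
As a sketch of how that wiring might look, the following test class uses these attributes to call the hypothetical DatabaseHelper sketched above; the test method is only a placeholder.

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class DataAccessIntegrationTest
    {
        // Runs once, before the first test case in the assembly: create
        // and configure the external resource.
        [AssemblyInitialize]
        public static void InitializeAssembly(TestContext testContext)
        {
            DatabaseHelper.CreateDatabase();
        }

        // Runs before each test case: return the resource to a known
        // state so that test cases stay independent.
        [TestInitialize]
        public void InitializeTest()
        {
            DatabaseHelper.DeleteAllRows();
        }

        // Runs once, after the last test case: leave the machine in the
        // same state as before the test run started.
        [AssemblyCleanup]
        public static void CleanupAssembly()
        {
            DatabaseHelper.DropDatabase();
        }

        // Placeholder test method: this is where the data access
        // component would be exercised against the real database.
        [TestMethod]
        public void SaveCustomerWillPersistCustomerInDatabase()
        {
        }
    }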

In a future post, I will describe in detail how to apply these principles while testing a data access component against a SQL Server database.

Comments

  • Anonymous
    November 17, 2006
    In a typical n-layer enterprise application, there's almost always a data access layer with one or more

  • Anonymous
    December 04, 2006
    In my previous post about unit testing WCF services , I hinted at the need to perform integration testing

  • Anonymous
    December 20, 2006
    In my post about integration testing of WCF services , I hinted that one compelling reason to perform

  • Anonymous
    September 15, 2007
    If at all applicable, a well-written application should include one or more performance counters, which

  • Anonymous
    November 28, 2007
    It's no secret that I prefer unit tests over integration tests. Whenever it's possible to replace an

  • Anonymous
    May 13, 2010
    Hi: As part of my software testing course, I want to do integration testing of my course using Visual Studio 2010. My questions are:
    1. Is this possible?
    2. If yes, how can I do it? Please help me!

  • Anonymous
    May 13, 2010
    1. Yes
    2. Write the test code

  • Anonymous
    January 29, 2013
    Hi ploeh, I am doing my Master's thesis on integration testing and the testing tools used by Microsoft for their products. Can I have some help in the form of articles or study links that would help me evaluate integration testing and Microsoft's market tools? Thanks. Hussain, Linköping University, Sweden