Integration testing is not so easy to grasp, because many different things can be integrated: modules, components, layers, services or entire systems. Integration tests also differ from one another in a number of aspects, such as the test environment, the testers, the test data or even the test basis. In the practice of software testing, two types of integration can be roughly distinguished: component integration and system integration.
I will discuss the special features of these two test levels separately. Here I will first give an overview of what all integration tests have in common and what should be considered. As integration involves at least two parts, different developers, teams or stakeholders are often involved. These distributed responsibilities increase the amount of coordination required for testing. In practice, I often find that integration testing is ignored and postponed for too long as a result. In addition, a kind of paralysis often sets in: Who should start? Who is in charge? Where do we begin? For this reason, I have dedicated an entire book to this topic: “The integration test - from design and architecture to component and system integration”.
Integration testing became immensely important when object orientation took off: the number of parts to be integrated, and with it the testing possibilities, suddenly grew by leaps and bounds.
The ISTQB defines integration testing as: “A level of testing focusing on the interaction between components or systems”, and further distinguishes between component integration testing and system integration testing.
I find this subdivision very appropriate, as it already covers two large areas. It remains essential, however, to look at your own context and identify which other integrations are important: environments, databases, infrastructure, data, layers, etc. And, of course, to define them: what does environment integration mean for us?
All information that describes the interaction of the individual parts serves as the test basis for the integration test: interface specifications, sequence diagrams, use cases,…
Here, too, it is important to note that there are non-functional aspects as well: How fast should the exchange be? How much load is placed on the connection? How robust is the connection? Here is an example from one of my projects: three systems from different manufacturers were linked. To save effort, the interfaces were specified very permissively. That makes integration easy, because not much needs to be coordinated. However, it also means that errors in the data and structures are simply passed on, possibly to the next system and the one after that. That is exactly what happened here, and troubleshooting became incredibly time-consuming.
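One countermeasure is to validate data at the interface boundary instead of forwarding it blindly. Here is a minimal sketch of this idea in Python; the message format and the field names (order_id, quantity) are made up for illustration:

```python
# Minimal sketch: reject bad data at the interface boundary instead of
# passing it on to the next system. Field names are hypothetical.

def validate_order_message(msg: dict) -> list[str]:
    """Return a list of problems; an empty list means the message is acceptable."""
    problems = []
    if not isinstance(msg.get("order_id"), str) or not msg["order_id"]:
        problems.append("order_id must be a non-empty string")
    if not isinstance(msg.get("quantity"), int) or msg["quantity"] <= 0:
        problems.append("quantity must be a positive integer")
    return problems

def receive(msg: dict) -> None:
    problems = validate_order_message(msg)
    if problems:
        # Fail loudly here instead of handing bad data to the next system.
        raise ValueError("; ".join(problems))
    # ... forward the validated message ...
```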
Structured test design techniques such as equivalence partitioning, boundary value analysis, decision tables or state-based testing are suitable for creating test cases. A large number of test cases can be derived from the description of the interfaces and the interaction of the parts.
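As a small illustration, here is a sketch of how equivalence classes and boundary values translate into parameterized integration test cases. The interface under test, set_discount(), its valid range of 0 to 100, and the shop_client module are assumptions made up for this example:

```python
# Sketch: integration test cases derived from equivalence partitioning
# and boundary value analysis. set_discount() and its 0..100 range are
# hypothetical.
import pytest

from shop_client import set_discount  # hypothetical client for the remote interface

@pytest.mark.parametrize("percent, should_succeed", [
    (-1, False),   # boundary value just below the valid range
    (0, True),     # boundary value: lower edge
    (50, True),    # representative of the valid equivalence class
    (100, True),   # boundary value: upper edge
    (101, False),  # boundary value just above the valid range
])
def test_discount_interface(percent, should_succeed):
    response = set_discount(percent)
    assert response.ok == should_succeed
```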
Negative tests, i.e. test cases that use invalid structures or data, deserve special attention here, because such inputs can quickly bring down an interface or a whole workflow.
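A minimal sketch of such negative tests, assuming a hypothetical HTTP endpoint and using the common requests library; the URL and the payloads are invented:

```python
# Sketch: deliberately send structurally broken payloads and check that
# the interface answers with a controlled error instead of crashing.
import requests

BROKEN_PAYLOADS = [
    "{not valid json",    # syntactically broken
    '{"order_id": 42}',   # wrong type: number instead of string
    '{"quantity": 1}',    # required field missing
]

def test_interface_rejects_broken_payloads():
    for payload in BROKEN_PAYLOADS:
        response = requests.post(
            "http://test-env.example/orders",  # hypothetical test endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
            timeout=5,
        )
        # Expect a controlled 4xx error, never an unhandled 5xx crash.
        assert 400 <= response.status_code < 500
```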
When creating test cases, gaps in the specification will inevitably become apparent and questions will arise. These need to be clarified quickly with the software architects, developers or business departments.
The test object in integration testing consists of at least two integrated parts of the whole. In practice, there are various integration approaches, e.g. top-down or bottom-up. Depending on the strategy chosen, missing parts must be simulated with stubs or drivers so that the tests can be carried out, as in the sketch below.
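Here is a minimal sketch of such a simulation in a top-down step: the upper component is real, while the not-yet-integrated lower component is replaced by a stub. OrderService, the orders module and the payment interface are hypothetical names:

```python
# Sketch of a top-down integration step: the component under test is
# real, the missing lower component is simulated by a stub/mock.
from unittest.mock import Mock

from orders import OrderService  # hypothetical component under test

def test_order_service_with_stubbed_payment():
    payment_stub = Mock()
    payment_stub.charge.return_value = {"status": "accepted"}

    service = OrderService(payment=payment_stub)
    result = service.place_order(order_id="A-1", amount=99.90)

    # The interaction with the simulated part can be verified precisely.
    payment_stub.charge.assert_called_once_with(amount=99.90)
    assert result.confirmed
```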
The aim of the integration test is to check the interaction of the parts via interfaces or workflows. The primary focus is on three questions: Is the data correct? Are the structures correct? Are the non-functional requirements met?
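A compact sketch of what checking all three questions in a single integration test can look like; the endpoint, the field names and the response-time budget of one second are assumptions:

```python
# Sketch: one integration test covering data, structure and a
# non-functional aspect (response time). All names are hypothetical.
import time
import requests

def test_customer_lookup_integration():
    start = time.perf_counter()
    response = requests.get("http://test-env.example/customers/4711", timeout=5)
    elapsed = time.perf_counter() - start

    body = response.json()
    assert body["customer_id"] == "4711"                   # is the data correct?
    assert set(body) == {"customer_id", "name", "status"}  # is the structure correct?
    assert elapsed < 1.0                                   # non-functional: fast enough?
```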
The test environments differ for the various integration tests. Component integration tests may still be able to run in the development environment, but for system integration tests you need a dedicated test environment, just as you do for the system test.
An important aspect here is monitoring: in order to analyze the communication between the parts, you need tools that make it visible. There is a potential pitfall, however: these tools can change the communication. Perhaps not in terms of content, but possibly in terms of runtime.
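One simple way to make communication visible is a recording proxy around the real client, as in the sketch below. The send() method is an assumption about the client's interface; note that the wrapper itself adds a little runtime, which is exactly the caveat mentioned above:

```python
# Sketch: a recording proxy that logs every message and its duration.
# The wrapped client's send() interface is hypothetical.
import time

class RecordingClient:
    def __init__(self, real_client):
        self._client = real_client   # any object offering send(message)
        self.log = []                # recorded (message, reply, duration) triples

    def send(self, message):
        start = time.perf_counter()
        reply = self._client.send(message)
        duration = time.perf_counter() - start
        self.log.append((message, reply, duration))  # visible for later analysis
        return reply
```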
Test data management also depends heavily on the integration level. For component integration tests, the test data of the unit tests can serve as a guide; for system integration tests, use the test data of the system test.
Test automation at the development-related integration levels is a no-brainer; there are plenty of tools that provide support here. It becomes more difficult at higher integration levels such as system integration testing, because at least two different systems have to be automated. If these also differ technologically, this can massively increase the effort required for test automation.
You can also find a few ideas in my book “Basiswissen Testautomatisierung”.
The model of test levels comes from a time long before agile projects, and it is therefore often ignored in Scrum and the like. But if we take a closer look at the model, we can see how important the ideas and aspects of the test levels are in agile contexts too. Integration tests in particular are extremely important: they help to ensure robustness on a small scale (between components) and on a large scale (with other services and systems). The integration test also appears again and again in variants of the test pyramid.
When I’m new to a project, there’s often a problem with integration tests:
Unit, system and acceptance tests can be found in a wide variety of forms. But integration tests often exist only as part of the unit or system tests - or not at all. More awareness is definitely needed here.
Where integration tests do exist, the biggest problem is usually organizational: Who is responsible for the integration? This or that system? This or that project? This or that developer or architect? If this has not been decided, the result is that nobody feels responsible, and the topic receives no focus. Some of my customers therefore appoint a dedicated test manager for integration testing, which gives the topic more weight.
Integration tests are essential and will become even more important in the future: systems and components are ever more interconnected, and services and APIs are provided everywhere. At the same time, this also contributes to increasingly efficient testing. In everyday life, things are not always so easy, as infrastructure issues and responsibilities need to be clarified. But you have to get through this valley of tears.
The tools that support this are getting better and better, and there is a wealth of options close to development. The use of standardized interface frameworks also makes it easier to implement and test integrations across systems.
There is only one thing to avoid: not doing any integration tests.