
Digitalization leads to DevOps

Modern IT systems are never and always finished

Digitalization requires the continuous development, testing and delivery of the application software in use. These three activities should no longer be carried out separately, but together. Users are involved in the development process.

DevOps is a response to the demand for continuous development. In principle, modern IT systems are never finished. They have only ever reached a stage where some of the user requirements have been satisfied, while both the requirements and the surrounding conditions continue to evolve. The challenge for DevOps is to set up the systems in such a way that they can be developed further at all times and yet are always ready for use at a defined level of quality. Users must accept that the system is constantly changing, but they must also be able to rely on what has already been achieved while development continues.

Digitalization and DevOps

Digitalization is a very abstract term for the automation of all areas of human life. A formal definition reads: “The term digitization generally refers to the changes in processes, objects and events that occur with the increasing use of digital devices. In the original and narrower sense, it is the creation of digital representations of physical objects, events or analog media.” The prerequisite is that all data about these areas of life is recorded in digital form, i.e. as bits and bytes. In analog data processing, data is interpreted as continuously varying signals or changes of state; in digital processing, the states are recorded as bit patterns that can be stored and processed again and again. Digitization thus gives us the opportunity to automate all operational applications and to redesign them at will.

The problem is that automating an application is not a one-off project. Systems change regardless of their degree of automation, and when they are automated, they only change faster. Once you have started to automate an application, you have to keep at it; there is no going back, and the know-how required to operate the system manually declines in the process. As promising as it may sound, digitalization is a pact with the devil. Automation also means dependency. Goethe’s Faust sends his regards.

Goals of digitization

There are three main objectives when digitizing an organization:

  • the electronic storage of as much data as possible,
  • the automation of as many processes as possible,
  • the networking of as many actors in the system as possible.

Data is all the information that the system needs to fulfill its function, for example customer data, employee data, order data, production data and storage data. Data can be stored on electronic storage media, on paper or in the heads of employees. The aim of digitization is to have all data on electronic storage media, as this is the only way to make it accessible to everyone and everything.

Processes are operational procedures such as order processing, production planning and control, warehousing, delivery and invoicing. Processes are carried out by actors. The actors can be people or machines. In a semi-automated process, the actors are partly people and partly machines. In a fully automated process, only machines are at work. Many processes can be fully automated. Others are only partially automated. One goal of digitalization is to automate as much as possible.

In an organization, the actors communicate with each other and with actors outside their own system. They are networked with one another and can exchange data via paper documents, personal conversations or electronic messages. Devices, as actors, can also communicate with each other directly rather than via people, which is why we speak of the “Internet of Things” (IoT). A company is a network of communicating nodes, and one goal of digitalization is to connect as many of these nodes as possible, so that every person can exchange data with every other person and every device with every other device. In principle, digitization makes it possible to react more quickly to changes in the environment. Both the data and the operations on the data are recorded as bit patterns, and bit patterns can easily be overwritten. New bit patterns are created in a matter of seconds, reshaping the meaning of the data and of its processing. This gives us the opportunity to constantly change our work processes.

It all sounds so simple: record the data electronically, program the processes and digitize the communication. But everything that is “digitized” must also be tested. Errors can creep in, incorrect information can be included or, even worse, dangerous combinations of data can arise. Nothing changes in this respect. Testing blocks the rapid implementation of requirements, especially if testing and maintenance are separated from the implementation of requirements. When a new law was passed, it used to take months before it was implemented in the software. This is no longer justifiable in our fast-moving age. Testing and handover must be accelerated.

The path to digitalization

First of all, we need to understand exactly what is new about digitization. We have had electronic data processing for a long time, automated processes too, and broadband data transmission anyway. Nor is it news that wireless communication plays an important role in our everyday lives. What may be new is the combination of all of this. Ultimately, a company is digitized when all its data is stored electronically, when all its work processes are fully automated and when all its employees are connected to each other and to their devices.

The question arises as to whether it is really sensible to go this far or whether we should stop halfway. How far do we want to take digitalization? Here, too, testing currently sets the limits. We can only store, automate and transmit electronically what we can vouch for. The data that we store electronically must not be corrupted. The questions we ask of the data management system must be answered correctly. The processes that we initiate must be carried out in accordance with the applicable regulations or business rules. And the messages we send must not be altered or intercepted en route. This means that everything we digitize must be tested and secured.

Digitization means an enormous amount of testing for us. It starts with testing the data and continues with testing the processes and the data transfer. This effort can only be managed if we proceed systematically and if the test itself is automated as far as possible. The degree of digitization can be understood as the sum of three proportions: the proportion of electronically stored data relative to all operationally relevant data, the proportion of automated processes relative to all operational processes, and the proportion of networked system nodes relative to all system nodes.
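Written out as a formula, this reading of the degree of digitization is simply the sum of three ratios; the symbol D and the normalization mentioned in the comment are assumptions of this sketch, not part of the definition above.

```latex
% Degree of digitization D as the sum of three proportions.
% Dividing the sum by three, so that D lies between 0 and 1,
% would be an optional refinement and is only an assumption here.
D = \frac{\text{electronically stored data}}{\text{all operationally relevant data}}
  + \frac{\text{automated processes}}{\text{all operational processes}}
  + \frac{\text{networked system nodes}}{\text{all system nodes}}
```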

It may be that this measure is not relevant from a business perspective, but as a goal for IT it is a good one to start with. True digitalization, if there really is such a thing, means drastic changes to existing products and business models, i.e. new systems. Whether users are ready for this or have a choice is an open question.

Electronic storage of all operational data

The fact that we are physically able to store all data has been undisputed since the introduction of cloud computing. Cloud providers offer us practically unlimited storage capacity. The question is whether we are also storing useful data and not just junk data. The data must also be correct; invalid values must not be allowed to creep in. Accordingly, there are two criteria for the data:

  • the data is useful,
  • the data is correct.

To determine whether the data is useful, it must be tested with all operational applications, including the query and reporting systems. This checks which data is used and which is not. The aim of this test is to create an inventory of all the data used. It is up to the users to decide whether the applications themselves are useful. To determine whether the data is correct, it must be checked against the rules of correctness. There are two ways of doing this. Firstly, the data can be compared with each other. New data can be checked against old data or data in one application can be checked against data in another application. The data should have identical value ranges wherever it is used.

Secondly, the actual values can be compared with the target values. The simplest type of check is a visual inspection of the data by a specialist. However, this type of check is time-consuming and error-prone. The most elegant type of data check is the automatic value check against rules (assertions). The assertions specify which values the data concerned should have. They can be defined as value ranges, alternative values, calculated values or relations to other values, among other things. The purpose is to identify and sort out invalid values.

However, this automated approach requires rules to be specified for all data elements. There is no 100% guarantee that the data is really correct with either approach, but it does help to identify the majority of invalid data.
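As an illustration, an automated value check against such assertions could look roughly like the following sketch; the customer fields and the rules are assumptions made up for this example, not taken from any real system.

```python
# Minimal sketch of an automated value check against assertions.
# The customer fields and the rules are illustrative assumptions only.
ASSERTIONS = {
    "age":      lambda v: isinstance(v, int) and 0 <= v <= 120,      # value range
    "status":   lambda v: v in {"active", "dormant", "closed"},      # alternative values
    "discount": lambda v: isinstance(v, float) and 0.0 <= v <= 0.3,  # value range
}

def find_invalid_values(records, assertions):
    """Check every record against the assertions and collect the violations."""
    violations = []
    for record in records:
        for field, rule in assertions.items():
            if field in record and not rule(record[field]):
                violations.append((record.get("id"), field, record[field]))
    return violations

customers = [
    {"id": 1, "age": 42,  "status": "active",  "discount": 0.10},
    {"id": 2, "age": 160, "status": "unknown", "discount": 0.50},  # invalid values
]

# Invalid values are identified and sorted out for correction.
for record_id, field, value in find_invalid_values(customers, ASSERTIONS):
    print(f"record {record_id}: invalid {field} = {value!r}")
```

A relation to another value, for example that a discount is only allowed for active customers, could be expressed as a rule that receives the whole record instead of a single field.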

Despite the acceleration of the release process, the quality of the product must not be sacrificed. Quality assurance of electronically stored data and automated processes is an indispensable prerequisite for operational digitalization. This can only succeed if the users are involved. The users must accompany the release process and accept the individual steps immediately. This means that representatives of the end users must be involved right from the start. They are indispensable for ensuring quality.

Automation of operational processes

Many processes within today’s organizations are already automated, but in some cases with very old software systems - so-called legacy systems. Where processes have a high degree of standardization, the software solutions are packaged as standard systems. This applies to a large proportion of business applications. More and more production systems, logistics systems and traffic management systems are also covered by standard products. Off-the-shelf standard packages are the preferred solutions for process automation. They are cheaper, more reliable and easier to maintain. Above all, the supplier vouches for their quality and ensures their further development. Standard systems have already been tested many times and have a level of reliability that individual solutions can rarely achieve.

Nevertheless, many users still feel compelled to knit their own customized solutions. They can start on a greenfield site and build everything from scratch, or they can start with ready-made components and assemble them into a whole. They can still develop missing components themselves. It is important that the new application systems consist of as many prefabricated and pretested components or services as possible.

A service-oriented architecture with microservices is a promising way to achieve this goal. Despite advances in software technology, new developments are still associated with enormous costs and risks, and the cost of testing automated processes makes up a large part of that. To be considered trustworthy, every possible flow path with all possible states must be tested. In short, without automation it can take years to adequately test large, complex processes. Test automation can speed up testing, but even automated testing has its limits: humans still have to provide the test data, and only the subject-matter expert can specify the expected results. Once the test of a new application is in place and proven, it can be automatically varied and repeated, but it has to get that far first.
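To make the idea of repeating a proven test with varied data a little more concrete, here is a minimal sketch; the invoice calculation, the baseline case and the expected values are assumptions for this example only.

```python
# Minimal sketch of a data-driven regression test: a proven baseline case,
# specified by a subject-matter expert, is repeated with varied input data.
# The invoice calculation and all values are illustrative assumptions.
import unittest

def calculate_invoice(quantity: int, unit_price: float) -> float:
    # Stand-in for the automated process under test: net amount plus 20% VAT.
    return round(quantity * unit_price * 1.20, 2)

# (quantity, unit_price, expected_total) - expected values come from the expert.
TEST_CASES = [
    (10,     5.00,     60.00),  # the proven baseline case
    (0,      5.00,      0.00),  # variations derived from the baseline
    (1,      5.00,      6.00),
    (100,    5.00,    600.00),
    (10_000, 5.00, 60_000.00),
]

class InvoiceRegressionTest(unittest.TestCase):
    def test_all_cases(self):
        for quantity, unit_price, expected in TEST_CASES:
            with self.subTest(quantity=quantity):
                self.assertEqual(calculate_invoice(quantity, unit_price), expected)

if __name__ == "__main__":
    unittest.main()
```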

Testing therefore remains the main obstacle to automating processes. The required software is quickly written or generated, but confirming it in all facets as a complete and reliable solution can take a long time and cost a great deal. As long as humans develop the software, it will contain errors, and faulty software makes it risky to automate operational processes. The risks of automated processes - like the risks of electronically stored data - are reduced by testing.

The quality of new software can also be determined by static analysis, but only dynamic testing can reveal whether the automated process really behaves as it should. Anything that helps to provide this proof faster and more cost-effectively is to be welcomed, but the fastest and cheapest solution remains the use of already proven software, either as a whole or as individual building blocks.

Networking of all network nodes

The third pillar of digitalization, alongside electronically stored data and automated processes, is the omnipresent exchange of data. Information obtained must be transmitted everywhere and at all times, from the sensors on the devices, from the monitors that oversee those automated processes, from the data storage devices and from the people who record data. All elements of a system, the people as well as the machines, must be networked with each other. There is human-to-human, human-to-machine and machine-to-machine communication - see also “Internet of Things”.

However, communication must function flawlessly. Software can monitor itself, but to do so it must be able to send and receive messages. The exchange of digitized messages is the glue that holds a digital environment together. It must therefore be interference-free and secure.

Secure communication is communication in which everything that is sent is also received. What is sent must not be intercepted, copied or altered en route. The messages that are transmitted must remain sealed; only the intended recipient may open them. This applies to people and devices alike.

Increasingly, devices and software components are sending messages to each other. The sender of a message - whether human, hardware device or software service - must be certain that the data sent will actually reach its destination unchanged and the recipient must be certain that the data received has not been misused, manipulated or intercepted en route. This can only be guaranteed via a secure communication network. Data transmission must therefore also be tested. We need test tools to generate and send messages, tools to receive and validate messages, and tools to track the messages as they travel through the network. These tools must recognize when and where messages are intercepted. The communication test ensures that the data transmission works properly.
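A very small sketch of such a send-and-validate check is shown below; the shared secret, the message fields and the function names are assumptions for this example, and key distribution is deliberately left out.

```python
# Minimal sketch of a message integrity check with a shared secret (HMAC).
# All names and values are illustrative; key management is out of scope.
import hashlib
import hmac
import json

SHARED_SECRET = b"replace-with-a-real-key"

def sign_message(payload: dict) -> dict:
    """Sender side: serialize the payload and attach an HMAC signature."""
    body = json.dumps(payload, sort_keys=True)
    signature = hmac.new(SHARED_SECRET, body.encode("utf-8"), hashlib.sha256).hexdigest()
    return {"body": body, "signature": signature}

def verify_message(message: dict) -> bool:
    """Receiver side: recompute the signature and compare in constant time."""
    expected = hmac.new(SHARED_SECRET, message["body"].encode("utf-8"),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["signature"])

# A communication test would send such messages through the real network,
# tamper with some of them on purpose and check that verification fails.
msg = sign_message({"order_id": 4711, "amount": 99.90})
assert verify_message(msg)                          # unmodified message passes
msg["body"] = msg["body"].replace("99.9", "199.9")  # simulated manipulation en route
assert not verify_message(msg)                      # manipulated message is rejected
```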

Monitoring data transmission plays a special role in testing digitized companies. This is because it is not only about the correctness of message transmission, but also about its security. Interventions in the communication network must be simulated in order to test how the network reacts to them. Interception, redirection and unauthorized use of messages must be ruled out. The communication network can only be considered secure if all these simulated attacks are recognized and rejected. This also includes attempts by unauthorized persons to break into the network. Of course, all intrusion possibilities must be tested. The security of message transmission is an essential prerequisite for digitization.

This shows how much effort is involved in testing communication. Anyone who is serious about digitalization must be prepared to pay a high price for quality. Otherwise they will always remain vulnerable.

Digitalization requires continuous testing

In order to meet the challenge of digitalization, we need to look at how we can make progress with testing.

As there is so much to test in such a short space of time, test automation must be driven forward. But that alone is not enough. The users must be involved in the test; they must not wait on the sidelines until it is finally declared complete. In fact, the users are responsible for the test, because only the end users can decide whether the tested product is fit for use. They bear the risk if something goes wrong. Of course, they are advised by experts, but only the end user can decide whether a product is ready for use.

The user is also responsible for checking the data. They must know what state the data is in and whether this state is sufficient for operation. This means that the data must be continuously monitored by automated systems and the slightest deviations from the target must be reported. The data is monitored for as long as it is still being used. The same applies to the automated processes.

As the processes are constantly being further developed, they must also be constantly tested. The test never ends. With every change, no matter how small, the software must be retested. Like the data status, the process sequences must also be permanently monitored and every deviation from the target behavior must be recorded. Finally, all communication processes must be tracked and all messages checked. Messages that deviate from the standard must be sorted out for closer examination. The content of all messages must be checked against a specification of those messages and the deviation reported.
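As a sketch, checking an incoming message against such a specification could look roughly like this; the order message, its fields and the rules are purely illustrative assumptions.

```python
# Minimal sketch of checking message content against a specification.
# The message fields and the rules are illustrative assumptions only.
ORDER_MESSAGE_SPEC = {
    "order_id": {"type": int},
    "customer": {"type": str},
    "amount":   {"type": float, "min": 0.0},
    "currency": {"type": str, "allowed": {"EUR", "USD", "CHF"}},
}

def check_message(message: dict, spec: dict) -> list:
    """Return a list of deviations from the specification; empty means OK."""
    deviations = []
    for field, rule in spec.items():
        if field not in message:
            deviations.append(f"missing field: {field}")
            continue
        value = message[field]
        if not isinstance(value, rule["type"]):
            deviations.append(f"{field}: wrong type {type(value).__name__}")
        if "min" in rule and isinstance(value, (int, float)) and value < rule["min"]:
            deviations.append(f"{field}: value {value} below minimum {rule['min']}")
        if "allowed" in rule and value not in rule["allowed"]:
            deviations.append(f"{field}: value {value!r} not allowed")
    return deviations

# Deviating messages are reported and sorted out for closer examination.
suspicious = {"order_id": 4711, "customer": "ACME", "amount": -5.0, "currency": "XXX"}
for deviation in check_message(suspicious, ORDER_MESSAGE_SPEC):
    print("DEVIATION:", deviation)
```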

Digitalization requires products instead of projects

Anyone who dares to enter the digitalization arena must abandon project thinking. Projects are one-off activities that are limited in terms of time and cost and have a beginning and an end. In a digitalized world, there should no longer be any time-limited projects, only products, i.e. systems that are continuously developed, tested and delivered.

Testing takes place as part of the operational handover. Previously, there was always a separation between development, testing and acceptance. There were walls between the activities. In DevOps, the activities flow into one another.

Exploratory testing and use can hardly be separated from each other; the end user is both tester and user. If errors occur, the users try to work around them. The errors are reported in any case, but they must not stop the system from going live, provided they do not prevent further operation.

DevOps promises to be the solution

DevOps promises to be the solution to this problem. Because more and more software systems pursue a moving target, they can never be considered a finished product. They remain in development for life. They only ever cover a subset of the requirements, and that subset becomes smaller over time. Initially, a system might cover over 90 percent of the digitization requirements, but over time this proportion drops to 80 percent and below. Software systems have always lost value over time; Belady and Lehman recognized this phenomenon back in the 1970s and recorded it in their laws of software evolution.

A software system is the representation of a certain application, for example travel booking. It becomes obsolete to the extent that it no longer keeps pace with that application, i.e. the travel provider wants to offer more and more services that are not covered by the software. The software falls further and further behind and is of less and less value to the user of the system.

At the time when Belady and Lehman formulated their laws of software evolution, the rate of change was still manageable with conventional change management. The new requirements remained below 5 percent of the total requirements. Users submitted change requests for requirements that were not covered or only inadequately covered. These were prioritized, included in the planning of the next releases and sooner or later implemented. Later, the change rate for more and more systems rose above 10 percent annually. Conventional change management soon reached its limits, the functionality of the software fell further and further behind, and the pressure on IT kept increasing.

Now, in the course of digitalization, we see requirement change rates of 20 percent and more. Conventional change management is completely overwhelmed. More and more applications can no longer be put through a lengthy test procedure before they are released. They have to be released earlier, even if they have not been tested exhaustively, and the remaining errors must be removed during production. The damage caused by these errors is less than the damage caused by delayed delivery. The main thing is that the delivered systems remain under the control of product management: the errors and other defects are all recorded and monitored. Sooner or later they will be corrected, but they must never be the reason for not delivering a release. Continuous delivery has priority.

Example of the use of Continuous Delivery

One application in which continuous delivery is practiced is the payroll system of the Austrian federal states of Upper Austria and Burgenland. The administrators in the HR department are also responsible for testing and delivery. The system, with a VisualAge front end and a PL/I back end, is being reimplemented piece by piece in Java. The old environment serves as the test environment at the same time.

For the past year and a half, the front end has been re-implemented component by component in Java. One component after the other is removed from the old system and rewritten in Java. It is then handed over to the responsible administrators, who compare the behavior of the new component with that of the old one.

The clerks test the new Java components alongside their daily work. If deviations occur, they are reported to the developers, who correct them immediately. Until then, the end users work with the old components. The current system is a mixture of new and old components.
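A comparison of this kind can be supported by a simple back-to-back test that runs the same inputs through both implementations and reports every difference; the salary functions and figures below are placeholders, not the real payroll logic.

```python
# Minimal sketch of a back-to-back comparison between the old and the new
# implementation of one component. The two functions stand in for calls to
# the legacy component and the rewritten Java component (for example via an
# export file or a service interface) and are purely illustrative.
def legacy_net_salary(gross: float) -> float:
    # placeholder for the result delivered by the old component
    return round(gross * 0.72, 2)

def new_net_salary(gross: float) -> float:
    # placeholder for the result delivered by the rewritten component
    return round(gross * 0.72, 2)

def compare(cases):
    """Run both implementations on the same inputs and collect deviations."""
    deviations = []
    for gross in cases:
        old, new = legacy_net_salary(gross), new_net_salary(gross)
        if old != new:
            deviations.append((gross, old, new))
    return deviations

# Any deviation is reported to the developers; until it is corrected,
# the clerks keep working with the old component.
for gross, old, new in compare([1800.00, 2500.00, 3999.99]):
    print(f"gross {gross}: old {old} != new {new}")
```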

By the beginning of 2017, a third of the use cases had been replaced in this way. All front-end components should be in Java by the end of 2018. Until then, the VisualAge and Java components will run side by side in the same environment. The continuity of the service is crucial.

Summary

The digitalization of companies means that their application systems are subject to constant change. They must be constantly developed further in order to keep pace with changes in the real world. Development, testing and delivery take place side by side. This requires different forms of organization. Not all applications are suitable for a DevOps organization - but for more and more front office applications, “time to market” has priority over quality. DevOps is the right solution for these applications.
