What is Quantum Computing good for?

When it comes to quantum computing (QC), after some very real breakthroughs in hardware and spectacular announcements under headlines like “Quantum Supremacy”, the usual hype cycle has set in, complete with a phase of vague and exaggerated expectations. I would like to briefly outline why the enormous effort in this area is being made and what realistic expectations lie behind it.

To understand the fundamental differences between QC and classical computing (CC), we first need to take a step back and ask what basis each computing paradigm operates on. For CC, the basis is the universal Turing machine, expressed in the ubiquitous von Neumann architecture. This may sound a bit outlandish, but in principle it is easy to understand: a universal Turing machine captures the idea that any algorithm that is (classically) expressible step by step (Turing machine) can be programmed into one and the same classical computer (universal).

The vast majority of “algorithms” implemented in practice are simple sequences of actions that react to external events such as mouse clicks on a web page, transactions in a web store, or messages from other computers on the network. A very small but important number of programs do what is generally associated with the word algorithm: they perform arithmetic operations to solve a mathematical problem. The Turing machine is the natural mental model for programming such problems, and it is the reason programming languages have the constructs we are used to: loops, branches, elementary arithmetic operations, and so on.

What is the computing paradigm for a quantum computer?

A quantum computer is built from quantum states that can be entangled with each other and evolved via quantum gates. This also sounds a bit off the wall, but it simply means that a quantum computer is prepared in an initial (quantum) state, which evolves in time and is measured at the end. The paradigm for a quantum computer is therefore the Schrödinger equation, the fundamental equation of quantum mechanics. Even without understanding the details, it should be clear that everyday problems are difficult to squeeze into the formalism of quantum mechanics, and that this effort probably brings no profit: quantum mechanics is simply not the appropriate mental model for most (“everyday”) problems, nor is it more efficient at solving them.
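To make the paradigm concrete, the gate-based picture can be sketched classically. The minimal example below (plain NumPy on a laptop, not a real quantum device) prepares a single qubit in the state |0⟩, evolves it with a Hadamard gate, and reads off the measurement probabilities:

```python
import numpy as np

# A qubit state is a normalized complex vector; |0> = (1, 0).
state = np.array([1.0, 0.0], dtype=complex)

# A quantum gate is a unitary matrix. The Hadamard gate puts the
# qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# "Evolution" of the state is matrix-vector multiplication.
state = H @ state

# Measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```

The point of the sketch is only the structure of the paradigm: prepare a state, apply unitary evolution, measure, exactly as the Schrödinger equation prescribes.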

So what can you do with it?

The answer is very simple: QC is essentially a method for quantum computing. This sounds redundant, but it means that a quantum computer is a universal machine for calculating quantum systems. This vision, formulated by Richard Feynman back in 1981, still guides the logic of research today. It is therefore not surprising that publications dealing with applications are located either in quantum chemistry or in basic physics research [5][6].

Why does this matter?

Because the classical computer is very inefficient at calculating or simulating quantum systems. This inefficiency stems from the mathematical structure of quantum mechanics itself and will not be overcome by classical algorithms, no matter how good they are. Beyond basic research, QC is also likely to become important for the hardware of classical computers, where miniaturization is pushing transistor design on chips to the limits of classical theories of electricity.
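This inefficiency can be made tangible with a back-of-the-envelope calculation: the full state of n entangled qubits requires 2^n complex amplitudes, so classical memory demands explode with system size.

```python
# Memory needed to store the full state vector of n qubits:
# 2**n complex amplitudes at 16 bytes (two 64-bit floats) each.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

# 30 qubits already need 16 GiB; 50 qubits would need 16 PiB.
for n in (10, 30, 50):
    print(n, "qubits:", statevector_bytes(n), "bytes")
```

A quantum computer with n physical qubits holds that state natively, which is precisely Feynman's point.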

Besides this, there are many interesting connections to number theory and various other problems, which so far can be classified as interesting curiosities. Based on current knowledge, the connection to number theory alone could have a significant impact: for historical reasons, almost all practical asymmetric encryption schemes rely on algorithms that essentially assume (there is no proof) that prime factorization cannot be solved efficiently with classical algorithms. Quantum computers can do this efficiently in principle, via Shor's algorithm, but are still far away from being able to do so in terms of hardware.
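To illustrate the classical side of this assumption, the naive factoring method below needs on the order of √n division attempts, which is exponential in the number of digits of n; Shor's algorithm, by contrast, would factor in polynomial time on a sufficiently large quantum computer. The code is a didactic sketch, not a serious factoring routine:

```python
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n by trial division.
    Worst case ~sqrt(n) steps, i.e. exponential in the digit count
    of n -- the hardness that asymmetric cryptography relies on."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

print(trial_division(3 * 5))      # 3
print(trial_division(101 * 103))  # 101
```

For the 2048-bit moduli used in practice, this brute-force approach (and every known classical refinement of it) is hopeless, which is exactly why a working large-scale quantum computer would matter here.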

Developer Experience – from intuitive to complex

It sounds like an exciting vision of the future: users from every discipline can use ready-made program modules to quickly and easily create simulations, optimization tasks, or analyses using artificial intelligence (AI). Even departments whose employees have no knowledge of a high-level programming language can then implement these. That's the idea. Of course, developers must first create these program modules so that business users can assemble a solution that meets their requirements.

AI-powered analytics for the business department

Together with our partners, we are conducting research in the AI Marketplace project to bring this vision closer. The project's eponymous goal is to develop AI applications for the product development process and offer them on a central trading platform. The range will include services such as seminars on selected AI topics and contract development, as well as ready-made AI-supported apps and program blocks for very specific tasks. The development and reuse of the apps are currently being tested, while the project team evaluates the benefits and the quality of the results.

Different programming levels for extended use

So that’s the state of research, but how exactly do we at CONTACT support the development of reusable program modules, the integration of simulation models, or AI-supported analysis methods? One practical example can be found in the area of predictive maintenance. Predictive maintenance means that maintenance is no longer scheduled at fixed intervals, as before, but is calculated from operating data and events at the machine or plant. For such use cases, our Elements for IoT platform provides a solution to analyze operating data directly. The Digital Twin stores the data of the machine or plant in a unique context. This data can be retrieved directly and easily analyzed using block-based programming. With the no-code functionality of the IoT platform, departments can intuitively create Digital Twins, define automatic rules, monitor events, and create diagrams and dashboards – without writing a line of code.

In addition, there are applications around the Digital Twin that require more programming expertise. For these, the platform offers analysts the possibility to develop their models themselves in a high-level programming language using a Jupyter Notebook or other analysis tools. Especially for prototyping, Python is the language of choice, but it is also possible to work with a compiled language such as C++. Continuous calculation of the predictions is then achieved by automating the models, which are made available in a runtime environment. The code is executed either in the company’s own IT infrastructure or directly at the plant or machine in the field (edge).
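To give a flavor of what such a model prototype might look like in a notebook, here is a deliberately simple anomaly detector in pure Python (standard library only). It is a hypothetical stand-in, not code from the platform: a real predictive-maintenance model would be trained on the plant's actual operating data.

```python
from statistics import mean, stdev

def anomaly_flags(readings, window=5, k=3.0):
    """Flag readings that deviate more than k standard deviations
    from the rolling statistics of the previous `window` values.
    Illustrative only -- real models are fitted to plant data."""
    flags = []
    for i, x in enumerate(readings):
        if i < window:
            flags.append(False)  # not enough history yet
            continue
        past = readings[i - window:i]
        m, s = mean(past), stdev(past)
        flags.append(s > 0 and abs(x - m) > k * s)
    return flags

# Hypothetical bearing-temperature readings with one outlier:
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 85.7, 70.2]
print(anomaly_flags(temps))
```

A detector like this would then be automated in the runtime environment, with the data connection to the machine provided by the Digital Twin rather than hard-coded.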

We call this procedure low-code development, because only the code for developing the models is written; the data connection is made via the Digital Twin and is done purely by configuration. The piece of program code can then be reused as a program block for various applications, such as the Digital Twins within a fleet.

CONTACT Elements for IoT is thus open to interactions at different levels: from the use of predefined building blocks (no-code), to the possibility of interacting with self-written program code (low-code), to the definition of own business objects and the extension of the platform based on Python.

The Digital Twin at the Center of Renewable Energy

According to the German Wind Energy Association (BWE), wind energy accounts for 27 percent of German electricity production this year, and in 2020 it was even the most important energy source in the German electricity mix. In total, more than 31,000 turbines have been installed, saving 89 million tons of CO2 equivalent in 2019. Wind power is thus a mainstay of low-carbon, sustainable energy generation and makes an important contribution to the energy transition. Further increasing yields while reducing maintenance costs is therefore of great importance.

Increasing the efficiency of wind farms with smart systems

Digital Twins are the central element in exploiting the full potential of wind power and maximizing yields. Driven by the vision of creating a data-based development tool for the wind industry, the WIND IO joint project, funded by the German Federal Ministry for Economic Affairs and Energy, started a year and a half ago.

Under the leadership of the Institute for Integrated Product Development (BIK) at the University of Bremen, we are working with several consortium partners to build research facilities as cyber-physical systems and retrofit them with sensors, electronics, and computers known as IoT gateways. This makes it possible to digitally map all the operating information of the real plant and combine it in a Digital Twin. The operating behavior can be simulated on the basis of the Digital Twin, which in turn yields insights for further optimization of the wind turbine. The Digital Twin not only provides information about the current energy yield, but also offers a comprehensive overall picture of the condition of each individual turbine.

Improved installation, maintenance and overhaul processes

The information obtained can be used, for example, to optimize maintenance and overhaul processes. The data makes the aging process of components transparent at all times and automatically triggers an alarm if defined limit parameters are exceeded. The Digital Twin also uses the collected operating, environmental, and weather data to determine a favorable time for maintenance of the plant. Ideally, maintenance takes place when there is little wind, so that it does not come at the expense of energy generation.

Both statistical methods and Artificial Intelligence (AI) models are used for the calculations. These methods also help to determine the best time to assemble a wind turbine, since the rotor blades can only be installed under certain conditions. For this purpose, in addition to weather data, additional parameters such as the vibration of the tower are included in the calculations.
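The scheduling idea behind this can be sketched in a few lines: given an hourly wind forecast, pick the contiguous window with the lowest average wind speed. This is a strongly simplified illustration with made-up numbers; the models in the project additionally weigh operating data and, for assembly, parameters such as tower vibration.

```python
def best_maintenance_window(wind_forecast, hours_needed):
    """Return the start index of the contiguous window of
    `hours_needed` hours with the lowest average forecast wind
    speed (m/s). Simplified scheduling sketch."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(wind_forecast) - hours_needed + 1):
        window = wind_forecast[start:start + hours_needed]
        avg = sum(window) / hours_needed
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Hypothetical hourly forecast in m/s; the calmest 3-hour
# window starts at hour 2.
forecast = [9.5, 8.1, 4.2, 3.0, 3.4, 7.8, 10.2, 11.0]
print(best_maintenance_window(forecast, 3))  # 2
```

In practice the objective would be multi-criteria: lost energy yield, crew availability, and safety limits such as the maximum wind speed for blade installation would all enter the calculation.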

Digital Twins for a sustainable industry 

The WIND IO project vividly demonstrates the potential of digitalization, and especially of the Digital Twin concept. Beyond wind energy, companies can use their data to simulate entire production and operating cycles. This makes it possible to minimize resource consumption, reduce energy consumption, and at the same time coordinate production steps more effectively and optimize transport routes. Concepts such as the Digital Twin and data-intensive analysis methods are thus essential for a resource-friendly and efficient industry.