What is Quantum Computing good for?

When it comes to quantum computing (QC), the quite real breakthroughs in hardware and some spectacular announcements under titles like “Quantum Supremacy” have set the usual hype cycle in motion, complete with a phase of vague and exaggerated expectations. I would like to briefly outline here why such enormous effort is being invested in this area and what realistic expectations lie behind it.

To understand the fundamental differences between QC and Classical Computing (CC), we first need to take a step back and ask on what basis each computing paradigm operates. For CC, the basis is the universal Turing machine, expressed in the ubiquitous von Neumann architecture. This may sound a bit outlandish, but in principle it is easy to understand: a universal Turing machine captures the idea that any problem which can be expressed as a (classical) algorithm (the Turing machine) can be programmed into one general-purpose computer (universal).

The vast majority of “algorithms” implemented in practice are simple sequences of actions that react to external events such as mouse clicks on a web page, transactions in a web store, or messages from other computers in the network. A very small, but important, number of programs do what is generally associated with the word algorithm: they perform arithmetic operations to solve a mathematical problem. The Turing machine is the well-suited mental model for programming these problems, and it leads to programming languages having the constructs we are used to: loops, branches, elementary arithmetic operations, and so on.
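To make this concrete, here is a textbook example of an algorithm in exactly this Turing-machine spirit. Euclid's gcd uses nothing but the constructs just named: a loop, a branch (the loop condition), and elementary arithmetic.

```python
def gcd(a, b):
    """Greatest common divisor via Euclid's algorithm."""
    while b != 0:           # loop with a branch condition
        a, b = b, a % b     # elementary arithmetic (remainder)
    return a

print(gcd(48, 18))  # -> 6
```

Nothing more than these building blocks is needed; that is precisely what makes the Turing machine the natural model for classical programming.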

What is the computing paradigm for a quantum computer?

A quantum computer is built from quantum states that can be entangled with each other and evolved via quantum gates. This also sounds a bit off the wall, but it simply means that a quantum computer is prepared in an initial (quantum) state that evolves in time and is measured at the end. The paradigm for a quantum computer is therefore the Schrödinger equation, the fundamental equation of quantum mechanics. Even without understanding the details, it should be clear that everyday problems are difficult to squeeze into the formalism of quantum mechanics, and that this effort is unlikely to pay off: quantum mechanics is simply not the natural mental model for most (“everyday”) problems, and it is not more efficient at solving them either.
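The “prepare, evolve, measure” paradigm can be sketched classically in a few lines. The following is a minimal NumPy simulation of a single qubit, not code for a real quantum device: the state is a vector, a gate is a unitary matrix, and measurement probabilities follow from the Born rule.

```python
import numpy as np

# Prepare: the qubit starts in the basis state |0>.
ket0 = np.array([1.0, 0.0])

# Evolve: apply a Hadamard gate (a unitary matrix) to the state.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
state = H @ ket0

# Measure: the Born rule gives the probability of each outcome.
probs = np.abs(state) ** 2
print(probs)  # -> [0.5 0.5]: equal chance of measuring 0 or 1
```

Simulating n qubits this way requires vectors of length 2**n, which is exactly the exponential blow-up that makes quantum systems so expensive for classical computers.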

So what can you do with it?

The answer is very simple: QC is essentially a method for computing quantum systems. This sounds redundant, but it means that a quantum computer is a universal machine for calculating quantum systems. This vision, formulated by Richard Feynman back in 1981, still guides the logic of research today. It is therefore not surprising that publications dealing with applications are located either in quantum chemistry or in basic physics research [5][6].

Why does this matter?

Because the classical computer is very inefficient at calculating or simulating quantum systems. This inefficiency stems from the mathematical structure of quantum mechanics itself and will not be overcome by classical algorithms, no matter how good they are. In addition to basic research, QC is likely to become important for the hardware of classical computers, where miniaturization is pushing the limits of designing transistors on chips using classical electrical theory.

Besides, there are many interesting connections to number theory and various other problems, which so far can be classified as interesting curiosities. Based on current knowledge, the connection to number theory alone could have a significant impact: for historical reasons, almost all practical asymmetric encryption schemes rely on the assumption (there is no proof) that prime factorization cannot be solved efficiently with classical algorithms. Quantum computers can do this in principle, but the hardware is still far from being able to do so.

Lasting communication instead of talk and forget

Today I had a task to complete that I had long postponed due to other pressing matters. The team urged me to finally finish my part so that we could complete the implementation of a function in the software. The logic to be clarified was rather demanding and I struggled to get back into it. My to-do card on the task board was vague and my recollection of the initial meeting had already faded. So I started researching and found an email that refreshed my memory.

Overabundant communication channels lead to information loss

Oh, good old email. What a surprise, since I have much more modern communication tools at hand. Our team is spread all over Germany, and we were accustomed to working remotely even before the corona pandemic. This works just fine for us, with the help of agile rituals such as the daily standup meeting and communication via video conferencing and chat. As is well known, successful communication is essential for the success of a project.

It is also convenient that notes and documents can directly be shared within the online meeting. The catch is, however, that the chat often ends up containing decisions and technical information. Hence, the information I was looking for could just as well have been found there. Apart from email and chat, a surprising number of companies also have a third potential location for finding information: network drives, where project documents are stored in a more or less structured manner.

Communication in context means finding instead of searching

Have fun searching! Luckily, we at CONTACT have it much easier. We use our own software for project management, which provides us with excellent tools to do things better. In addition to the project management functionality, these include document management and a communication functionality called Activity Stream.

Posts in the Activity Stream – and this is the key point – can always be assigned to an object: for example, to a project, a task, or an open item. Or, in the case of our customers, to product data such as a CAD model, a bill of materials, or simulation data. This links project and product data to the relevant communication activities. For one thing, this allows us to search and find information in one single tool. In addition, because the object serves as an anchor point for the associated communication, all context-relevant information is automatically displayed when the object is called up.

Enrich objects with information en passant

Back to my case: To clean up the mess, I attached a document with my solution to the completed task. Along with it, I added the email that helped me do it. I also created and linked a new task for implementation by my colleagues, then wrote a summarizing Activity Stream post and shared it with them.

Now, I have brought together what belongs together. Even if a team member unfamiliar with the project history takes over the implementation, all information is immediately at hand. He or she can ask a question via the Activity Stream, without having to explain the context in an email or chat first. If I had used the Activity Stream to communicate within the task’s context from the beginning, all relevant information would have been assembled there. And I would have saved myself the trouble of researching and combining it.

Changing habits pays off

So, what do we learn from this? Firstly: A project management system with document management and context-related communication à la Activity Stream improves collaboration enormously. Secondly: It takes some discipline not to fall back to other tools at the first opportunity – as I did. But it saves a lot of work later on.

Time and time again, I see customers hesitating to switch from email to this type of contextual communication. My simple advice: Have the courage! Provide your employees with an appropriate tool and advocate for it. It may be unfamiliar at first, and it takes some time to gain widespread acceptance. But it is worth it. For the entire organization as well as the individual employee!

Developer Experience – from intuitive to complex

It sounds like an exciting vision of the future: users from every discipline can use ready-made program modules to quickly and easily create simulations, optimization tasks, or analyses using artificial intelligence (AI). Even departments whose employees have no knowledge of a high-level programming language can then implement such solutions. That’s the idea. Of course, developers must first create these program modules so that business users can assemble a solution that meets their requirements.

AI-powered analytics for the business department

Together with our partners, we are conducting research in the AI marketplace project to get closer to this vision. The project’s eponymous goal is to develop AI applications for the product development process and offer them on a central trading platform. The range will include not only ready-made AI-supported apps and program blocks for very specific tasks, but also services such as seminars on selected AI topics and contract development. The development and reuse of the apps are currently being tested, while the project team evaluates the benefits and quality of the results.

Different programming levels for extended use

So that’s the state of research, but how exactly do we at CONTACT support the development of reusable program modules, the integration of simulation models, or AI-supported analysis methods? One example of practical application can be found in the area of predictive maintenance. Predictive maintenance means that maintenance no longer takes place at fixed intervals, but is scheduled based on operating data and events at the machine or plant. For such use cases, our Elements for IoT platform provides a solution to analyze operating data directly. The digital twin stores the data of the machine or plant in a unique context. This data can be retrieved directly and easily analyzed using block-based programming. With the no-code functionality of the IoT platform, departments can intuitively create digital twins, define automatic rules, monitor events, and create diagrams and dashboards – without writing a line of code.

In addition, there are applications around the Digital Twin that require more programming expertise. For these, the platform offers analysts the possibility to develop their models themselves in a high-level programming language using a Jupyter Notebook or other analysis tools. Especially in the area of prototyping, Python is the language of choice. However, it is also possible to work with a compiled language such as C++. Continuous calculation of the predictions is then achieved by automating the models, which are available in a runtime environment. The code is executed either in the company’s own IT infrastructure or directly at the plant or machine in the field (edge).
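As an illustration of the kind of model one might prototype in such a notebook, here is a deliberately simple, hypothetical predictive-maintenance check: flag a sensor reading as anomalous when it deviates strongly from the rolling mean of recent values. The sensor data, window size, and threshold are invented for this sketch and do not reflect any specific CONTACT Elements for IoT API.

```python
import statistics

def flag_anomalies(readings, window=5, k=3.0):
    """Return indices of readings deviating > k sigma from the preceding window."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu = statistics.mean(recent)
        sigma = statistics.stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# Made-up temperature readings with one sudden spike.
temps = [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 85.0, 70.2]
print(flag_anomalies(temps))  # -> [6]: the spike is flagged
```

In the low-code workflow described here, a snippet like this would be wrapped as a program block, fed by the Digital Twin's data connection, and run continuously in the runtime environment.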

We call this procedure low-code development, because only the code for developing the models is written. The data connection is made via the Digital Twin and is handled through configuration. The piece of program code can then be reused as a program block for various applications, such as digital twins within a fleet.

CONTACT Elements for IoT is thus open to interaction at different levels: from the use of predefined building blocks (no-code), to working with self-written program code (low-code), to defining your own business objects and extending the platform based on Python.