Getting started with IoT in 4 steps

Everyone talks about the Internet of Things (IoT) and the digital twin – they form the framework for new, digital business models. According to a forecast by PwC, digitization will bring the manufacturing industry an increase in turnover of more than 270 billion euros in Germany alone over the next four years.

Companies are hoping for sales growth through smart products and digital business models. This is also confirmed by our current IoT study, which was conducted jointly with the Fraunhofer Institute for Production Systems and Design Technology (IPK) and the Association of German Engineers (VDI). It shows that companies have high expectations, but at the same time makes it clear that there is still a certain reluctance when it comes to practical implementation. Many companies are faced with the question: “How does it actually work with IoT?”

In my experience, companies often think about the second step before taking the first one, which leads to hesitation. Of course, it is good and important to have a vision. But the scenarios often presented in blogs and forums are usually very sophisticated IoT setups. They do not start where many companies currently stand in terms of business model and technology know-how.

That’s why it’s important to gain your own experience and to approach new digital business models step by step, true to the motto “Think big, start small, act now!” Your own projects, also carried out together with technology partners, automatically expand your wealth of experience. So why not start by using the new technology to support the classic business?

In this post I would like to show how companies can realize an effective IoT scenario for their business in just four steps.

Step 1: The digital twin as a communication interface

The data needed for the digital twin is usually already available in the company. The starting point is a simple serial number: it serves as the documentation interface and links the data to the physical product; 3D data can be added later. Much of this information already exists in PLM or ERP systems – for example from production, purchasing or development – and should be brought together in a dashboard.
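To make this concrete, here is a minimal Python sketch of such a digital twin record, keyed by a serial number and aggregating data from PLM and ERP sources. The class, field names and values are invented for illustration and do not refer to any specific system:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DigitalTwin:
    """Minimal digital twin record keyed by serial number (illustrative only)."""
    serial_number: str
    plm_data: dict = field(default_factory=dict)   # e.g. revision, material from the PLM system
    erp_data: dict = field(default_factory=dict)   # e.g. supplier, order data from the ERP system
    cad_model: Optional[str] = None                # link to 3D data, added later

# Assemble the twin from data that usually already exists in the company
twin = DigitalTwin(
    serial_number="SN-4711",
    plm_data={"revision": "B", "material": "1.4301"},
    erp_data={"supplier": "ACME GmbH", "purchase_order": "PO-2018-0815"},
)
print(twin)
```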

Step 2: Generate data via sensors

Sensors, too, are often already in place, for example for controlling devices, machines and plants. They record states such as power, pressure or consumption. The next step is to record this data consistently and store it in a suitable way so that the current status can be viewed at any time. In addition, limit values are defined – for example for excessive current consumption – so that warnings can be sent and faults rectified.
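As a purely illustrative sketch of this step – the limit value, function names and message format are assumptions, not part of any specific product – a simple threshold check on incoming readings could look like this:

```python
# Illustrative threshold monitoring for sensor readings (limit value is an assumption)
CURRENT_LIMIT_A = 12.5  # maximum permissible current consumption in amperes

def store_reading(serial_number: str, value: float) -> None:
    print(f"stored {serial_number}: {value} A")      # stand-in for a time-series store

def send_warning(message: str) -> None:
    print("WARNING:", message)                       # stand-in for e-mail or push notification

def check_reading(serial_number: str, current_a: float) -> None:
    """Store the reading and raise a warning if the limit value is exceeded."""
    store_reading(serial_number, current_a)          # persist for the status dashboard
    if current_a > CURRENT_LIMIT_A:
        send_warning(f"{serial_number}: current {current_a} A exceeds limit of {CURRENT_LIMIT_A} A")

check_reading("SN-4711", 13.2)
```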

Step 3: Initiate smart maintenance work

A detailed picture of damage and wear can be derived from analyzing this data, and measures such as maintenance jobs can be initiated as early as possible. The digital twin again serves as the documentation interface, so all adjustments to the product remain traceable. This data history can later be used to develop predictions (predictive maintenance). A digital twin maintained in this way documents product changes, can link them with historical data and thus also shows in which configuration the product works best. The classic product lifecycle is thereby extended into the usage phase.
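A minimal sketch of such a data history – again with invented structure and values – could simply record each maintenance action against the serial number of the twin, together with the configuration in which the product is running:

```python
from datetime import datetime

# Illustrative maintenance history attached to a digital twin (structure is an assumption)
maintenance_history = {}

def log_maintenance(serial_number, action, configuration):
    """Record a maintenance action so that product changes remain traceable."""
    maintenance_history.setdefault(serial_number, []).append({
        "timestamp": datetime.now().isoformat(),
        "action": action,
        "configuration": configuration,   # e.g. parts revision or firmware in use
    })

log_maintenance("SN-4711", "replaced worn bearing", "rev B / firmware 2.3")
print(maintenance_history["SN-4711"])
```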

Step 4: Request spare parts

In addition, the information can be used to request spare parts. With the help of condensed service parts lists or spare parts catalogs, the data is assigned to the affected component, and the required spare part is delivered before the imminent damage occurs. This data, too, is already stored in ERP systems. The process can be triggered manually or automatically on the basis of the device messages. In this way, companies avoid downtime in their own production.
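Sketched in a few lines – the message codes and part numbers below are invented for illustration – the mapping from a device message to a spare part request might look like this:

```python
# Illustrative mapping from a device message to a spare part request
# (message codes and part numbers are invented for this sketch)
SERVICE_PARTS = {
    "motor_overcurrent": "part-1001",   # drive motor
    "seal_wear":         "part-2044",   # shaft seal
}

def handle_device_message(serial_number: str, message_code: str) -> None:
    """Look up the affected component and trigger a spare part request."""
    part_number = SERVICE_PARTS.get(message_code)
    if part_number is None:
        print(f"{serial_number}: no spare part mapped for '{message_code}'")
        return
    # Stand-in for creating a purchase requisition in the ERP system
    print(f"{serial_number}: requesting spare part {part_number} due to {message_code}")

handle_device_message("SN-4711", "motor_overcurrent")
```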

In these 4 simple steps, an efficient IoT scenario has been implemented and a big step towards a digital business model has been taken. I am sure that many companies will be able to get started with the new technology in this way.

So: Get started and use the experience gained for digital business models!

IoT failures and trust in technology

At the beginning of April this year I attended the building IoT in Cologne. At the conference, organized by heise developer, iX and the d.punkt publishing house, everything revolved around applications for the Internet of Things (IoT) and Industry 4.0, in lectures and an accompanying exhibition. Together with my colleague Yang Zhong, I gave a talk on modern user experience (UX) concepts for IoT solutions.

At the end of our presentation, which showed a user’s work processes, from the data acquisition of a real “Thing” to the visualization of live data in the dashboard using a Digital Twin, there was a very stimulating discussion. Two points were particularly interesting here:

  • In many application areas, the topic of customer journeys is high on the agenda – which confirms the current trend.  
  • It is essential to develop software for users – on this point, too, there was consensus.

The evening was dedicated to Industrial IoT. As moderator, I hosted a discussion with representatives from various manufacturing and software companies, such as Miele, Dürr Dental, Codecentric and akquinet. An intensive discussion developed around the prevailing topics of Industry 4.0. In addition to the choice of control electronics or the wireless standard, these include the question of whether an IoT solution should be operated in the cloud. The arguments for cloud solutions are, of course, convenience and the relatively efficient, simple scalability with regard to the number of “things” to be managed. Running the software on your own servers (on-premises), on the other hand, means that confidential product and customer data really does not leave your premises. The discussion confirmed my assessment that both approaches have their advantages in practice and are used accordingly.

One of my personal highlights at this year’s building IoT was a negative hit list of IoT products, so-called IoT failures: products with massive security gaps, such as open data interfaces. Some “classic” vulnerabilities were already known, such as unchanged default passwords that allow data misuse. Other gaps really surprised me, such as a smoke detector from a well-known brand that comes equipped with a microphone (?!) as standard, which in turn allows unwanted eavesdropping in living rooms.

Why is there a microphone in a smoke detector? We cannot say for sure, but it is certainly not in the customer’s interest, and it causes a massive loss of trust in technology. And that is precisely the point: acceptance of new technologies requires trust. And this trust is becoming more and more important as digitalization advances.

20 years of PLM: Why do many still doubt the benefits?

By now I can look back on several years of consulting in Product Lifecycle Management – a topic whose popularity has fluctuated considerably over the years and which is currently on the rise again in the wake of the digital transformation.

Despite the renewed attention for PLM, I notice that the term still carries the aftertaste of something large, cumbersome, tedious and uneconomical. Astonishing, because the effort that many companies put into ERP projects, for example, was and is significantly higher in most cases. Yet the necessity and benefits of – expensive – ERP projects are discussed, but rarely questioned; see Haribo and Lidl.

How do these different perceptions come about? One explanation could be that the benefits of PLM for management and employees in companies have not been sufficiently exploited over the years. This was mainly due to the fact that the scope and visibility of PLM projects in companies was often very limited.

A closer look shows that many of the earlier PLM implementations were in fact PDM implementations. PDM, Product Data Management, focuses on product-describing data, primarily CAD models and drawings. “PLM” was therefore limited to the core areas of product development, very often even to mechanical design only. Although available in some PLM solutions for years, change management, document management, project management, cross-departmental collaboration and communication with external parties were simply not used. Instead, home-grown solutions based on Excel, Outlook, the file system or SharePoint were often created – tools that everyone in the company knows, and for which it is easy to find someone to “optimize” them with a bit of macro programming. On top of that, the negative attitude towards PLM was certainly fuelled by the overloaded, highly condensed “engineering user interfaces” of the 1st and 2nd PLM product generations.

So it is no surprise that PLM was seen within companies as an expensive, exotic application of limited use!

In the current PLM renaissance, companies have every opportunity to learn from the deficits of the past and to exploit the impressive potential of Product Lifecycle Management. Many obsolete and discontinued PDM and PLM solutions are currently being replaced, or will soon be replaced, by modern 3rd-generation PLM platforms that also support the use cases around the Digital Twin and the Internet of Things. These platforms breathe life into the PLM idea by effectively and efficiently supporting processes across phases, departments and company boundaries. New, web-based HTML5 user interfaces significantly increase acceptance among all user groups in the company by making even complex relationships clearer and more efficient to handle.

Now there is a chance to realize “real” Product Lifecycle Management! Against the background of new, digital business models, which put the use phase of products much more in the foreground, this becomes all the more important. PLM solutions play a central role here, as they lay the foundation for data relating to the Digital Twin.

But in the end, hard facts also count when it comes to benefits and ROI: if PLM is actually used company-wide with all its possibilities, considerable economies of scale quickly result from the significant reduction of non-value-adding activities. This alone often enables a return on investment after just one year – quite apart from the additional revenue potential of new, data-driven business models that PLM will enable in the future.

Are data science platforms a good idea?

Loosely following Karl Valentin: platforms are beautiful and take a lot of work off your hands. The idea of platforms for automated data analysis therefore comes at just the right time. Fittingly, Gartner has now published a “Magic Quadrant for Data Science and Machine Learning Platforms”. The document itself sits behind a paywall, but on the web some of the companies mentioned in the report offer access to it in return for entering your address.

Gartner particularly emphasizes that such a platform should provide everything you need from a single source, unlike various individual components that are not directly coordinated with each other.

Sounds good to me! However, data science is not an area where you can magically get ahead with a tool or even a platform. The development of solutions – for example, for predictive maintenance of the machines a company offers – goes through various phases, with cleaning/wrangling and preprocessing accounting for most of the work. This is where ETL (Extract, Transform, Load) and visualization tools such as Tableau belong. And outside the comfort zone of platforms that managers like to imagine, database queries and scripts for transformation and aggregation in Python or R are simply the means of choice. A look at data science online tutorials from top providers like Coursera underlines the importance of these – well – down-to-earth tools. “Statistical analysis, Python programming with NumPy, pandas, matplotlib, and Seaborn, advanced statistical analysis, Tableau, machine learning with statsmodels and scikit-learn, deep learning with TensorFlow” is how one of Udemy’s course programs reads.
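To illustrate how much of this is plain wrangling, here is a small pandas sketch – the file name, column names and plausibility limits are assumptions – that cleans raw sensor readings and aggregates them per machine and day:

```python
import pandas as pd

# Illustrative cleaning and aggregation step (file and column names are assumptions)
readings = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])

readings = (
    readings
    .dropna(subset=["machine_id", "current_a"])           # drop incomplete rows
    .query("current_a >= 0 and current_a <= 100")         # discard implausible values
)

# Aggregate to one row per machine and day, e.g. for a dashboard or a simple baseline
daily = (
    readings
    .groupby(["machine_id", readings["timestamp"].dt.date])["current_a"]
    .agg(["mean", "max", "count"])
    .reset_index()
)
print(daily.head())
```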

In addition, the projects often get stuck in this preliminary stage or are cancelled. There are many reasons for this:

  • no analytical/statistical approach can be found
  • the original idea proves to be unfeasible
  • the data is not available in the quantity or quality you need
  • simple analyses and visualizations are enough and everything else would be “oversized”.

This is no big deal; it only means that the automated use of machine learning and AI does not turn every data set into a data treasure. If, however, a productive benefit becomes apparent, it is necessary to prepare for the production pipeline and for time and resource constraints. Usually you then start from scratch and rebuild everything, e.g. in TensorFlow for neural networks or in custom libraries.

The misunderstanding is that a) data science can be carried seamlessly all the way to productive use and b) a one-stop shop for data science (here: a “platform”) is needed that does everything in one go. Neither will ever be the case.

This is really good news, because it means that organizations can achieve their first goals without having to resort to large platforms. The reasonably careful selection of suitable tools (many of them open source) helps to achieve this.

Also interesting:
In my video “AI Needs Strategy” I explain which steps companies can take to use AI technology successfully.



The 14 Top Success Patterns of Digital Business Models

Let’s get digital – the Internet of Things (IoT) has a profound influence on the relationship between companies and their customers. Companies now face the challenge of positioning attractive digital offerings so as not to fall behind. The white paper describes the central mechanisms of digital offerings and identifies the 14 most important patterns and blueprints for IoT-driven business models.

Market pressure and new terrain. Markets are becoming digital and smart. There is hardly an industry or offering that is not networked and/or in the cloud – or so it seems. This is undoubtedly a trend that is massively promoted by market-shaping players, especially from Silicon Valley. All of us are shaped today by the use of smartphones and home automation solutions, and we transfer the corresponding expectations to other areas as well. The question is no longer “whether”, but “how”. According to McKinsey, the sales potential for digitized products in the B2B environment is no less than twice as high as in the B2C sector! Certainly, some phenomena on the market can be written off as hype. It is just as certain, however, that concrete developments and sometimes existential challenges are emerging in supposedly firmly established markets:

  • Innovative and established competitors position an offering as “first movers” and attract the attention of customers for whom digitization is not yet an issue.
  • New players are breaking into existing markets and placing previously unknown offerings on the basis of digitized services.
  • Previously specialized providers (non-core or secondary-service providers) are expanding their offerings digitally and thus attacking the providers in the core market.

The Internet of Things (“IoT”) as a vehicle for digitized product offerings is virtually universal and knows no industry or process boundaries. According to Gartner, this is reflected in “ambitious IoT plans” in a wide variety of industries. Many companies are therefore being forced to confront the potential erosion of their markets by new suppliers.

The challenge lies not only in the high market dynamics, but also in the technical and sales-related challenges of partly unknown territory. Many companies, especially medium-sized ones, lack software know-how, particularly where it goes beyond the embedded domain – for example, networked and distributed product architectures or analytics.

Another complicating factor is that suitable personnel is practically unavailable on the market today. Moreover, it is not only about recruiting new employees, but also about building up new business areas. To remain able to act, companies must therefore invest in completely new alliances and partner models.

The following white paper focuses on the second area, the improvement of customer service, and uses the term “IoT” for it. The analysis of IoT projects shows that the majority of projects are based on expanding a market position in existing markets, i.e. extending the existing product range. Only a few companies venture into new markets. In other words, companies generally take a very cautious approach to new business options and try to avoid risks.

Continue reading “The 14 Top Success Patterns of Digital Business Models”

Does PLM have a future, and what does it look like?

In my last blog post I tried to explain why it is so important from the user’s perspective to free PLM from its niche existence in engineering. But it is just as important from the vendor’s perspective if PLM providers want to hold their own in the market in the long run. Oleg Shilovitsky pointed this out a few months ago in a guest blog that I have only just discovered. Continue reading “Does PLM have a future, and what does it look like?”

Understanding Data Science – revolutionary potential from four megatrends!

We are on the threshold of a new era, because several currents are converging to create a unique environment. Much (some would say: everything) is becoming digital. As a result, interest in data analysis and exploration – in short, data science – has grown enormously. Data science is the point of convergence of four megatrends that have dominated recent years and will continue to dominate the years to come: cloud computing, IoT, big data and algorithmic analysis.

What are the reasons for this convergence of different currents, and thus for a new, unique environment?

  1. For the first time in the history of artificial intelligence, which began as a discipline in the 1950s, the necessary computing power is available at low cost to solve practical problems with algorithms that have been around for quite some time.
  2. The algorithms for machine learning have improved considerably and can now be applied to practical problems with reasonable effort.
  3. The popularity of data science helps to carry its methods out of academic circles and into the mainstream, so that a large community eager to experiment drives rapid further development.
  4. Today, above all through the internet, social networks and the large shopping platforms, there is a treasure trove of data on an unprecedented scale waiting to be analyzed.
  5. The Internet of Things will provide further data streams that lead to new business models, which are opened up with the help of data science.

These factors have helped to establish data science as a scientific discipline in its own right and as a complement to classical statistics. Data scientists, with their skills in programming, statistics and modern algorithms, bring the expertise required to put today’s possibilities of data analysis to profitable use. The various data science techniques can be roughly categorized by algorithmic approach or by intended purpose as follows (a small code sketch follows after the list):

  • Regression
  • Classification
  • Anomaly detection
  • Clustering
  • Reinforcement learning

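Purely as an illustration of the classification category from the list above – the data set and model choice are arbitrary – a few lines of Python with scikit-learn are enough for a first experiment:

```python
# Minimal classification example with scikit-learn (illustrative only)
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)                  # small built-in example data set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```
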
On one side of the existing software landscape there are already very specific solutions for well-defined problems, for example in retail or in the financial industry. At the other end of the spectrum are the vendors of software packages that provide a coordinated toolset for data science specialists.

Most of these solutions are based on open source software. In the toolset segment, two languages dominate the market above all: R and Python. Python has become the standard language for data scientists, especially in the field of machine learning.

The enormous investments and the growing revenues of large commodity platforms such as Amazon, Microsoft and Google show that the megatrends cloud computing, IoT, big data and algorithmic analysis already shape business processes today, or will do so in the near future, right down to the last corner. For companies that want to take a closer look at this topic, CONTACT has published a new Data Science white paper, which can be downloaded here.