Getting started with IoT in 4 steps

Everyone talks about the Internet of Things (IoT) and the digital twin – they form the framework for new, digital business models. According to a forecast by PwC, digitization will bring the manufacturing industry an increase in turnover of more than 270 billion euros in Germany alone over the next four years.

Companies are hoping for sales growth through smart products and digital business models. This is also confirmed by our current IoT study, conducted jointly with the Fraunhofer Institute for Production Systems and Design Technology (IPK) and the Association of German Engineers (VDI). It shows that companies have high expectations, but at the same time makes it clear that there is still a certain reluctance when it comes to putting these ideas into practice. Many companies are faced with the question: “How does it actually work with IoT?”

In my experience, companies often think about the second step before taking the first, which leads to hesitation. Of course, it is good and important to have a vision. But the scenarios frequently presented in blogs and forums are usually very sophisticated and do not start where many companies currently stand in terms of their business model and technology know-how.

That’s why it’s important to gain your own experience and approach new digital business models step by step, true to the motto “Think big, start small, act now!”. Your own projects, ideally together with technology partners, automatically broaden that experience. So why not start by using the new technology to support your classic business?

With this post I would like to show how companies can realize an effective IoT scenario for their business in just 4 steps.

Step 1: The digital twin as a communication interface

The data needed for the digital twin is usually already available in the company. The starting point is a simple serial number: it serves as a documentation interface and links the data to the physical product; 3D data can be added later. Much of this data already exists in PLM or ERP systems – from production, purchasing or development, for example – and should be brought together in a dashboard.
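To make this a little more concrete, here is a minimal Python sketch of what such a twin record could look like, keyed by the serial number and fed with attributes from PLM and ERP. All class and field names are my own illustrative choices and are not tied to any particular PLM or ERP product.

```python
# Minimal sketch of a digital twin record keyed by serial number.
# All names and attributes are hypothetical placeholders for whatever
# PLM/ERP data is actually available in your company.
from dataclasses import dataclass, field


@dataclass
class DigitalTwin:
    serial_number: str                                 # documentation interface to the product
    plm_data: dict = field(default_factory=dict)       # e.g. development / design attributes
    erp_data: dict = field(default_factory=dict)       # e.g. production and purchasing data
    sensor_data: list = field(default_factory=list)    # filled in step 2

    def dashboard_view(self) -> dict:
        """Flatten the twin into a simple structure a dashboard can display."""
        return {
            "serial_number": self.serial_number,
            **self.plm_data,
            **self.erp_data,
            "last_readings": self.sensor_data[-5:],     # most recent sensor values
        }


# Usage: combine data that usually already exists in PLM and ERP systems.
twin = DigitalTwin(
    serial_number="SN-4711",
    plm_data={"product": "Pump X200", "revision": "B"},
    erp_data={"production_date": "2018-11-02", "supplier": "ACME"},
)
print(twin.dashboard_view())
```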

Step 2: Generate data via sensors

Sensors are also often already in place, for example for controlling devices, machines and systems. They record states such as power, pressure or consumption. The next step is to record this data continuously and store it in a suitable form so that the current status can be viewed at any time. In addition, limit values are defined – for excessive current consumption, for example – so that warnings can be sent and faults rectified.
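As a rough illustration, the following Python sketch records readings and checks them against a defined limit. The limit value, the reading source and the send_warning() stub are assumptions made for the example, not prescriptions.

```python
# Minimal sketch of step 2: continuously record sensor readings and check
# them against a defined limit value. The limit (12.0 A) and the warning
# mechanism are illustrative assumptions.
import time

CURRENT_LIMIT_AMPS = 12.0   # limit value for excessive current consumption
readings: list[dict] = []   # in practice a database or time-series store


def send_warning(serial_number: str, value: float) -> None:
    # Could be an e-mail, a service ticket or a dashboard alert in a real system.
    print(f"WARNING: {serial_number} drew {value:.1f} A (limit {CURRENT_LIMIT_AMPS} A)")


def record_reading(serial_number: str, value: float) -> None:
    """Store the reading so the current status can be viewed at any time."""
    readings.append({
        "serial_number": serial_number,
        "timestamp": time.time(),
        "current_amps": value,
    })
    if value > CURRENT_LIMIT_AMPS:
        send_warning(serial_number, value)


for value in (9.8, 10.2, 13.5):   # the last value exceeds the limit and triggers a warning
    record_reading("SN-4711", value)
```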

Step 3: Initiate smart maintenance work

A detailed picture of damage and wear can be derived from the analysis of this data, and measures such as maintenance jobs can be initiated as early as possible. The digital twin serves as the documentation interface, so all adjustments to the product remain traceable. This data history can later be used to develop predictions (predictive maintenance). A digital twin maintained in this way documents product changes, links them with historical data and can thus also show in which configuration the product performs best. The classic product lifecycle is thereby extended into the usage phase.
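A minimal sketch of how such a traceable maintenance history could be recorded against the twin follows below; the field names and the simple in-memory list are assumptions for illustration, while in practice this history would live in the PLM system.

```python
# Minimal sketch of step 3: documenting maintenance work against the digital
# twin so that every adjustment remains traceable and can later feed
# predictive maintenance. All field names are illustrative.
from datetime import date

maintenance_history: list[dict] = []   # later input for predictive maintenance


def log_maintenance(serial_number: str, action: str, configuration: str) -> None:
    """Record what was changed, when, and in which configuration the product now runs."""
    maintenance_history.append({
        "serial_number": serial_number,
        "date": date.today().isoformat(),
        "action": action,
        "configuration": configuration,
    })


log_maintenance("SN-4711", "replaced worn bearing", "revision B, firmware 1.4")
print(maintenance_history)
```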

Step 4: Request spare parts

In addition, the information is used to request spare parts. With the help of condensed service parts lists or spare parts catalogs, the sensor data is assigned to the affected component and the required spare part is delivered before the impending damage occurs. This data, too, is already stored in ERP systems. The process can be triggered manually or automatically on the basis of the device messages. In this way, companies avoid downtime in their own production.
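The sketch below shows, in simplified form, how a device message could be mapped to the affected component via a spare parts catalog and turned into an order. The catalog content, part numbers and order_spare_part() function are illustrative assumptions; in practice this data and the ordering process live in the ERP system.

```python
# Minimal sketch of step 4: map an alert code from a device message to the
# affected component via a spare parts catalog and trigger an order.
SPARE_PARTS_CATALOG = {
    # alert code -> (component, hypothetical ERP spare part number)
    "overcurrent": ("drive motor", "ERP-MAT-100234"),
    "pressure_drop": ("seal kit", "ERP-MAT-100512"),
}


def order_spare_part(serial_number: str, alert_code: str, automatic: bool = True) -> None:
    """Resolve the affected component and request the spare part before damage occurs."""
    component, part_number = SPARE_PARTS_CATALOG[alert_code]
    trigger = "automatically" if automatic else "manually"
    # In a real system this would create a purchase requisition in the ERP system.
    print(f"Ordering {part_number} ({component}) for {serial_number}, triggered {trigger}.")


order_spare_part("SN-4711", "overcurrent")
```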

With these 4 simple steps, an efficient IoT scenario is in place and a big step towards a digital business model has been taken. I am sure that many companies will be able to get started with the new technology in this way.

So: Get started and use the experience gained for digital business models!

IoT failures and trust in technology

At the beginning of April this year I attended building IoT in Cologne. At the conference, organized by heise Developer, iX and dpunkt.verlag, the lectures and the exhibition revolved around applications for the Internet of Things (IoT) and Industry 4.0. Together with my colleague Yang Zhong, I gave a talk on modern user experience (UX) concepts for IoT solutions.

Our presentation walked through a user’s work processes, from data acquisition on a real “thing” to the visualization of live data in a dashboard using a Digital Twin, and it was followed by a very lively discussion. Two points were particularly interesting:

  • In many application areas, the topic of customer journeys is high on the agenda – which confirms the current trend.  
  • It is essential to develop software for users – on this, too, there was consensus.

The evening was dedicated to Industrial IoT. As moderator, I hosted a discussion with representatives of various enterprises and software companies, such as Miele, Dürr Dental, Codecentric and akquinet. It turned into an intensive discussion of the issues currently dominating Industry 4.0. Besides the choice of control electronics or wireless standard, these include the question of whether an IoT solution should be operated in the cloud. The arguments for cloud solutions are, of course, convenience and the relatively efficient and simple scalability with regard to the number of “things” to be managed. Running the software on your own servers (on-premises), on the other hand, means that confidential product and customer data really does not leave your premises. The discussion confirmed my assessment that both approaches have their advantages in practice and are used accordingly.

One of my personal highlights at this year’s building IoT was a negative hit list of IoT products, so-called IoT failures: products with massive security gaps, such as open data interfaces. Some “classic” vulnerabilities were already well known, such as unchanged default passwords that allow data misuse. Other gaps really surprised me: for example, a smoke detector from a well-known brand that comes equipped with a microphone (?!) as standard, which in turn allows unwanted eavesdropping in living rooms.

Why is there a microphone in a smoke detector? We cannot say for sure, but it is certainly not in the customer’s interest, and it causes a massive loss of trust in technology. And that is precisely the point: acceptance of new technologies requires trust. And trust is becoming more and more important with increasing digitalization.

20 years of PLM: Why do many still doubt the benefits?

By now, I can look back on several years of consulting in Product Lifecycle Management, a topic whose popularity has fluctuated considerably over the years and is currently on the rise again in the wake of digital transformation.

Despite the renewed attention for PLM, I notice that the term still carries the connotation of being large, cumbersome, tedious and uneconomical. Astonishing, because the effort that many companies put into ERP projects, for example, was and is significantly higher in most cases. Nevertheless, the necessity and benefits of – expensive – ERP projects are discussed but rarely questioned, see Haribo and Lidl.

How do these different perceptions come about? One explanation could be that the benefits of PLM for management and employees in companies have not been sufficiently exploited over the years. This was mainly due to the fact that the scope and visibility of PLM projects in companies was often very limited.

A closer look shows that many of the earlier PLM implementations were in fact PDM implementations. PDM, Product Data Management, focuses on product-describing data, primarily CAD models and drawings. “PLM” was therefore limited to the core areas of product development, very often even to Mechanical Design alone. Although available in some PLM solutions for years, Change Management, Document Management, Project Management, cross-departmental collaboration and communication with external parties were simply not used. Instead, homegrown solutions based on Excel, Outlook, the file system or SharePoint were created – tools that everyone in the company knows, and for which it is easy to find someone to “optimize” them with a bit of macro programming. On top of that, the negative attitude towards PLM was certainly fuelled by the overloaded, highly compressed “engineering user interfaces” of the 1st and 2nd PLM product generations.

So it’s no surprise that PLM was seen in many companies as an expensive, exotic application of limited use!

In the current PLM renaissance, companies now have every opportunity to learn from the deficits of the past and to exploit the impressive potential of Product Lifecycle Management. Many obsolete and discontinued PDM and PLM solutions are currently being replaced, or will soon be replaced, by modern 3rd-generation PLM platforms that also support the use cases around the Digital Twin and the Internet of Things. These platforms breathe life into the PLM idea by effectively and efficiently supporting processes across phases, departments and company boundaries. New, web-based HTML5 user interfaces significantly increase acceptance among all user groups in the company by presenting even complex relationships more clearly and making them more efficient to work with.

Now there is a chance to realize “real” Product Lifecycle Management! Against the background of new, digital business models, which put the use phase of products much more in the foreground, this becomes all the more important. PLM solutions play a central role here, as they lay the foundation for data relating to the Digital Twin.

But in the end, hard facts also count when it comes to benefits and ROI: if PLM is actually used company-wide with all its possibilities, high economies of scale quickly result from the significant reduction of non-value-adding activities. This alone often enables a return on investment after just one year – quite apart from the additional revenue potential of new, data-driven business models that PLM will enable in the future.