UX is everybody’s business

Both professionally and privately, working with computers, apps and other digital tools has become an everyday occurrence. UX focuses on the user experience and aims to make operation as easy as possible. This means that digital products are intuitive, reliable and, at best, even fun.

Why is a UX strategy important?

The goal of UX is to make the interface between human and machine as comfortable as possible. This includes the look and feel of the respective tool. It is equally important that users learn the operation as quickly as possible and can work efficiently. To achieve this, it helps to take the user into account more strongly during the development process.

Particularly in the development of complex products such as business software, many people are involved, and they all set different priorities. As a result, it is often difficult to control the user experience during the development process.

A UX strategy gives the design of the user experience a direction. It sets a focus so that product managers, concept designers and developers know what matters in terms of UX and where the journey is heading.

But what does such a UX strategy look like?

Strategy first of all means formulating a goal and developing an idea of which measures and means should be used to achieve it. Established frameworks can help here. The UX Strategy Blueprint by UX veteran Jim Kalbach is one such aid. We used it successfully in the CONTACT UX team to formulate a UX strategy for the company. In June, at the UXStrat Europe conference, I reported on our experience with this method and also spoke about it in the UXStrat podcast.

The strategy triggers many innovations in favor of a better UX, for example using a mockup tool for the first time and testing operating concepts long before the first line of code is written. That is exhausting at first, but it pays off!

How do you live UX? And what does that have to do with me?

A UX strategy alone does not do much good; you have to live it. It therefore makes sense to involve colleagues from development and product management as early as the strategy development phase.

A good user experience is shaped at every turn, from the platform building block to the form configuration in the customer project. UX specialists, however, cannot be involved everywhere. We lead the way, define the strategy and provide support; implementing it is everyone’s job. For us, support means giving colleagues the right tools, resources and examples so that everyone can independently contribute to a state-of-the-art UX and create a positive experience for the user.

And when the end result is powerful and user-friendly products, everyone benefits.

Getting started with IoT in 4 steps

Everyone is talking about the Internet of Things (IoT) and the digital twin; together they form the framework for new digital business models. According to a forecast by PwC, digitization will bring the manufacturing industry an increase in turnover of more than 270 billion euros over the next four years in Germany alone.

Companies are hoping for sales growth through smart products and digital business models. This is confirmed by our current IoT study, conducted jointly with the Fraunhofer Institute for Production Systems and Design Technology (IPK) and the Association of German Engineers (VDI). It shows that companies have high expectations, but at the same time makes it clear that there is still a certain reluctance to put the new technologies into practice. Many companies are faced with the question: “How does this IoT thing actually work?”

In my experience, companies often think about the second step before taking the first, which leads to hesitation. Of course it is good and important to have a vision. But the picture frequently painted in blogs and forums usually shows very sophisticated IoT scenarios that do not start where many companies currently stand in terms of business model and technology know-how.

That is why it is important to gain your own experience and approach new digital business models step by step, true to the motto “Think big, start small, act now!”. Your own projects, ideally together with technology partners, automatically broaden your experience. So why not start by using the new technology to support the classic business?

In this post, I would like to show how companies can realize an effective IoT scenario for their business in just four steps.

Step 1: The digital twin as a communication interface

The data required for the digital twin is usually already available in the company. In the beginning, a simple serial number is enough: it serves as the documentation interface and links the data with the physical product; 3D data can be added later. The data often already exists in PLM or ERP systems, for example from production, purchasing or development, and should be brought together in a dashboard.
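
How can a serial number tie existing data together? As a minimal sketch in Python (the class, systems and field values are hypothetical, not taken from a specific product), linking PLM and ERP records to one physical unit could look like this:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal digital twin: the serial number is the key that links
    data from business systems to one physical product unit."""
    serial_number: str
    records: dict = field(default_factory=dict)

    def attach(self, source: str, data: dict) -> None:
        """Attach a record from a source system such as PLM or ERP."""
        self.records[source] = data

# Aggregate data that usually already exists into a dashboard view.
twin = DigitalTwin(serial_number="SN-4711")
twin.attach("PLM", {"revision": "C", "material": "aluminium"})
twin.attach("ERP", {"production_date": "2018-11-02", "plant": "Plant 1"})
print(twin.records)
```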

Step 2: Generate data via sensors

Sensors, too, are often already in place, for example for controlling devices, machines and plants. They record states such as power, pressure or consumption. The next step is to record this data consistently and store it in a suitable form so that the current status can be viewed at any time. In addition, limit values are defined, for example for excessive current consumption, so that warnings can be sent and errors corrected.
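
A minimal sketch of this kind of threshold monitoring (the measured states, units and limit values are made up for illustration):

```python
from datetime import datetime, timezone

# Hypothetical limit values, e.g. for excessive current consumption
LIMITS = {"current_a": 16.0, "pressure_bar": 8.5}

history = []  # full time series; in practice a database or data lake

def record_reading(serial_number: str, readings: dict) -> list:
    """Store one sensor reading and return the names of any
    measured states that exceed their limit value."""
    history.append({"serial": serial_number,
                    "time": datetime.now(timezone.utc),
                    **readings})
    return [name for name, value in readings.items()
            if name in LIMITS and value > LIMITS[name]]

violations = record_reading("SN-4711", {"current_a": 17.2, "pressure_bar": 7.9})
if violations:
    print(f"Warning for SN-4711: limit exceeded for {violations}")
```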

Step 3: Initiate smart maintenance work

Analyzing this data yields a detailed picture of damage and wear, so that measures such as maintenance jobs can be initiated as early as possible. The digital twin serves as the documentation interface: all adjustments to the product remain traceable. This data history can later be used to develop predictions (predictive maintenance). A well-maintained digital twin documents product changes, links them with historical data and can thus prove in which configuration the product works best. The classic product lifecycle is thereby extended into the usage phase.
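
One very simple way to turn the stored history into a wear indicator is to compare recent readings with an early baseline. The sketch below is purely illustrative (the numbers and the 10 % threshold are invented) and stands in for the far more elaborate analyses used in practice:

```python
from statistics import mean

def wear_ratio(series, window=10):
    """Ratio of the recent average to the early baseline average;
    a value well above 1.0 is read as a simple wear indicator."""
    return mean(series[-window:]) / mean(series[:window])

# Hypothetical current-draw history of one motor (amperes)
draw = [12.1, 12.0, 12.2, 12.1, 12.3, 12.2, 12.4, 12.3, 12.5, 12.4,
        13.0, 13.2, 13.1, 13.4, 13.6, 13.5, 13.8, 13.9, 14.0, 14.2]

if wear_ratio(draw) > 1.1:  # more than 10 % above the baseline
    # In practice: create a maintenance job and document it on the twin
    print("Initiate maintenance for SN-4711 and record it on the digital twin")
```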

Step 4: Request spare parts

Finally, this information is used to request spare parts. With the help of condensed service parts lists or spare parts catalogs, the data is assigned to the affected component, and the required spare part is delivered before the impending damage occurs. This data, too, is already stored in ERP systems. The process can be triggered manually or automatically on the basis of the device messages. In this way, companies avoid downtime in their own production.
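
Sketched in Python with a hypothetical service parts list, the step from device message to spare parts request might look like this; a real implementation would call the ERP system instead of returning a dictionary:

```python
# Hypothetical service parts list: affected component -> ERP part number
SERVICE_PARTS = {
    "motor": "SP-0815",
    "pressure_valve": "SP-0816",
}

def request_spare_part(serial_number: str, component: str,
                       auto: bool = True) -> dict:
    """Map the affected component to its spare part and create a
    request, triggered manually or by a device message."""
    return {"serial": serial_number,
            "part": SERVICE_PARTS[component],
            "trigger": "device message" if auto else "manual"}

# Triggered automatically by the warning from step 2:
print(request_spare_part("SN-4711", "motor"))
```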

With these four simple steps, an effective IoT scenario is in place and a big step toward a digital business model has been taken. I am sure that many companies can get started with the new technology in this way.

So: Get started and use the experience gained for digital business models!

Smart products have their price

IoT failures were the subject of my previous blog post, and what particularly surprised me was a “smoke detector with an integrated microphone that allows living rooms to be monitored”, which a well-known manufacturer had launched on the market. The question of whether this is really a design flaw or something we simply have to accept for the comfort of smart products led to really interesting, at times controversial discussions. One question that emerged is not new, but the underlying trade-off concerns all users of smart devices:

How much and what kind of private data am I willing to disclose for smart comfort?

In the case of the smoke detector, the advantages are obvious: networking the smoke detectors in the house offers greater safety in case of fire. If one detector is triggered, all the others are informed and the alarm sounds throughout the house. In addition, the alarm can be forwarded, for example to a mobile phone, so that users are informed at all times. None of this functionality requires a microphone that allows monitoring. The high-resolution microphone is only needed if the smoke detector is also to serve as the voice control for a smart home.

Advantage: I only have one device on the ceiling, a smoke detector with voice control.
Disadvantage: I need a smoke detector in every room, but there are rooms where I do not want any voice control elements listening.

Perhaps this was not taken into account during the design of the smoke detector, or an existing circuit design was simply reused. This makes clear that when developing smart products, it is important to look at the whole package from the user’s point of view:

How should a smart product behave, what should it be technically capable of, and what should it deliberately not be able to do?

For some products it is not even clear whether they are useful and safe, take, for example, a jogging stroller that drives autonomously ahead on the running track. Is autonomous driving safer than a person holding the stroller? After all, that person could also stumble and the stroller could roll onto a road …

Furthermore, the Internet of Things and the ongoing digitalization of different areas of life also offer the opportunity to develop sustainable products and solutions. I would like to drive an electric car whose route planner calculates the charging stops needed along the way and suggests charging at a suitable time, naturally taking waiting times into account. Or, more generally, smart home applications that save energy and offer greater safety.

The possibilities that digital twins of plants and machines open up in the industrial domain are also very interesting: operating states can be seen at a glance and the machine can be controlled via apps. Algorithms calculate optimal resource allocations, bottlenecks can be detected, and real-time control becomes possible.

The exciting challenge I see in the design of IoT products is the interplay between hardware and software. What possibilities there are to design sustainable, sophisticated products and to optimize processes when the overall system is considered! Complexity is a major challenge for designers and developers. On top of that, verification, testing and validation of a solution are required to make sure that products and systems behave as intended.

IoT failures and trust in technology

At the beginning of April this year I attended building IoT in Cologne. At the conference, organized by heise Developer, iX and dpunkt publishing house, everything revolved around applications for the Internet of Things (IoT) and Industry 4.0, in lectures and an exhibition. Together with my colleague Yang Zhong, I gave a talk on modern user experience (UX) concepts for IoT solutions.

Our presentation traced a user’s work processes, from the data acquisition of a real “thing” to the visualization of live data in the dashboard by means of a digital twin, and was followed by a very stimulating discussion. Two points stood out:

  • In many application areas, the topic of customer journeys is high on the agenda, which confirms the current trend.
  • It is essential to develop software for users, which was also the consensus.

The evening was dedicated to Industrial IoT. As moderator, I hosted a discussion with representatives from various enterprises and software companies, such as Miele, Dürr Dental, codecentric and akquinet. The discussion revolved intensively around the dominant topics of Industry 4.0. Besides the choice of control electronics or the wireless standard, these include the question of whether an IoT solution should be operated in the cloud. The arguments for cloud solutions are, of course, convenience and the relatively efficient, simple scalability with regard to the number of “things” to be managed. Running the software on your own servers (on-premises), on the other hand, ensures that confidential product and customer data really does not leave your premises. The discussion confirmed my assessment that both approaches have their advantages in practice and are applied accordingly.

One of my personal highlights at this year’s building IoT was a negative hit list of IoT products, so-called IoT failures: products with massive security gaps such as open data interfaces. Some “classic” vulnerabilities were already known, such as unchanged default passwords that allow data misuse. Other gaps really surprised me, such as a smoke detector from a well-known brand that comes with a microphone (?!) as standard, which in turn allows unwanted monitoring of living rooms.

Why is there a microphone in a smoke detector? We cannot say for sure; in any case, it is not in the customer’s interest, and it causes a massive loss of trust in technology. And that is precisely the point: acceptance of new technologies requires trust. And trust is becoming more and more important with increasing digitalization.

20 years of PLM: Why do many still doubt the benefits?

By now, I can look back on several years of consulting for Product Lifecycle Management, a topic whose popularity has fluctuated considerably over the years and which is currently on the rise again in the wake of the digital transformation.

Despite the renewed attention for PLM, I notice that the term still carries connotations of something large, cumbersome, tedious and uneconomical. Astonishing, because the effort that many companies put into ERP projects, for example, was and is significantly higher in most cases. Yet the necessity and benefits of expensive ERP projects are discussed but rarely questioned, see Haribo and Lidl.

How do these different perceptions come about? One explanation could be that the benefits of PLM for management and employees have not been sufficiently realized over the years, mainly because the scope and visibility of PLM projects within companies were often very limited.

A closer look shows that many of the earlier PLM implementations were in fact PDM implementations. PDM, Product Data Management, focuses on product-describing data, primarily CAD models and drawings. “PLM” was therefore limited to the core areas of product development, very often even to mechanical design alone. Although available in some PLM solutions for years, change management, document management, project management, cross-departmental collaboration and communication with external parties went unused. Instead, homegrown solutions based on Excel, Outlook, the file system or SharePoint were created: tools that everyone in the company knows, and for which it is easy to find someone to “optimize” them with macro programming. On top of that, the negative attitude towards PLM was certainly fuelled by the overloaded, highly condensed “engineering user interfaces” of the 1st and 2nd PLM product generations.

So it is no surprise that PLM was seen within companies as an expensive, exotic application of limited use!

In the current PLM renaissance, companies have every opportunity to learn from the deficits of the past and to exploit the impressive potential of Product Lifecycle Management. Many obsolete and discontinued PDM and PLM solutions are currently being replaced, or soon will be, by modern third-generation PLM platforms that also support the use cases around the digital twin and the Internet of Things. These platforms breathe life into the PLM idea by effectively and efficiently supporting processes across phases, departments and company boundaries. New web-based HTML5 user interfaces significantly increase acceptance among all user groups by presenting even complex relationships more clearly and making them more efficient to handle.

Now there is a chance to realize “real” Product Lifecycle Management! Against the background of new digital business models, which put the usage phase of products much more in the foreground, this becomes all the more important. PLM solutions play a central role here, as they lay the data foundation for the digital twin.

But in the end, hard facts also count when it comes to benefits and ROI: if PLM is actually used company-wide with all its possibilities, high economies of scale quickly result from the significant reduction of non-value-adding activities. This alone often enables a return on investment after just one year, quite apart from the additional revenue potential of new data-driven business models that PLM will enable in the future.

The Digital Twin and Quantum Physics

New topics need to be organized with suitable terms. Terms make communication efficient because, in the best case, the sender does not have to explain things from scratch.

The term Product Lifecycle Management did this job fairly well. You remember: “from the cradle to the grave” and so on. But Germans being Germans, we get to the bottom of everything and then dig even deeper. Over the years, this has produced plenty of ambitious definitions, many of which have not helped much.

Here we go again, I thought, while reading the recent article The Digital Twin Theory. The authors on the origins of their work: “On the other hand, the idea of ‘Digital Twin Theory’ matured during a random contact with quantum physics…: From the point of view of quantum physics, electrons are located in several places simultaneously… It seemed exciting to examine whether these properties could also be assumed for digital twins”.

OK, the freedom of science is a great asset, and original thinkers are in demand. But please don’t be too original. That something is not wrong is not enough, right? It should also be somewhat helpful.

Why the fuss? The digital twin is a beautiful, simple image for understanding the potential behind the Internet of Things. It would be a pity if this were lost according to the motto “Why make it simple when you can make it complicated?”

And by the way, the English Wikipedia says: “A digital twin is a digital replica of a … physical entity…”

Are data science platforms a good idea?

To paraphrase Karl Valentin: platforms are beautiful and take a lot of work off your shoulders. The idea of platforms for automated data analysis therefore comes at just the right time. Fittingly, Gartner has now published a “Magic Quadrant for Data Science and Machine Learning Platforms”. The document itself sits behind a paywall, but some of the companies mentioned in the report offer access to it on the net in exchange for entering your address.

Gartner particularly emphasizes that such a platform should provide everything you need from a single source, unlike various individual components that are not directly coordinated with each other.

Sounds good to me! However, data science is not an area where a tool, or even a platform, magically gets you ahead. The development of solutions, for example for predictive maintenance of the machines a company sells, goes through various phases, with cleaning/wrangling and preprocessing accounting for most of the work. This is where ETL (Extract, Transform, Load) and visualization tools such as Tableau come in. And beyond the imagined comfort zone of the platforms that managers picture, database queries and scripts for transformation and aggregation in Python or R are simply the means of choice. A look at data science online tutorials from top providers underlines the importance of these, well, down-to-earth tools. “Statistical analysis, Python programming with NumPy, pandas, matplotlib, and Seaborn, Advanced statistical analysis, Tableau, machine learning with stats models and scikit-learn, deep learning with TensorFlow” reads one of Udemy’s course programs.
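
To make this concrete, here is a small, hypothetical example of such down-to-earth preprocessing with pandas: a raw sensor export with a missing value and a duplicated timestamp is cleaned and then aggregated for a dashboard:

```python
import io
import pandas as pd

# Hypothetical raw export with typical flaws: a missing value
# and a duplicated timestamp for machine M1
raw = io.StringIO("""machine,timestamp,current_a
M1,2019-03-01 10:00,12.1
M1,2019-03-01 10:05,
M1,2019-03-01 10:05,12.4
M2,2019-03-01 10:00,9.8
""")

df = pd.read_csv(raw, parse_dates=["timestamp"])
df = df.drop_duplicates(subset=["machine", "timestamp"], keep="last")
df = df.dropna(subset=["current_a"])  # discard incomplete rows

# Simple aggregation: average current draw per machine
print(df.groupby("machine")["current_a"].mean())
```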

Moreover, projects often get stuck at this preliminary stage or are cancelled altogether. There are many reasons for this:

  • no analytical/statistical approach can be found
  • the original idea proves to be unfeasible
  • the data is not available in the quantity or quality you need
  • simple analyses and visualizations are enough and everything else would be “oversized”.

This is no big deal; it only means that the automated use of machine learning and AI does not turn every data set into a data treasure. If, however, a productive benefit does become apparent, it is necessary to prepare for the production pipeline and for time and resource constraints. Usually you then start from scratch and rebuild everything, e.g. in TensorFlow for neural networks or in custom libraries.
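
For illustration only, with synthetic data standing in for the features worked out during exploration, such a rebuild with TensorFlow can start as small as this:

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: four sensor-derived features per sample,
# label 1 means "maintenance required" (invented for this sketch)
X = np.random.rand(200, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("float32")

# Small feed-forward network for binary classification
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
model.save("maintenance_model.h5")  # hand over to the production pipeline
```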

The misunderstanding is to believe a) that data science can be carried through to productive use seamlessly, and b) that this requires a one-stop shop for data science (here: a “platform”) that does everything in one go. That will never happen.

This is really good news, because it means that organizations can achieve their first goals without resorting to large platforms. A reasonably careful selection of suitable tools (many of them open source) is all it takes.

Also interesting:
In my video “AI Needs Strategy” I explain which steps companies can take to use AI technology successfully.

The 14 Top Success Patterns of Digital Business Models

Let’s get digital: the Internet of Things (IoT) has an outstanding influence on the relationship between companies and their customers. Companies now face the challenge of placing attractive digital offerings so as not to fall behind. The white paper analyzes the central mechanisms of digital offerings and identifies the 14 most important patterns and blueprints for IoT-driven business models.

Market pressure and new terrain: markets are becoming digital and smart. There is hardly an industry or offering that is not networked and/or in the cloud, or so it seems. This trend is undoubtedly driven massively by market-defining players, especially from Silicon Valley. We are all shaped today by the use of smartphones and home automation solutions, and we transfer the corresponding expectations to other areas as well. The question is no longer “whether” but “how”. According to McKinsey, the sales potential for digitized products in the B2B environment is actually twice as high as in the B2C sector! Certainly, some phenomena on the market can be dismissed as hype. But it is just as certain that concrete developments, and sometimes existential challenges, are emerging even in supposedly firmly established markets:

  • Innovative and established competitors place an offering as “first movers”, drawing the attention of customers for whom digitization was not yet an issue.
  • New players break into existing markets and place previously unknown offerings on the basis of digitized services.
  • Previously specialized providers (non-providers until now, or providers of secondary services) expand their offerings digitally and thus attack the providers in the core market.

The Internet of Things (“IoT”) as a vehicle for digitized product offerings is virtually universal and knows no industry or process boundaries. According to Gartner, this is reflected in “ambitious IoT plans” across a wide variety of industries. Many companies are therefore forced to confront the potential erosion of their markets by new suppliers.

The challenge lies not only in the high market dynamics, but also in the technical and sales hurdles of partly unknown territory. Many companies, especially medium-sized ones, lack software know-how, particularly where it goes beyond the embedded domain, such as networked and distributed product architectures or analytics.

Another complicating factor is that suitable personnel is hardly available on the market today. Moreover, it is not just about recruiting new employees, but also about building up new business areas. To become capable of acting, companies must invest in completely new alliances and partner models.

The following white paper focuses on the second area, improving the customer offering, and uses the term “IoT” for it. The analysis of IoT projects shows that the majority of projects aim at expanding a market position in existing markets, i.e. extending the existing product range. Only a few companies venture into new markets. In other words, companies generally take a very cautious approach to new business options and try to avoid risks.

Continue reading “The 14 Top Success Patterns of Digital Business Models”

Does PLM have a future, and what will it look like?

In my last blog post, I tried to explain why it is so important from the user’s point of view to free PLM from its niche existence in engineering. But it is just as important from the vendor’s point of view if the PLM vendors want to hold their own in the market in the long run. Oleg Shilovitsky pointed this out a few months ago in a guest blog that I have only just discovered. Continue reading “Does PLM have a future, and what will it look like?”

PLM is more than product data management

No, I did not invent gunpowder. I also know that the headline is a truism that everyone should know by now. And yet it cannot be repeated often enough, because many PLM implementations never clear the hurdle of product data management (PDM), or only after years of delay, and thus do not really live up to their claim of being PLM projects. Continue reading “PLM is more than product data management”