Manufacturing industry players produce everything from cars to shirts and ready-made pizzas – not to mention vast volumes of data. To be precise, the world’s manufacturers generated 3.5 zettabytes of data in 2018, with the entire global economy producing 17 zettabytes – a zettabyte being a 1 followed by 21 zeros – during the same year. One thing is certain: as digitization continues to gather pace, our planet’s enormous and nebulous pool of data will only grow further. Ever-growing quantities of data are being unearthed from billions of sources, and ever-increasing numbers of business players are benefiting from new ways of accessing this information.
But where and how is data created? In what circumstances can it be used, and for what purposes? What benefits could arise from leveraging this information? These questions can be asked from a broader economic point of view, but also at the level of each and every enterprise. As Dr. Sebastian von Enzberg of the Fraunhofer Institute for Mechatronic Systems Design (IEM) observes, “Some companies know alarmingly little about where they generate data and what that data is – not to mention what you can do with it.”
Global consultancy firm Roland Berger estimates that 30 billion intelligent sensors will be sold in 2020. These will be used across the manufacturing industry and will generate process data such as workpiece quantities, throughput times and machine status information. However, data will not just be captured from manufacturing equipment; the spectrum will encompass everything from the route recorded by the navigation app of a logistics vehicle, to the documentation of repair and maintenance work, to orders, customer information and development data.
“The first step for any business is to take a data inventory in order to gain an overview and create a data map,” explains Dr. von Enzberg. However, this must be preceded by a shift towards appreciating the true value of data – something that Dr. von Enzberg firmly believes is still lacking in many areas of business. In 2016, according to a McKinsey study, only 15 percent of industrial manufacturing players considered data as part of value creation. And in 50 percent of these companies, data remained completely unused in decision-making.
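The data inventory Dr. von Enzberg describes can begin as a very simple catalog. The sketch below is purely illustrative – the source names, fields and queries are assumptions for demonstration, not taken from any real system:

```python
# Minimal data inventory: catalog each data source before any analytics.
# All entries below are illustrative examples, not real systems.
inventory = [
    {"source": "line_plc_sensors", "owner": "production", "format": "time series",
     "contains_personal_data": False, "used_in_decisions": True},
    {"source": "maintenance_logs", "owner": "service team", "format": "free text",
     "contains_personal_data": True, "used_in_decisions": False},
    {"source": "crm_orders", "owner": "sales", "format": "relational",
     "contains_personal_data": True, "used_in_decisions": True},
]

# A "data map" can then be derived from simple queries, e.g. which
# sources currently feed no decisions, or which need a privacy review.
unused = [e["source"] for e in inventory if not e["used_in_decisions"]]
personal = [e["source"] for e in inventory if e["contains_personal_data"]]
print("Unused sources:", unused)
print("Sources needing privacy review:", personal)
```

Even a toy catalog like this surfaces the McKinsey finding directly: data that is collected but flagged as unused in decision-making becomes visible at a glance.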
Once a company has decided to work with its data, the first task is to link together this information from all areas of the business. No matter which aspect of digitization you consider – from networks, to infrastructures such as the cloud, to security – modern technologies such as IoT sensors, networked cyber-physical systems, artificial intelligence and machine learning generate and process vast amounts of data.
This diversity of data sources and quality underlines how the sheer volume of data is not the only challenge that companies must face. Data generated by sensors must be pieced together with handwritten repair protocols, for example. As Dr. von Enzberg notes, “More data does not mean more value; the quality of the data and knowledge of its context is crucial.” And its continuous availability is equally important, too.
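Piecing sensor data together with manually written repair protocols, as described above, usually comes down to linking records by machine and time. A minimal sketch of such a join – all machine names, readings and notes are invented for illustration:

```python
from datetime import datetime

# Illustrative only: link machine sensor readings with repair protocols
# (transcribed from handwritten sheets) via machine ID and nearest timestamp.
sensor_events = [
    {"machine": "press_07", "ts": datetime(2019, 3, 1, 6, 40), "vibration_mm_s": 9.8},
    {"machine": "press_07", "ts": datetime(2019, 3, 1, 7, 5), "vibration_mm_s": 2.1},
]
repair_protocols = [
    {"machine": "press_07", "ts": datetime(2019, 3, 1, 7, 0),
     "note": "Bearing replaced, transcribed from handwritten sheet"},
]

def nearest_protocol(event, protocols, max_gap_minutes=60):
    """Return the repair note closest in time on the same machine, if any."""
    candidates = [p for p in protocols if p["machine"] == event["machine"]]
    if not candidates:
        return None
    best = min(candidates, key=lambda p: abs((p["ts"] - event["ts"]).total_seconds()))
    if abs((best["ts"] - event["ts"]).total_seconds()) > max_gap_minutes * 60:
        return None
    return best["note"]

linked = [(e["vibration_mm_s"], nearest_protocol(e, repair_protocols))
          for e in sensor_events]
```

The context knowledge Dr. von Enzberg mentions lives in details like the `max_gap_minutes` tolerance: only someone who knows the process can say how far apart a reading and a repair note may be and still belong together.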
It is worth remembering that analyzing data can only have an effect on a company’s business processes if this analysis can be performed seamlessly and without interruptions. After all, connectivity, digitization, infrastructure and security will be of little use if a crucial capability is always missing at a certain point. According to T-Systems CEO Adel Al-Saleh, “This applies to practically every service or technology we look at and which currently concerns our customers – from cryptography to integration, analytics, multi-cloud, adaptive technology, and data sovereignty.” And that’s without forgetting the overarching need to orchestrate it all. As Al-Saleh adds, “This is another reason why we believe that one of our tasks is to sharpen our customers’ holistic view of information technology and telecommunications. This is because the ‘higher performance’ which we want to offer enterprises is based first and foremost on ‘trusted performance’.”
The IT systems operated by larger businesses only offer limited benefits in this regard.
All of these systems fulfill partial functions in business processes but are rarely networked with each other. As Dr. von Enzberg explains, “They represent unconnected data silos and are not designed for big data analysis.”
“Before I begin collecting data, I should ask myself why I want to evaluate it,” Dr. von Enzberg comments. “This is a complex question – and anything but banal.”
In manufacturing companies, data can yield significant benefits in areas such as production transparency, predictive maintenance and quality assurance in particular.
The benefits of data-driven production process transparency can even extend to determining capacities. Are there really companies out there that do not know their production capacity? “Yes,” laughs Dr. von Enzberg. “In the food industry, for example, where customers are supplied ‘just in time’, sufficient reserves must be kept available to ensure deliveries are always of the right quantity. Sometimes, businesses in that industry do not know the actual capacity utilization of a production facility.”
It might seem obvious that data can be leveraged to harmonize a networked production plant. The field of smart maintenance, however, requires rather more thought. As part of the European Union’s Boost 4.0 project, Fraunhofer IEM is collaborating with automotive supplier Benteler and Atlantis Engineering to develop a predictive maintenance system. The system uses data-driven modeling methods – machine learning, in other words – which can detect potential functional defects before they occur. This information would help businesses to prevent machine downtime and interruptions to operations. Ten pilot factories will be built by 2020 within the scope of this project.
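The project’s actual models are not public, but the underlying idea – flagging readings that deviate from a machine’s recent baseline before a failure occurs – can be sketched with a deliberately simple statistical detector. Everything here (window size, threshold, the vibration values) is an illustrative assumption, not the Boost 4.0 system:

```python
import statistics

def anomaly_flags(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling baseline.
    A toy stand-in for the data-driven models mentioned in the text."""
    flags = []
    for i, value in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < window:
            flags.append(False)  # not enough history to judge yet
            continue
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9  # avoid division by zero
        flags.append(abs(value - mean) / stdev > threshold)
    return flags

# Normal vibration values, then a sudden spike that may precede a defect.
data = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 6.0]
print(anomaly_flags(data))  # → [False, False, False, False, False, False, True]
```

Real predictive maintenance systems replace this threshold rule with learned models, but the workflow is the same: establish what “normal” looks like from historical data, then intervene when new readings drift away from it.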
The three-year Boost 4.0 project involves the participation of 50 partners from 16 countries, with a focus on the topic of ‘big data for factories’. The project volume includes a subsidy of around 20 million euros from the European Commission, plus 100 million euros of investments made by the participating companies.
Schwering & Hasse, a company which manufactures more than 50,000 metric tons of enameled wire for the electrical industry every year, can point to the real-world benefits of ‘smart quality’. On the one hand, the company needs to perform physical quality checks in order to test the mechanical and electrical properties of its production materials. At the same time, though, production processes need to be interrupted as rarely as possible, as the wire is continuously processed from coils which are several miles long. To solve this dilemma, Schwering & Hasse collected two years of manufacturing data and used this to simulate various test cycles. The company then analyzed these simulations to calculate an optimum test frequency – resulting in a reduction in rejects of up to 14 percent.
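The trade-off Schwering & Hasse simulated – test often and interrupt production, or test rarely and scrap more wire – can be illustrated with a toy Monte Carlo model. This is an assumption-laden sketch, not the company’s actual simulation: defects arrive at random along the wire, and everything produced between a defect and the next scheduled test is scrapped:

```python
import random

def simulate_rejects(test_interval_m, defect_rate_per_m, wire_length_m,
                     trials=200, seed=42):
    """Estimate the average rejected wire length for a given test interval.
    Toy model: a defect starts at a random position and everything produced
    until the next scheduled test must be scrapped."""
    rng = random.Random(seed)
    total_reject = 0.0
    for _ in range(trials):
        pos = 0.0
        while pos < wire_length_m:
            gap = rng.expovariate(defect_rate_per_m)  # distance to next defect
            pos += gap
            if pos >= wire_length_m:
                break
            # scrap the run from the defect up to the next test point
            next_test = ((pos // test_interval_m) + 1) * test_interval_m
            total_reject += min(next_test, wire_length_m) - pos
            pos = next_test
    return total_reject / trials

# Compare candidate test intervals: shorter intervals catch defects sooner
# but interrupt production more often (all figures are illustrative).
for interval in (500, 1000, 2000):
    rejects = simulate_rejects(interval, defect_rate_per_m=1 / 5000,
                               wire_length_m=20000)
    print(f"interval {interval} m: ~{rejects:.0f} m rejected, "
          f"{20000 // interval} tests")
```

Running such simulations over two years of real production data, rather than invented parameters, is what let the company pick a test frequency that balanced both costs.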
Consultancy experts Frost & Sullivan expect big data analytics to increase production efficiency by 10 percent, cut operating costs by almost 20 percent and lower maintenance costs by 50 percent. Roland Berger’s specialists, meanwhile, predict that over the next five years, digitization in manufacturing could generate up to 1.25 trillion euros in additional value in Europe alone.
Businesses may be in a far better position than individuals to leverage data, but consumers still have interesting possibilities of their own. Walter Palmetshofer, an expert from the Open Knowledge Foundation, differentiates between three ways that consumers can use their data: in exchange for service, for a fee, or as a data donation. The Open Knowledge Foundation is a non-profit organization committed to freedom of information and the ethical use of technology. In 2017, it conducted a study on the value of personal data on behalf of the German Federal Ministry of Justice and Consumer Protection.
‘Data for service’ describes a typically unconscious transfer of data; for example, when you book a train ticket online to avoid having to wait in line at the ticket counter, but submit your personal and travel details in the process. As Dr. von Enzberg observes, “Consumers are often unaware of the data aspect. During these booking processes or when using software, many people click ‘OK’ without having read through the small print.”
A well-known example of ‘data for a fee’ is the use of telematics by auto insurers. Consumers are offered lower insurance rates if they share data on how they drive, thereby demonstrating that they drive in accordance with traffic laws. In all of these cases of data exchange, Palmetshofer believes transparency is crucial: “I don’t just need to know who collects my data; I would also like to be able to download this information from the company in machine-readable form.” For example, consumers could use data from the train company to prepare their claims for travel expenses at the end of the year. And for offerings such as telematics-driven auto insurance, Palmetshofer calls for transparency in the rules and regulations used to determine the higher insurance rate which drivers are assigned to if they commit a traffic offense.
Finally, ‘data donations’ can have a particularly significant impact at a society-wide level. For example, consumers can donate their mobility data to support the optimization of traffic management, or offer their health data to organizations conducting medical research. The question of data anonymization is particularly important for these types of donation. As Palmetshofer explains, “The more data I have about an individual, the easier it becomes to identify them. This means that data anonymization must be planned into the process design, right from the start.”
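Planning anonymization into the process design, as Palmetshofer urges, typically means transforming data before it leaves the collection point. A minimal sketch of two common steps – pseudonymizing the identifier and coarsening precise values – with all names and coordinates invented for illustration:

```python
import hashlib
import secrets

# Illustrative pipeline: pseudonymize the donor ID and coarsen precise
# values *before* the data leaves the device or collection point.
SALT = secrets.token_hex(16)  # kept by the data processor, never published

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((SALT + user_id).encode()).hexdigest()[:12]

def coarsen_gps(lat: float, lon: float, decimals: int = 2):
    """Round coordinates to roughly 1 km so single trips are harder
    to re-identify."""
    return round(lat, decimals), round(lon, decimals)

donation = {
    "donor": pseudonymize("alice@example.com"),
    "position": coarsen_gps(48.137154, 11.576124),
    "speed_kmh": 42,
}
print(donation)
```

Note that pseudonymization alone is not full anonymization: as Palmetshofer points out, enough linked records can still identify an individual, which is why coarsening and aggregation have to be designed in as well.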
T-Systems develops its cloud-based enterprise solutions in full accordance with this principle, with data privacy built in from the outset. Dr. Claus-Dieter Ulmer, Global Data Privacy Officer at Deutsche Telekom, refers to this approach as ‘security by design’. “We have integrated data protection as a mandatory element of the core process. This means that we can only proceed with a project once its data security has been assured.” A team of ‘privacy champions’ accompanies and oversees all developments, while T-Systems has also set up an external data privacy advisory board which includes members of the Chaos Computer Club and German digital association Bitkom.
When manufacturers utilize their data, there is often a three-way relationship between, for example, the plant manufacturer, the plant operator and the infrastructure service provider. In terms of data security, the latter has a particularly important role within this relationship, as Dr. Ulmer underlines: “We offer encryption via provider shielding, and we are the only ones who have the key.” It is important that customers know this, stresses Dr. Ulmer: “By handling the data in a transparent way, you earn the trust of your customers.”
In addition, the service provider must possess knowledge of the plant operator’s business processes. As Dr. Ulmer explains, “This is the only way we have of knowing which specific security arrangements are necessary, or of being able to develop new ideas for improved, tailor-made security architectures.” T-Systems offers this support in the form of scalable services, with packages to suit customers from mid-sized enterprises to major corporations.
The service provider’s security standards apply uniformly worldwide. “We have an international governance model,” explains Dr. Ulmer. “For example, there is a data privacy officer in every country, even if the respective national laws do not stipulate this.”
And how do the data protection regulations vary from country to country? In the European Union, the General Data Protection Regulation (GDPR) ensures consistent rules apply across all EU member states. However, as Prof. Thomas Riehm, Chair of German and European Private Law, Civil Procedure and Legal Theory at the University of Passau, warns, Brexit could cause issues in this regard. If the United Kingdom withdraws from the EU, it would be regarded as a ‘third country’ to which personal data cannot easily be transferred. Transfers to third countries require special arrangements – the EU-US Privacy Shield, for example, governs the exchange of personal data with the United States. However, the large number of different regulations in force hinders data flows. As Prof. Riehm stresses, “The data market is a global one that works best with consistent, uniform rules.”
And who owns the data that is generated by companies or consumers? “Nobody,” Prof. Riehm explains. “There are no ownership rights for data within the German legal system – unlike in the case of material property, for instance.” For example, when considering the data generated by a ‘connected car’, there is a need to differentiate between levels of information. All data that can be assigned to the driver as an individual – such as GPS coordinates, recorded vehicle speeds and the driver’s identity – is deemed to be ‘personal data’ and is therefore subject to data protection within the scope of GDPR. As Prof. Riehm observes, “Only the person concerned may determine who may process this data for any particular purpose. This is done via a declaration of consent under data protection law, which a driver may issue to an insurance provider, for example.” In practice, this could mean giving consent for the use of telematics-driven auto insurance rates.
Purely technical information, such as wear data for individual components of the vehicle, is not considered personal and is therefore not governed by GDPR legislation. As Prof. Riehm notes, “In the first instance, the practical principle is that whoever actually has access to the data can also process it.” However, as it is often virtually impossible to separate this technical data from the personal data, the driver’s consent is required for any further processing.
According to Prof. Riehm, the legal regulations are entirely sufficient for secure data exchange between business partners. “All parties must contractually agree who will be granted which access rights to the data and for what purposes,” he explains. Prof. Riehm notes that it would also make sense for such a contract to contain provisions for the event that a party exceeds its powers, such as penalties for breach of contract, or rights of termination. If a third party obtains unauthorized access to data that is protected from a technical and/or organizational perspective, new German legislation governing the protection of trade secrets (Geschäftsgeheimnisgesetz) would apply, and data espionage would constitute a criminal offence.
As Prof. Riehm emphasizes, “The portability of data – in other words, the ability to transfer the data between systems – is the most important foundation for productive data exchange.” This requires open data formats and interfaces. Non-open proprietary file formats prevent users from migrating data to another provider. “The EU’s new regulation on the free flow of data points in this direction and compels the industry to develop codes of conduct by means of self-regulation, and to implement these codes effectively by May 2020,” Prof. Riehm adds.
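The portability Prof. Riehm describes boils down to being able to hand the same records over in open, vendor-neutral formats. A small illustrative sketch (the machine records are invented) showing one data set exported as both JSON and CSV:

```python
import csv
import io
import json

# Illustrative: the same machine records exported in two open formats,
# so a customer could migrate them to another provider.
records = [
    {"machine": "press_07", "runtime_h": 1412, "status": "ok"},
    {"machine": "lathe_02", "runtime_h": 980, "status": "maintenance"},
]

def to_json(rows):
    """Serialize records as JSON, an open machine-readable format."""
    return json.dumps(rows, indent=2, sort_keys=True)

def to_csv(rows):
    """Serialize records as CSV with a header row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_json(records))
print(to_csv(records))
```

The crucial property is that the round trip loses nothing: any other provider can parse these formats without proprietary tooling, which is exactly what closed file formats prevent.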
IT security is also a vital consideration in this regard, as Prof. Riehm highlights: “As data becomes increasingly integral to value creation processes, businesses will become increasingly sensitive to cybercrime. At this stage, awareness of the enormous risks does not seem to have reached every industry to an equal extent.”
Unknown risks, unknown opportunities. To ensure that data is handled in a secure, profitable way, enterprises will need to take a 360-degree view of all key factors. As Dr. Sebastian von Enzberg concludes, “Achieving this comprehensive overview is often a challenge, particularly for small and medium-sized businesses with limited budgets and staff. But it’s worth it!”
More Information: www.t-systems.com