If we look into the future of engineering, development, and construction of vehicles, we don't necessarily have to turn to Doc Emmett Brown with his flux capacitor or to the legendary Q with his Bond gadgets (whether a car with a rocket launcher would ever be road legal is questionable anyway). It is enough to look to a real example: Charles Franklin Kettering.
There's a famous quote of his: "I am very interested in the future because I will spend the rest of my life in it." Kettering was head of development at General Motors for 27 years. He made the electric starter suitable for series production, and he also pioneered electric vehicle lighting. If we are very generous, this makes him one of the forefathers of today's E/E platforms. In addition, his credo was that innovations are created by thinking outside the box; today this would likely be called interdisciplinary cooperation.
The prevailing approach in automotive development today is to break things down into small pieces and then construct individual vehicle components in specialized silos – especially at suppliers – which the OEM, acting as a system integrator, then assembles into the finished car. This approach has been functional and efficient over the last few decades – the art of automotive construction was to hold the many individual threads together to create an innovative car that was ready for market launch on schedule and could easily be adapted to diverse customer requirements.
However, if we take stock of the current situation, it is clear that things cannot continue like this. Already today, production and development networks based on the division of labor, the abundance of models and variants, and customers' personalization options create a great deal of complexity in automotive development – a growing challenge for engineering in the face of ever shorter development cycles and limited budgets. The influences of the CASE era (connected, autonomous, shared, electrified) and the increasing dominance of software in the car add to this challenge. New technologies such as artificial intelligence must also be mastered during development, along with a much more intensive interaction between all system components. And as the icing on the cake, there are extended verification requirements, whether stemming from stricter environmental regulations – see the WLTP tests – or from highly automated vehicles taking over an increasing number of safety-critical driving functions; the keyword for safeguarding these functions is ISO 26262.
It is doubtful that the established silo approaches in design and development can cope with these developments. Automobile manufacturers recognized this long ago and are currently introducing new methods, such as systems engineering, and new forms of cooperation in vehicle development. But a new methodology only lays the foundation. In total, there are four components on which the engineering of the future must be built.
The second component of future engineering marks a shift in engineering IT away from a focus on applications toward a focus on data. The fact is that many specialized IT tools will continue to be needed along the development process. In this respect, at least, the future does not fundamentally differ from the current situation, even if the general trend is moving away from large, monolithic IT systems toward smaller tools – known as microservices – that provide the best possible support for a limited range of tasks.
Up to now, product lifecycle management (PLM) has had the task of providing a bracket around this variety of IT tools in the world of engineering. However, most "one-size-fits-all" approaches, in which PLM systems were to be rolled out across all disciplines, failed because of the high degree of specialization in departmental processes and data models. Historically, the available PLM systems were too heavily focused on the needs of mechanical development, which is why application lifecycle management (ALM) systems became established for the software domain, for example, alongside other specialized solutions for electrical and electronic systems, simulation data, and so on. A comprehensive view of the overall system then required a more or less loose coupling or synchronization of these systems via various interfaces – with varying degrees of success and with complexity of its own.
The new approach to creating structure and transparency beyond the legion of applications is a return to the data. The idea of the Semantic Web, which goes back to Tim Berners-Lee, the inventor of the World Wide Web, was originally to make the web more usable for machines by capturing the meaning of terms in special description languages. A semantic search engine, for example, can deduce that "Paris" is either a city in France or a Trojan prince, and provide contextual information based on that conclusion (e.g. "French law applies in Paris" or "Paris kidnapped Helen"). In recent years, the Semantic Web approach has become well established in science for building and evaluating knowledge networks, for example in disease diagnostics or biology.
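The Paris example can be sketched as a tiny knowledge graph of subject–predicate–object triples. The entity identifiers and predicates below are invented for illustration, not taken from any real ontology; a production Semantic Web stack would use RDF description languages and a SPARQL query engine, but plain Python is enough to show how a machine can disambiguate a term and chain contextual facts.

```python
# A minimal knowledge graph: a set of (subject, predicate, object) triples.
# All names and predicates here are illustrative, not a real ontology.
triples = {
    ("Paris_FR",   "is_a",         "City"),
    ("Paris_FR",   "located_in",   "France"),
    ("Paris_Troy", "is_a",         "Prince"),
    ("Paris_Troy", "kidnapped",    "Helen"),
    ("France",     "legal_system", "French law"),
}

def objects(subject, predicate):
    """All objects linked to a subject via a given predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

def disambiguate(label):
    """Map every entity matching the label to its type(s)."""
    entities = {s for s, p, o in triples if s.startswith(label)}
    return {e: objects(e, "is_a") for e in entities}

# "Paris" resolves to two distinct entities with different types.
print(disambiguate("Paris"))

# Contextual inference by chaining triples: Paris_FR lies in France,
# and France's legal system is French law.
for country in objects("Paris_FR", "located_in"):
    print(country, "->", objects(country, "legal_system"))
```

The same chaining over typed relationships is what lets a semantic search engine answer "which law applies in Paris?" without the fact being stated directly for Paris itself.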
Applied to the world of engineering, a Semantic Web that understands all the "languages" of the various design and development disciplines can be used to create a uniform data pool on top of the existing application silos and the information they contain – the Babel fish of engineering, so to speak. Anyone involved in the development process can draw from this data pool according to their needs. What's more, with future generations of semantic platforms, the machine readability of information and its interrelationships will allow algorithms and symbolic artificial intelligence to assist engineers in keeping all information consistent, in controlling projects, or in extracting the necessary documentation from the knowledge network.
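One such assistance function, consistency checking across disciplines, can be sketched on the same triple idea. The data pool, identifiers, and predicate names below (`implemented_by`, `verified_by`) are hypothetical, assembled only to illustrate how a machine-readable network lets an algorithm flag gaps that would otherwise hide between requirements-management, design, and test tools.

```python
# Hypothetical unified data pool: triples drawn from the requirements,
# design, and test "silos". All identifiers and predicates are invented.
pool = {
    ("REQ-001", "implemented_by", "ECU-Brake"),
    ("REQ-002", "implemented_by", "SW-Module-A"),
    ("ECU-Brake", "verified_by", "TEST-17"),
    # SW-Module-A has no linked verification, and REQ-003 has no
    # implementing component at all -- both are inconsistencies.
    ("REQ-003", "is_a", "Requirement"),
}

def linked(subject, predicate):
    """All objects linked to a subject via a given predicate."""
    return {o for s, p, o in pool if s == subject and p == predicate}

def consistency_report():
    """Flag requirements without implementation and components without tests."""
    requirements = {s for s, p, o in pool if s.startswith("REQ-")}
    findings = []
    for req in sorted(requirements):
        impls = linked(req, "implemented_by")
        if not impls:
            findings.append(f"{req}: no implementing component")
        for impl in sorted(impls):
            if not linked(impl, "verified_by"):
                findings.append(f"{req}: {impl} has no verification")
    return findings

for finding in consistency_report():
    print(finding)
```

The point is not the ten-line algorithm but the precondition: only because every artifact and relationship is machine-readable in one pool can such a check traverse discipline boundaries at all.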