The race to develop viable autonomous vehicles has been under way for some time, but it may intensify if today’s new reality puts a premium on reducing risk for passengers and drivers in for-hire vehicles. To get closer to realizing this vision, enormous volumes of test data must be captured and analyzed to gain safety approvals.
A “data filling station” that combines modern hardware with a patent-pending software process gives automotive customers a boost in vehicle development. The development process itself, however, generates an almost unimaginable volume of data that must be transmitted and analyzed for performance and safety assessments.
For example, SAE International (formerly the Society of Automotive Engineers) has offered a few comparative figures. Even the development of an autonomous car at SAE Level 2 – which covers little more than parking assistance and adaptive cruise control – requires around 200,000 kilometers of driving before it’s ready for the market. This generates four to ten petabytes of sensor test data. The highest level, SAE Level 5, which covers fully autonomous vehicles, will require an estimated 240 million kilometers of driving distance, generating a good two exabytes of sensor data, or two quintillion bytes. For comparison: according to press reports in 1998, slightly more than one exabyte was equivalent to the sum total of human knowledge.
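A quick back-of-the-envelope calculation makes these figures more tangible. The sketch below simply divides the cited data volumes by the cited mileages; the only inputs are the ballpark numbers quoted above, not measured values.

```python
# Rough plausibility check of the data volumes quoted above.
# All inputs are the approximate figures cited in the text, not measurements.

PB = 10**15  # petabyte in bytes
EB = 10**18  # exabyte in bytes

# SAE Level 2: ~200,000 km of testing, 4-10 PB of sensor data
level2_km = 200_000
level2_low, level2_high = 4 * PB, 10 * PB
print(f"Level 2: {level2_low / level2_km / 10**9:.0f}-"
      f"{level2_high / level2_km / 10**9:.0f} GB of sensor data per km")

# SAE Level 5: ~240 million km of testing, ~2 EB of sensor data
level5_km = 240_000_000
level5_bytes = 2 * EB
print(f"Level 5: ~{level5_bytes / level5_km / 10**9:.1f} GB per km, "
      f"about 2 x 10^18 bytes in total")
```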
To shorten time-to-market, it’s important to speed up the process of making these large data volumes manageable. The faster development engineers can analyze the information generated by sensors, cameras, LIDAR (light detection and ranging) scanners and AI control units, as well as the data from external measuring and testing equipment installed in the cars, the faster they can draw conclusions and make better decisions.
The concrete challenge: the hundreds of gigabytes generated by an electric vehicle equipped with measuring technology during endurance testing in the Arabian desert must be unloaded within the few minutes of a driver change or charging stop.
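A simple estimate shows why this is demanding. The payload and time window below are illustrative assumptions consistent with “hundreds of gigabytes” and “a few minutes”; they are not figures from the project itself.

```python
# Illustrative bandwidth estimate. The payload and time window are assumptions
# consistent with the article's description, not project figures.

payload_gb = 500       # "hundreds of gigabytes" per vehicle
window_minutes = 5     # "a few minutes" during a driver change or charging stop

required_gbit_per_s = payload_gb * 8 / (window_minutes * 60)
print(f"Sustained throughput needed: ~{required_gbit_per_s:.1f} Gbit/s per vehicle")
# ~13.3 Gbit/s - far beyond typical wide-area uplinks, which is why the data
# is unloaded locally at the test track rather than streamed to a central site.
```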
But the regional unloading of data from dozens of vehicles during a shift change is only the first hurdle.
The subsequent provision of that data within a complex and increasingly dynamically organized development organization is a further test for any measurement data management system – for global players in the automotive industry as much as for the rest of the sector. A test engineer in charge of, say, a hybrid powertrain does not just need the data of a single prototype. He or she is currently interested in all kickdown operations (the accelerator pedal pressed past its threshold switch to rev up the engine for maximum acceleration) as well as all ABS-assisted emergency braking events across the entire test fleet of several hundred vehicles – and all of this as soon as possible after recording, on up to three continents, somewhere between the polar circle and the tropics. “The hub-and-spoke architectures traditionally used for such requirements are hopelessly overtaxed by the bandwidths required for automotive measurement data,” says Dr. Christoph G. Jung, Principal Architect in the Digital Solutions unit at T-Systems.
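Expressed in code, such a fleet-wide request might look like the sketch below. The query structure and all field names are illustrative assumptions; the article does not describe the actual query interface of the measurement data management system.

```python
# Hypothetical sketch of the fleet-wide query described above. The structure,
# field names and region names are illustrative assumptions only.

from dataclasses import dataclass
import datetime as dt

@dataclass
class EventQuery:
    fleet: str               # e.g. all hybrid-powertrain prototypes
    event_types: list[str]   # signal patterns the engineer cares about
    since: dt.datetime       # only recordings after this timestamp
    regions: list[str]       # test regions on up to three continents

query = EventQuery(
    fleet="hybrid-powertrain-endurance",
    event_types=["kickdown", "abs_emergency_braking"],
    since=dt.datetime.now(dt.timezone.utc) - dt.timedelta(hours=24),
    regions=["scandinavia", "arabian-desert", "north-america"],
)
# A federated "Data as a Service" layer would fan this query out to the
# regional data filling stations and stream back only the matching excerpts.
```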
In partnership, T-Systems and Daimler have found a surprising solution to these challenges. The carmaker's development engineers worldwide can now analyze this gigantic data stream “in situ” and thus in near real time. The concept is called edge computing: instead of transporting the data to centrally located users, the analysis algorithms are transferred to the data producers in the field – in this case, to the test tracks.
This technology first combines dedicated cluster hardware based on server computers, as used in T-Systems data centers, with patent-pending signal processing software (“Big Data Signal Processing”) to create a so-called “data filling station”. These regionally located mini data centers are then federated with each other via a global cloud network in such a way that a “Data as a Service” (DaaS) layer is created, which functions as a transparent “data stream socket”.
This is very efficient because the raw measurement data is no longer copied over expensive leased lines to the engineers’ comparatively small laptops. Instead, only the engineers’ calculation rules are “injected” as code into the relevant data filling stations installed at the test track. Only the interesting excerpts and the results of the algorithms are then transferred, typically reducing the data volume by a factor of a thousand.
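A minimal sketch of this “ship the code, not the data” pattern is shown below. The calculation rule, the pedal-position threshold and the submission call are hypothetical stand-ins; the actual patent-pending Big Data Signal Processing interface is not public.

```python
# Minimal sketch of the edge-analysis pattern described above: the engineer's
# calculation rule travels to the data filling station, and only the small
# result set travels back. The signal access and submission API are
# hypothetical stand-ins, not the actual patent-pending interface.

def find_kickdown_events(signal_frame):
    """Calculation rule executed at the edge, next to the raw data.

    signal_frame is assumed to expose accelerator-pedal position (0..100 %)
    sampled over one recording segment.
    """
    events = []
    for t, pedal in signal_frame.samples("accelerator_pedal_pct"):
        if pedal >= 98.0:                       # illustrative kickdown threshold
            events.append({"time": t, "pedal": pedal})
    return events                               # kilobytes instead of gigabytes


# Hypothetical client usage: ship the rule to all regional stations and
# receive only the condensed results over the global cloud network.
# results = daas_client.submit(find_kickdown_events,
#                              fleet="hybrid-powertrain-endurance")
# for vehicle_id, events in results:
#     print(vehicle_id, len(events), "kickdown events")
```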
In the current showcase project, T-Systems’ unique hardware and software were connected to the customer’s corporate cloud via the Telekom backbone. There, a wide variety of analysis tools, data enhancement processes, visualizations and AI algorithms can be connected to the data stream socket cost-efficiently. With this technology, the full trace data of all endurance test vehicles can be recorded and evaluated in its entirety for the first time. As a result, the test process can be accelerated and made much more flexible – virtually a kick-down switch for the entire vehicle development process.