In winter, Finland is a popular destination for trips to the Arctic Circle; in summer, temperatures in Death Valley climb as high as 50 degrees Celsius; and in western Namibia, test vehicles from car manufacturers race along gravel roads. Prototypes and endurance test vehicles complete an average of 150,000 kilometers before a new car model goes into series production. The overall vehicle system is now so complex that all of its interactions cannot be fully simulated. Although computers, digital twins, and hardware-in-the-loop simulations shorten the test times of new cars, test engineers cannot do without extensive test drives under a wide range of climatic conditions, on differing road surfaces, and under varying regional operating conditions. When it comes to how systems work together in a real vehicle, test areas around the world remain indispensable.
Car tests in the automotive industry are expensive and complex. Depending on the company, running a prototype can cost up to EUR 200,000 a year, not including the manufacturing costs for the vehicle itself, and despite computer simulation, engineers still test all vehicles and components of a car in real operation. Test vehicles have long since become mobile measuring laboratories packed with measurement technology. This includes data loggers, which record all measurement data produced by sensors, actuators, bus systems, and other measuring devices during a car test. The modern sensor suites of some cars today record up to 10,000 channels. In addition to road signs and passers-by, these now include the pupil movements of the drivers themselves: the car is supposed to detect whether the driver is showing symptoms of fatigue and issue a warning. Vehicle data indicate whether the fine tuning of electric power steering, shock absorbers, or stability programs can still be optimized.
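To make the idea of a multi-channel data logger concrete, here is a minimal sketch in Python. The class names, channel names, and in-memory storage are purely illustrative assumptions; real loggers capture thousands of channels from vehicle buses such as CAN or FlexRay and write them to rugged storage.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ChannelSample:
    """One timestamped reading from a single measurement channel."""
    channel: str      # hypothetical names, e.g. "steering_angle_deg"
    value: float
    timestamp: float

@dataclass
class DataLogger:
    """Toy in-memory logger standing in for a real automotive data logger."""
    samples: list = field(default_factory=list)

    def record(self, channel: str, value: float) -> None:
        self.samples.append(ChannelSample(channel, value, time.time()))

    def channels(self) -> set:
        """Return the set of distinct channels seen so far."""
        return {s.channel for s in self.samples}

# Record a few example channels, including a driver-monitoring signal.
log = DataLogger()
log.record("steering_angle_deg", 3.2)
log.record("damper_travel_mm", 11.7)
log.record("driver_pupil_x", 0.42)
print(sorted(log.channels()))
```

In a real test vehicle, each of the up to 10,000 channels would be sampled continuously at its own rate, which is what makes the data volumes discussed below so large.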
This is where a new big data solution we developed at T-Systems comes into play, combining three measures: first, we compress the data; second, we set up mini data centers at the test tracks; and third, we analyze the data on site.
Until now, compressing this data has not been easy. The test data are signal data whose proprietary formats first had to be decoded and processed. With our big data signal processing software, we now transcode this data into a big data format, which reduces the data volume by up to 90 percent. Without loss of information, of course. On the contrary: the data is enriched with metadata, including vehicle data as well as test track and weather conditions. Furthermore, we process the data in parallel by means of optimized big data storage, which accelerates data processing by a factor of up to 40. This makes it easier for engineers later on to locate and analyze data faster and to merge it with other measurement data and metadata.
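The transcoding step can be sketched as follows. This is a simplified stand-in, not T-Systems' actual software: it packs raw signal samples into binary, compresses them losslessly with zlib from the Python standard library, and bundles the compressed payload with metadata about vehicle, track, and weather. The function names, metadata fields, and the use of zlib are all assumptions for illustration; the actual reduction achieved depends on the codec and on how regular the signal is.

```python
import struct
import zlib

def transcode_signal(samples: list, metadata: dict) -> dict:
    """Pack float samples into binary, compress losslessly, attach metadata."""
    raw = struct.pack(f"<{len(samples)}d", *samples)
    payload = zlib.compress(raw, level=9)
    return {
        "metadata": metadata,                      # vehicle, track, weather context
        "payload": payload,                        # losslessly compressed signal
        "ratio": 1 - len(payload) / len(raw),      # fraction of volume saved
    }

def decode_signal(record: dict) -> list:
    """Recover the original samples exactly (no loss of information)."""
    raw = zlib.decompress(record["payload"])
    return list(struct.unpack(f"<{len(raw) // 8}d", raw))

# A periodic sensor trace (e.g. a quantized temperature sweep) compresses well.
samples = [20.0 + (i % 100) * 0.1 for i in range(10_000)]
meta = {"vehicle": "prototype-7", "track": "gravel", "weather": "dry, 38 C"}

record = transcode_signal(samples, meta)
assert decode_signal(record) == samples            # lossless round trip
print(f"volume reduced by {record['ratio']:.0%}")
```

The metadata travels with the payload, so engineers can later filter recordings by track or weather before decompressing anything, which is what makes merging measurement data with context data cheap.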