In 2018, the global volume of digital data was estimated at 33 zettabytes. That's a lot: if you burned it all to Blu-ray discs, the stack would be as tall as two return journeys to the moon. Quite a test of patience, and probably a nightmare for Blu-ray manufacturers.
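The stack claim is easy to sanity-check. A minimal back-of-the-envelope sketch, assuming single-layer 25 GB discs about 1.2 mm thick and a mean Earth-Moon distance of roughly 384,400 km (all assumptions, not figures from the article):

```python
# Back-of-the-envelope check of the Blu-ray stack claim.
# Assumptions: 25 GB per single-layer disc, 1.2 mm disc thickness,
# mean Earth-Moon distance of ~384,400 km.
DATA_BYTES = 33e21            # 33 zettabytes
DISC_BYTES = 25e9             # 25 GB single-layer Blu-ray
DISC_THICKNESS_M = 1.2e-3     # 1.2 mm per disc

discs = DATA_BYTES / DISC_BYTES
stack_km = discs * DISC_THICKNESS_M / 1000

MOON_KM = 384_400
return_trips = stack_km / (2 * MOON_KM)

print(f"{discs:.2e} discs, stack of {stack_km:,.0f} km")
print(f"roughly {return_trips:.1f} return journeys to the moon")
```

The stack works out to about 1.58 million km, which is indeed close to two round trips to the moon.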
The development of automobiles, and not just self-driving ones, is also contributing to global data growth: an autonomous vehicle produces around three terabytes of data per hour of driving. Google's Waymo cars alone, with their 16 million test kilometers on public roads to date, account for nearly a million terabytes, or roughly one exabyte. The keyword: Big Data.
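The exabyte figure follows from the numbers in the text plus one assumption. A quick sketch, where the average test speed of about 50 km/h is purely an illustrative guess (only the 3 TB/hour rate and the 16 million km come from the article):

```python
# Rough check of the Waymo data estimate.
# The 3 TB/hour figure and 16 million km are from the text;
# the ~50 km/h average test speed is an assumption for illustration.
KM_DRIVEN = 16e6        # Waymo test kilometers on public roads
AVG_SPEED_KMH = 50      # assumed average speed (hypothetical)
TB_PER_HOUR = 3         # data produced per hour of autonomous driving

hours = KM_DRIVEN / AVG_SPEED_KMH
total_tb = hours * TB_PER_HOUR

print(f"{hours:,.0f} hours of driving, {total_tb:,.0f} TB")
print(f"roughly {total_tb / 1e6:.2f} exabytes")
```

At that assumed speed, 320,000 hours of driving yield 960,000 TB, i.e. just under one exabyte, consistent with the estimate above.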
The closer a car gets to its market launch, the more urgent driving tests become. Ultimately, driving a car is still a physical activity, not a virtual one, even if the car has a digital twin. This applies to “normal” automobiles as well: they don’t produce quite as many bytes, but plenty of data accumulates all the same. And collecting the data isn’t even half the battle.
It’s like a children’s room: the kids can wreck it in no time, but tidying it up again can take ages. The raw data alone may well be valuable (and confidential). But its true value only becomes apparent once it has been processed with the right models, and that requires enormous computing capacity. Otherwise the engineer would still be sitting at the computer until the end of time. Enormous computing capacity: that cries out for the cloud.