Series: Big Data January 17, 2013
Transforming mass data into success stories
Volume, variety, velocity and analysis
Twitter feeds alone generate a gigantic 12 terabytes of data every day; five million share trades are executed globally per second; and the volume of videos, photos and other unstructured data keeps growing. These facts speak for themselves clearly enough: the Big Data trend is taking shape. However, it is not just a matter of getting a grip on this mass of data, but also on its velocity and variety. Once companies control these three factors, internal analytical expertise becomes crucially important for finding the right key figures, that is, for recognizing patterns, anomalies and relationships.
Here is the good news: there are already companies that have deciphered the Big Data code and got a grip on the data explosion, such as Macy's. With revenue of almost USD 30 billion and a network of 800 locations, Macy's is considered the largest department store operator in the USA. Nevertheless, the company, with its 180,000 employees, manages to run a daily price-check analysis of its 10,000 articles in less than two hours. That is all the more remarkable since prices at Macy's are highly volatile. This means: whenever a neighboring competitor anywhere between New York and Los Angeles goes for aggressive price reductions, Macy's follows suit; where there is no local competitor, prices remain unchanged. The result is around 270 million different prices across the entire range of goods and locations. That this price analysis is now possible at record speed, unthinkable before Big Data analysis, is something Macy's CIO Larry Lewark attributes to switching the existing infrastructure to a cloud-provided software solution from SAS and to the use of in-memory technology. In this way, Macy's can even adjust its prices several times in a single day to react better to local competition.
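The price-matching rule described above can be sketched in a few lines. This is an illustrative sketch only, not Macy's actual system; the article names, locations and prices below are hypothetical.

```python
# Illustrative sketch of the competitive pricing rule: match an aggressive
# local price cut, otherwise leave the price unchanged. One price exists
# per (article, location) pair, which is how the per-location price count
# described in the article arises.

def adjust_price(own_price, competitor_price):
    """Return the matched price if a local competitor undercuts us."""
    if competitor_price is not None and competitor_price < own_price:
        return competitor_price  # follow the competitor's reduction
    return own_price             # no local competition: price unchanged

# Hypothetical data: prices per (article, location), plus one local cut.
prices = {
    ("umbrella", "New York"): 19.99,
    ("umbrella", "Los Angeles"): 19.99,
}
competitor = {("umbrella", "New York"): 14.99}  # aggressive local reduction

updated = {
    key: adjust_price(price, competitor.get(key))
    for key, price in prices.items()
}
```

Run across every article and location, a rule this simple still yields one independent price decision per pair, which is why the full daily sweep is a Big Data problem.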
Bank gets risk under control
The Big Data success concept at the United Overseas Bank (UOB) in Singapore is likewise a high-performance analytics solution from SAS, combined with in-memory technology. Here the skillful analysis of large data volumes has contributed significantly to the 45 percent rise in UOB's share price in recent years, because the large South-East Asian financial institution has mastered risk management. In detail: at UOB, risk is spread across 45,000 different financial instruments and is valued using more than 100,000 market parameters such as prices, maturities and due dates. Calculating the overall risk involves around 8.8 billion individual, highly complex value-at-risk calculations. To use these to examine the effects of market movements on the bank's total risk, however, the IT department previously required up to 18 hours, so a prompt reaction to newly arising market risks was impossible.
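The kind of computation involved can be illustrated with a heavily simplified Monte Carlo value-at-risk sketch. The tiny portfolio, the volatilities and the 99 percent confidence level below are illustrative assumptions, not UOB figures; a production system runs billions of such scenario calculations across tens of thousands of instruments, which is exactly what made the 18-hour runtime a problem.

```python
# Simplified Monte Carlo value-at-risk (VaR): simulate many market
# scenarios, compute the portfolio loss in each, and read off the loss
# that is not exceeded at the chosen confidence level.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def monte_carlo_var(positions, volatilities, scenarios=10_000, confidence=0.99):
    """Estimate 1-day VaR at the given confidence level."""
    losses = []
    for _ in range(scenarios):
        # One scenario: draw a random daily return for every instrument
        # and sum the resulting profit-and-loss across the portfolio.
        pnl = sum(pos * random.gauss(0.0, vol)
                  for pos, vol in zip(positions, volatilities))
        losses.append(-pnl)  # a positive number is a loss
    losses.sort()
    return losses[int(confidence * scenarios)]

# Hypothetical portfolio: market values and daily return volatilities.
positions = [1_000_000, 500_000, 250_000]
volatilities = [0.02, 0.03, 0.05]
var_99 = monte_carlo_var(positions, volatilities)
```

With 45,000 instruments and more than 100,000 market parameters, each scenario is far more expensive than this three-line sum, which is where in-memory parallelization pays off.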
Thanks to Big Data analysis, the experts at UOB have now cut the risk calculation down to only a few minutes and can take rapidly changing parameters into account in the complex analysis almost in real time. Whereas risk analysis was previously sometimes perceived as an irritating task performed at the behest of the supervisory authorities, UOB can today use the instrument to support its operating business: it allows the bank to test trading strategies in advance and to assess the likely effect of new market events more quickly.
You can read about other examples of successfully established Big Data strategies in the Best Practice Big Data issue.