World engulfed by 'data tsunami'

Smartphones and social media are contributing to a deluge of electronic information. Some companies are ill-prepared to cope, with less than 1 per cent of so-called 'big data' currently being analysed.


A "data tsunami" is upon us, with smartphones, tablets and even Web-connected fridges now contributing to the bewildering mass of electronic information being produced.

Humans create 2.5 quintillion bytes of computer data a day - a flow that quickly adds up to the storage capacity of 115 billion 16-gigabyte iPads, or eight times the information housed in all academic libraries in the United States.

But as the amount of electronic information we create multiplies, companies face a shortage of skilled staff who know how to harness the power of this so-called "big data".

Less than 1 per cent of all the data we produce is currently being analysed. Existing tools to make use of this information have become insufficient, leaving many companies ill-prepared to cope with the deluge of data.

This has created the need for new analytical tools to find useful insights buried in a mass of data, say technology experts.

"As the volume and complexity of data barraging businesses from all angles increases, IT organisations have a choice. They can either succumb to information-overload paralysis, or they can take steps to harness the tremendous potential teeming within all of those data streams," says Jeremy Burton, the executive vice-president of product operations and marketing at EMC, a technology company based in the US.

The mass of data we produce is set to grow well beyond the 2.5 quintillion bytes being created each day. One quintillion, for the record, is a 1 followed by 18 zeros - and there are an estimated seven quintillion grains of sand on all the beaches on Earth.
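
For a rough sense of scale, the back-of-the-envelope sketch below converts that 2.5 quintillion bytes into more familiar storage units. It is written in Python purely as an illustration of the arithmetic, using only the daily figure quoted above, and is not a new estimate.

```python
# Rough scale check based on the 2.5-quintillion-bytes-a-day figure quoted above.
DAILY_BYTES = 2.5e18  # one quintillion = 10**18, so 2.5 quintillion bytes

exabytes_per_day = DAILY_BYTES / 1e18   # 1 exabyte = 10**18 bytes
gigabytes_per_day = DAILY_BYTES / 1e9   # 1 gigabyte = 10**9 bytes

print(f"{exabytes_per_day:.1f} exabytes of new data every day")
print(f"{gigabytes_per_day:,.0f} gigabytes of new data every day")
```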

In the past five years, data volume has risen by 900 per cent and is expected to grow at a rate of 40 per cent every year until 2020.
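
To see how quickly a 40 per cent annual growth rate compounds, the short sketch below projects the daily volume forward. The 2012 starting year is assumed purely for illustration; the growth rate and the 2.5-exabytes-a-day starting point are the figures quoted above.

```python
# Illustrative compounding of the 40 per cent annual growth rate cited above.
# The 2012 baseline year is an assumption made for the sake of the example.
GROWTH_RATE = 0.40
START_YEAR = 2012
DAILY_EXABYTES_AT_START = 2.5  # 2.5 quintillion bytes = 2.5 exabytes a day

for year in range(START_YEAR, 2021):
    volume = DAILY_EXABYTES_AT_START * (1 + GROWTH_RATE) ** (year - START_YEAR)
    print(f"{year}: roughly {volume:.1f} exabytes of new data a day")
```

At that rate, the amount of data created each day would be nearly 15 times greater by 2020 than at the start of the period.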

In fact, we already create so much data each day that it would be physically impossible to store it all permanently. Much of the data being produced is user-generated content. Every time you post a comment on Facebook or a picture on Instagram, the online sphere has new bytes of data to manage. There are 30 billion pieces of content shared on Facebook every month.

Smart devices are also fuelling the growth of big data.

These include fridges that monitor their contents, cars with in-built satellite navigation and cash tills that provide feedback on stock levels.

Machine-generated data is a key driver of the growth in the world's store of information and is projected to increase 15-fold by 2020. Deriving meaning from all this unstructured data will be one of the biggest opportunities for technology companies in the coming years, and software tools are needed to capture, store, manage and analyse it, say executives.

"Analytics can open up new business models for a company or give them new ways of leveraging the information they're already gathering to not only improve their operations, but to improve their services they deliver to their customers," says Mark Dean, the chief technology officer at IBM in the Middle East and Africa.

For example, data from mobile money services can give the financial services sector an insight into both spending and saving habits across sectors, regions, cities, countries and demographic groups. From this, providers can target specific groups with suitable products and services.

Until now, the tools and software available have largely kept pace with the amounts of data being produced, but this is changing rapidly as more people and devices connect and generate unprecedented volumes of information.

IDC predicts that the global market for big data technology and services will grow from US$3.2 billion (Dh11.75bn) in 2010 to $16.9bn by 2015. In India alone, the market is forecast to be worth $153 million by next year, by which time the developing world will be producing more data than the developed world.

One of the biggest hurdles is a shortage of skilled workers in the field. The US alone will need between 440,000 and 490,000 data scientists, plus a further 1.5 million data-literate managers, by 2018 to cope with demand, yet the country is expected to face a shortfall of 140,000 to 190,000 skilled big data professionals.

A recent report from Gartner suggests big data will create 1.3 million IT jobs across Europe, the Middle East and Africa by 2015, and 4.4 million worldwide. The biggest obstacle to filling these roles is a lack of skilled workers, compounded by the insufficient training available.

India is perhaps best placed to take advantage of the big data phenomenon. The country has emerged as a top IT outsourcing destination in recent years and Indian technology firms are expected to bring in $1.2bn in data analytics deals by 2015.

In 2011, private equity and venture capital firms invested more than $2bn in data analytics companies, and about a quarter of the recipient firms had some sort of Indian connection. This year, EMC is planning to train 30,000 scientists for big data work in India alone. While India has the ingredients to benefit from the trend, other countries and regions can also position themselves to deliver the right solutions.

But as the world looks for meaning in all those quintillions of pieces of information, technology experts point to another concern: the need to safeguard consumers' privacy once their data is finally unravelled and analysed.