Is this another industrial revolution?

Big data is transforming industries and working practices. There is an insatiable demand for analysis and visualisation of customer and operational information, and sectors such as finance, retail, energy and government are rapidly adapting.

According to IDC analyst Alys Woodward in the report Big Data in Europe, British firms had until recently lagged behind their US counterparts in big data adoption, owing to smaller information stores and a skills shortage.

This is changing. “Activities are starting to pick up, driven by the increase in information and awareness of best practices,” she says. Another factor is the rapid improvement in technology.

The report reveals that 20 per cent of end-users in European businesses now have access to big data analysis. Adoption is growing quickly: 40 per cent of organisations expect to “considerably increase” it this year. Several sectors are leading the way.

The financial industry is adopting big data at great pace in order to improve its business.

James Riley, head of innovation at service provider HCL Technologies, says customer targeting is a key aim, because consumers “leave breadcrumbs of information every moment of the day, most of which can be used to understand them as individuals”.

Financial giant Citi is one company making the most of this. The business has introduced targeted predictive analytics around customer behaviour and service levels.

According to Don Callahan, Citi’s head of operations and technology, in 2009 the bank drew up a data road map, focusing on significant technology and process developments.

“We have largely grown through acquisition, so we’ve worked hard in recent years to standardise technology globally so we have consistent data sources to interpret,” Mr Callahan explains.

The goal of the standardisation is for all of Citi’s operations to hold information in a single format, so that global patterns can be recognised and addressed.

“Thanks to our analysis, it means as a Citi customer you will be able to go to any of our banks and have the best experience. Each branch will have the same data at the touch of a button,” he says.
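
In very rough terms, standardisation of this kind means mapping records from each regional system into one shared schema before any analysis runs. The sketch below illustrates the idea in Python; the field names and source formats are invented for the example and do not describe Citi’s actual systems.

```python
# Illustrative sketch only: mapping customer records from two hypothetical
# regional extracts into one shared schema, so the same analysis can run
# over either source. Field names are invented, not Citi's data model.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CustomerRecord:
    customer_id: str
    country: str
    last_contact: datetime
    balance_usd: float


def from_emea_row(row: dict) -> CustomerRecord:
    """Hypothetical EMEA extract: dates as DD/MM/YYYY, balances in local currency."""
    return CustomerRecord(
        customer_id=row["cust_ref"],
        country=row["country_code"],
        last_contact=datetime.strptime(row["last_contact"], "%d/%m/%Y"),
        balance_usd=float(row["balance"]) * float(row["fx_to_usd"]),
    )


def from_us_row(row: dict) -> CustomerRecord:
    """Hypothetical US extract: dates as MM-DD-YYYY, balances already in dollars."""
    return CustomerRecord(
        customer_id=row["customer_id"],
        country="US",
        last_contact=datetime.strptime(row["last_contact"], "%m-%d-%Y"),
        balance_usd=float(row["balance_usd"]),
    )


# Once both feeds share one format, global patterns can be queried uniformly.
records = [
    from_emea_row({"cust_ref": "C-001", "country_code": "GB",
                   "last_contact": "02/05/2014", "balance": "1200",
                   "fx_to_usd": "1.68"}),
    from_us_row({"customer_id": "C-002", "last_contact": "05-02-2014",
                 "balance_usd": "950"}),
]
print(sum(r.balance_usd for r in records))
```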

Mr Callahan expects big data to provide both “opportunities and challenges” for the sector, “allowing companies to anticipate customer needs, while raising the bar on protecting customer information”.

Big data is also transformative in the health sector, and organisations are implementing systems to manage patients, treat diseases and map long-term life cycles around illness.

Public Health England attracted great attention with its recent announcement of a national cancer database. The government agency is carefully recording and tracking all different types of cancer life cycles to improve treatment.

The big data store is huge, with 11 million historical records and 350,000 new tumour records added each year.

Rigorous data formatting is central to deriving useful analytics from the system, so Public Health England has pulled together eight separate regional cancer registries and employs 200 experts to ensure data quality and standardisation.

The result is impressive. “We are able to track patients along the entire pathway of cancer care, from the first visit of a patient to a doctor, to diagnosis, hospital treatment and recovery,” says Jem Rashbass, national director for disease registration at the organisation.

Armed with the data, clinicians can choose highly targeted treatments based on real patient responses to different cancers at a variety of stages.

“It’s vital that we do not simply treat cancers according to the cancer type alone, but also how each patient responds to treatment,” Mr Rashbass says. “We will now be able to analyse a very large number of cases to deliver personalised medicine.”
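
At its simplest, that kind of analysis amounts to grouping case records by cancer type, stage and treatment, and comparing response rates across the groups. The Python sketch below shows the shape of such a query; the fields, codes and figures are hypothetical and are not Public Health England data.

```python
# Illustrative only: grouping registry-style case records by cancer type,
# stage and treatment, then comparing response rates per cohort.
from collections import defaultdict

cases = [
    {"cancer": "breast", "stage": 2, "treatment": "chemo_A", "responded": True},
    {"cancer": "breast", "stage": 2, "treatment": "chemo_A", "responded": False},
    {"cancer": "breast", "stage": 2, "treatment": "chemo_B", "responded": True},
    {"cancer": "lung",   "stage": 3, "treatment": "chemo_A", "responded": False},
]

counts = defaultdict(lambda: [0, 0])   # [responders, total] per cohort
for case in cases:
    key = (case["cancer"], case["stage"], case["treatment"])
    counts[key][0] += case["responded"]
    counts[key][1] += 1

for (cancer, stage, treatment), (responded, total) in sorted(counts.items()):
    print(f"{cancer} stage {stage}, {treatment}: {responded}/{total} responded")
```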

The retail industry has “traditionally had a heavy reliance on data and is now trying to use different sources of information or to produce results in real time”, says Seth Robinson, director of technology analysis at CompTIA.

When it comes to real-time analysis, the home delivery sector is ambitious. Paul Clarke, technology director at online groceries firm Ocado, has frequently described the company as “not your typical retailer”. It uses big data for pinpoint tracking of a raft of real-time information on all its delivery vans and the 1.1 million items delivered weekly. Ocado carefully analyses vehicle location, driving styles and petrol consumption.

The company’s systems are sophisticated. Much of the data is extracted and analysed in milliseconds to aid quick decision-making. The information is also fed into other internal programs that track the vans and draw longer-term conclusions about how routes and driving can be improved.
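
Stripped to its essentials, this sort of telemetry analysis rolls a stream of readings up into per-vehicle statistics. The sketch below shows the idea in Python on a handful of invented readings; Ocado’s real pipeline runs on live streams at far greater volume and speed.

```python
# Rough sketch of summarising van telemetry into per-vehicle statistics.
# The reading format and figures are invented for illustration.
from collections import defaultdict
from statistics import mean

readings = [
    # (van_id, speed_mph, fuel_litres_per_100km, harsh_braking_event)
    ("van-17", 28.0, 11.2, False),
    ("van-17", 31.5, 12.8, True),
    ("van-42", 22.0,  9.6, False),
    ("van-42", 24.5, 10.1, False),
]

by_van = defaultdict(list)
for van_id, speed, fuel, harsh in readings:
    by_van[van_id].append((speed, fuel, harsh))

for van_id, rows in sorted(by_van.items()):
    speeds = [r[0] for r in rows]
    fuels = [r[1] for r in rows]
    harsh_events = sum(r[2] for r in rows)
    print(f"{van_id}: avg speed {mean(speeds):.1f} mph, "
          f"avg fuel {mean(fuels):.1f} l/100km, "
          f"harsh braking events: {harsh_events}")
```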

Ocado lauds technology as “the engine of the business” and names the key elements as “optimisation algorithms that fine-tune our daily delivery routes, the machine-learning techniques that drive our consumer demand forecasting and the real-time control systems that operate our vast customer fulfilment centre”.
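
To give a flavour of the forecasting element, the sketch below uses a deliberately naive trailing moving average over invented weekly sales figures. Ocado’s machine-learning models are far more sophisticated, but the shape of the problem, history in and forecast out, is the same.

```python
# Deliberately naive stand-in for demand forecasting: a trailing moving
# average over recent weekly sales of one product. Figures are invented.
weekly_sales = [310, 295, 330, 360, 340, 355, 370]


def moving_average_forecast(history, window=4):
    """Forecast next week's demand as the mean of the last `window` weeks."""
    recent = history[-window:]
    return sum(recent) / len(recent)


print(f"Forecast for next week: {moving_average_forecast(weekly_sales):.0f} units")
```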

The company’s data visualisation is imaginative. It uses software based on video-game technology to provide a highly informative and comprehensible display of the warehouse processes taking place.

The oil industry is also reliant upon big data, using the technology to find new sites for successful drilling.

The hunger for information is notable as the so-called “easy” oil sites dry up and firms are forced to search for resources in remote, difficult locations.

Royal Dutch Shell is spending on average £650 million a year compiling big data to analyse large numbers of sites.

It uses subsurface imaging and models reservoir management, the interaction of chemicals and how hydrocarbons will flow through rock, all of which produces highly visual results. Thousands of servers and tens of thousands of high-performance computing cores process the information.

“Drilling a single well offshore can cost up to $100 million (£65 million), therefore you had better be drilling in the right place,” says Arjen Dorland, executive vice president of technical and competitive IT at the company.

In the industry, the presence of oil is typically assessed through seismic sensors, which are usually limited in number, affecting accuracy. Shell is testing the use of millions of seismic sensors simultaneously on a single site.
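
As a toy illustration of why sensor density matters, the sketch below averages invented reflection readings over a coarse grid: the more sensors sample a cell, the steadier its estimate becomes. Real seismic processing involves far more elaborate imaging and inversion, and nothing here reflects Shell’s actual methods.

```python
# Toy illustration only: grid cells sampled by more sensors yield steadier
# averaged signal estimates. Readings and grid are invented.
from collections import defaultdict
from statistics import mean

# (grid_x, grid_y, reflection_amplitude) from hypothetical sensors
readings = [
    (0, 0, 0.42), (0, 0, 0.47), (0, 1, 0.12),
    (1, 0, 0.81), (1, 0, 0.78), (1, 1, 0.09),
]

cells = defaultdict(list)
for x, y, amplitude in readings:
    cells[(x, y)].append(amplitude)

# Rank cells by average amplitude and show how densely each was sampled.
for (x, y), amps in sorted(cells.items(), key=lambda kv: -mean(kv[1])):
    print(f"cell ({x},{y}): mean amplitude {mean(amps):.2f} "
          f"from {len(amps)} readings")
```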

Predicting that it will hold more than 450 petabytes of data within three years, the company has put “technical and competitive IT” at the heart of the business, says Mr Dorland. Targeted refinements to technology and workflow management are credited with making Shell’s algorithms around 100 times more effective.

“It’s unique in the oil and gas industry to have technical and competitive IT activities stripped out from general IT services, and put alongside research technology and engineering. It recognises the era of big data,” he says.

Different industries are well aware of the potential in making sense of their big datasets. What was once a gamble for businesses, led by daring traders and bankers, is now an operational reality.

As a result, confidence in new technology is growing strongly, according to Mr Robinson at CompTIA: “Some 93 per cent of companies expect data to be important to the success of their business over the next two years,” he says, “showing that a wide range of solutions will likely be in play.”

Analysis methods may differ by industry, but one thing is clear – they are always geared towards improving processes, winning customers and increasing profit. For as long as big data can be tied to the bottom line, it will remain popular.