Ushering in a third era of analytics

Professor Thomas H. Davenport describes a new, third phase of analytics which combines big and small, structured and unstructured data from multiple sources

What do big companies do with big data? Much of what has been said about big data until now has come from online firms, such as Google, eBay, LinkedIn and Facebook, and start-ups in data-intensive industries. These companies were built around big data from the beginning. No integration with existing architectures or processes was necessary. Big data could stand alone, big data analytics could be the only focus of analytics and big data technology architectures could be the only architecture.

In a research project with Jill Dyche of SAS, however, I studied the big data activities of more than 20 large, well-established businesses, including UPS, Wells Fargo, United Healthcare and other giants. All are doing something with big data, but large-volume, high-velocity and unstructured data in those organisations must be integrated with everything else that’s going on in the company. Overall, we found the expected co-existence; in not a single one of these large companies was big data being managed separately from other types of data and analytics – it’s all blended together.

Some question whether big data is even new. The topic may be unfamiliar to much of the world, but many data-focused executives in large firms view it as something they have been wrestling with for years. Some managers appreciate the innovative nature of big data, but more find it “business as usual” or part of a continuing evolution towards more data. Even so, they are impressed by the unstructured data they are now able to manage and by the opportunity-to-cost ratio of big data technologies.

There are also continuing, if less dramatic, advances from the use of more structured data from sensors and operational data-gathering devices. Companies such as GE, UPS and Schneider National are increasingly putting sensors into things that move or spin, and capturing the resulting data to optimise their businesses. Even small benefits provide a large pay-off when adopted on a large scale.

Like many new information technologies, big data can bring about dramatic cost reductions, substantial improvements in the time required to perform a computing task, or new product and service offerings. Like traditional analytics, it can also support internal business decisions. Most of the companies we interviewed had a specific benefit in mind. Each benefit choice has implications for the leadership of the big data initiative and the way that benefits are managed.

Big data introduces some highly specialised technologies that set it apart from legacy systems. These are optimised around the large, unstructured and semi-structured nature of big data. You may have heard about Hadoop, an open-source framework for splitting big data processing across clusters of inexpensive servers, but there are other technologies for storing, analysing and reporting results from big data, many of which are open source. Big data technology architectures must increasingly interface with more traditional business intelligence and reporting architectures. There are now multiple options for where to store data for analysis, including commodity server clusters, data discovery environments, data warehouses and marts. These offer varying degrees of data protection and security, as well as different levels of access by users.
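The idea behind Hadoop can be sketched in a few lines. The toy word-count below is illustrative only, assuming the classic MapReduce pattern; the function names are ours, not Hadoop’s actual API, and each Python list stands in for the data shard held on one cheap server.

```python
from collections import defaultdict
from itertools import chain

def map_phase(shard):
    # Map: each "server" independently turns its shard of lines
    # into (word, 1) key-value pairs
    return [(word, 1) for line in shard for word in line.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework would do
    # when routing intermediate results between nodes
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: combine each key's values into a final count
    return {key: sum(values) for key, values in grouped.items()}

# Two hypothetical shards, one per inexpensive server
shards = [["big data big"], ["data splits across servers"]]
pairs = chain.from_iterable(map_phase(s) for s in shards)
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'splits': 1, 'across': 1, 'servers': 1}
```

The point of the pattern is that the map step needs no coordination between servers; only the shuffle and reduce steps merge results, which is what lets the work spread across many cheap machines.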

As with technology architectures, organisational structures and skills for big data in big companies are evolving, and integrating with existing structures, rather than being established anew. There is no separate “big data department”; instead, existing analytics or technology groups are adding big data functions and data science skills to their missions. Data scientists, who excel at extracting and structuring data, are working with conventional quantitative analysts, who excel at modelling it. The combined teams are doing whatever is necessary to get the analytical job done. The companies we interviewed have several different approaches to finding scarce data scientists, and some are also worried about establishing data-savvy leadership. However, there doesn’t seem to be the data scientist hiring frenzy in these companies that you find in Silicon Valley.

The combination of big data and traditional analytics is leading to a new paradigm for managing analytics, which I call “Analytics 3.0.” Traditional back-office analytics were the 1.0 generation and big data analytics, first pursued by Silicon Valley firms around 2005, constituted Analytics 2.0. A 3.0 version is increasingly necessary now that the data-driven economy applies not only to online business, but to virtually any type of firm in any industry. Some of the other attributes of Analytics 3.0 include:

• Organisations are combining large and small volumes of data, internal and external sources, and structured and unstructured formats to yield new insights in predictive and prescriptive models;

• Analytics are supporting both internal decisions, and data-based products and services for customers;

• Faster technologies, such as in-database and in-memory analytics, are being coupled with “agile” analytical methods and machine-learning techniques that produce insights at a much faster rate;

• Many analytical models are being embedded into operational and decision processes, dramatically increasing their speed and impact;

• Companies are beginning to create “chief analytics officer” roles or equivalent titles to oversee the building of analytical capabilities;

• Tools that support particular decisions are being pushed to the point of decision-making in highly targeted and mobile “analytical apps”;

• Analytics are now central to many organisations’ strategies; a survey I recently worked on with Deloitte found that 44 per cent of executives feel that analytics is strongly supporting or driving their companies’ strategies.

Even though it hasn’t been long since the advent of big data – a decade or so for online firms and less for big companies in other industries – these attributes add up to a new era. Some aspects of this new world will no doubt continue to emerge, but organisations need to begin transitioning now to the new integration of big and small data. The new model means change in skills, leadership, organisational structures, technologies and architectures. Together, these represent the most sweeping change in how we get value from data since the 1980s.

Thomas H. Davenport is the President’s Distinguished Professor at Babson College, Massachusetts, a research fellow at the MIT Center for Digital Business and author of four books on analytics, including Competing on Analytics: The New Science of Winning.