Big Data Challenge 1: Tackling Staggering Complexity

Sutherland Editorial

A version of this article was first published on

Complexity and big data are inherently linked. According to a recent survey conducted by CrowdFlower, three out of every five data scientists spend most of their day cleaning and organizing data.

That complexity is matched only by the complexity of the technology needed to handle the data and of the algorithms required to analyze it. Big data's well-known defining characteristics are its volume, velocity, and variety. None of these is complex on its own; it is their interaction that engenders big data's intrinsic complexity.

But even after you've conquered the hardwired complexity of the technology that handles the data, your data is useless unless you also master the algorithmic complexity needed to dig out the insights buried within it. If you lack the sophisticated data science and the human talent needed to extract trends, patterns, and predictions from your data, it has no business value.

Sutherland's Chief of Analytics, Phani Nagarjuna, outlines a five-step method to help you break through the complexity and make the most of your big data:

Methodology: Apply an age-old method to the volume, variety, and velocity of big data: break its daunting complexity into smaller parts. To do this, focus on one business problem at a time. By narrowing your focus, you turn one big data problem into several smaller, more manageable data problems.

Machine: Once you've identified your goal and the corresponding subset of data, leverage machine automation to reach insights faster for that specific business challenge or goal. An organization can use machine automation to do 60 to 70 percent of the heavy lifting on repeatable, scalable processes.

Manpower: Let the intelligence and experience of skilled human analysts translate and apply the findings to meet business objectives. Human intelligence is still the best way of working with context-dependent elements. As McKinsey Quarterly notes, the machine can provide the pieces, but human synthesis is necessary to assemble the puzzle.

Repeat: By implementing a methodology that focuses on a particular business problem, capturing relevant and actionable data with the machine, and contextualizing the results with skilled manpower, the organization can begin again and take on the next problem.

Don’t lose sight of the big picture: By dealing with one business problem at a time, the intrinsic complexity of big data—volume, variety and velocity—becomes manageable.
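The workflow above can be sketched as a simple loop. This is a minimal illustration, not part of the original method: the problem list and the `automate` and `synthesize` functions are hypothetical stand-ins for an organization's own tooling and analysts.

```python
from typing import Callable, Dict, List

def tackle_backlog(problems: List[str],
                   automate: Callable[[str], Dict],
                   synthesize: Callable[[Dict], str]) -> List[str]:
    """Work through business problems one at a time:
    the machine does the repeatable heavy lifting, and a
    human analyst contextualizes each result before moving on."""
    insights = []
    for problem in problems:            # Methodology: one problem at a time
        findings = automate(problem)    # Machine: automated, repeatable analysis
        insight = synthesize(findings)  # Manpower: human synthesis and context
        insights.append(insight)        # Repeat: proceed to the next problem
    return insights
```

Scoping each iteration to a single problem is what keeps the overall volume, variety, and velocity manageable: the machine step stays repeatable, and the human step stays focused.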

The next part of this series, “Corporate Consumerization,” will examine how a consumer-market orientation of analytics technologies allows organizations to deliver actionable insights and extract immediate value.