Saturday, October 4, 2014

Getting value from Big Data requires the same discipline as any other business investment.

Surface-level reports might lead us to believe that the Big Data "revolution" brings alchemy in our time. A look below the surface, however, suggests that the path to Big Data value is not without challenges. For example:

  • Information Week recently reported that "the average company right now is getting a return of about 55 cents on the dollar" invested in Big Data analytics.  
  • A related survey by Wikibon found that "46% of Big Data practitioners report that they have only realized partial value from their Big Data deployments," while "an unfortunate 2% declared their Big Data deployments total failures..."
  • Information Week also summarized a survey by IDG and Kapow Software revealing that "...just 23% of [business] leaders see big data projects as 'successes' thus far."
Moreover, organizations find that even measuring Big Data return on investment (RoI) remains challenging.

Gartner's "Emerging Technologies" Hype Cycle Report places Big Data at the Peak of Inflated Expectations. The consultancy also predicts that five to ten years remain before Big Data reaches the "Plateau of Productivity" stage. This positioning is telling.


What, then, are the obstacles to Big Data value? Analysts suggest that talent scarcity, technology maturity, and less-than-perfect alignment between Big Data initiatives and business priorities are among the culprits. I focus here on three other barriers:
  • Failure to treat business issues scientifically;
  • Limitations in data itself; and
  • Cultural disinclinations to fact-based decision making.

Failure to treat business issues scientifically.

Big Data simply represents the latest wave of business technology. The authors of the IT Infrastructure Library™² observed that an organization's technology management style influences its ability to derive value: organizations that focus on technology's strategic contribution get more out of it. Ross, et al,³ made similar observations. Technology can either "fossilize" an organization, or it can provide a platform for "dynamic capabilities."⁴

The same is true for analytics.  Lavalle, et al,⁵ published what remains one of the more useful roadmaps to Big Data value.  Davenport, et al,⁶ echo similar concepts. Three of Lavalle's five central recommendations apply here:
  • First, think biggest: focus on the highest-value opportunities;
  • Start in the middle: within each opportunity, start with questions, not data; and
  • Build the parts, plan the whole: define an information agenda to plan for the future.
Lavalle and Davenport each portray analytics, in not so many words, as a scientific approach to addressing business challenges.

What is the state of the practice of Big Data analytics today? A twenty-year-old issue of Scientific American⁷ gives us a thought model: technology professions proceed through stages of maturity, from craft to commercialization to professionalism. The emergence of standard, repeatable practices represents a key milestone.

Sets of best practices for Business Analytics have begun to emerge. Figure 1 summarizes four of them. Three are data-centric methods: patterns for applying the scientific method to data. Two of these⁸,⁹ were developed by vendors to aid in the use of their analytics software; the third¹⁰ originated in academia. Davenport¹¹ describes a fourth, business-centric method.

Figure 1 — Several best-practice methods for data analytics have emerged. The data-centric methods address business problems only obliquely, at best.

What does this say about the state of the Business Analytics profession? Many of the leading practices focus analysts on business problems only incidentally, at best. Would-be adopters of Big Data analytics apply scientific approaches to data, but not necessarily to the business questions they are asked to answer.

How does the lack of a scientific approach to the business impede RoI from Big Data investments? Figure 2 illustrates: haphazard treatment of business priorities by analytics practitioners leaves organizations in the "Sandbox" or "Science Project" modes.


Figure 2 — RoI from Big Data analytics investments depends on scientific treatment of business as well as the data.

Getting value from Big Data depends on scientific treatment of the business as well as the data. Due diligence in Big Data investments demands the same scrutiny and discipline as any other major business investment. No organization would launch a marketing strategy without first considering how it intends to differentiate itself. Investing in Big Data demands the same focus.

Limitations in the data themselves.

A recent contributor to Harvard Business Review postulates that "...advanced algorithms can take a near-unlimited number of factors into account...."¹² This perception of Big Data walks a fine line. The temptation to attribute alchemical powers to Big Data analytics appears pervasive. Yet fundamental limitations underlie data science, on which Big Data analytics is based, in the same way that weight and drag constrain aerodynamics.

But what about the Big Data "revolution"? Experiences with and perceptions of consumer data mining might produce misleading intuition. Consumer analytics is not a haphazard "search for serendipity" in a random "stack" of Big Data. Our consumer devices and applications are carefully instrumented to collect precisely the information that predicts our spending behaviors.

What kinds of data limitations can impede analytics RoI? This blog has already considered two. I illustrated how simply adding more data does not increase the information analytics can produce: adding more factors to a model only helps if they contain useful information that the previous ones lacked.
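A minimal sketch makes the point concrete (my illustration, not from the original post; the variable names and the toy regression are assumptions). A near-duplicate of a factor the model already has adds essentially nothing, while a genuinely new factor does:

```python
# Sketch: more columns are not more information (illustrative toy example).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2_000
x1 = rng.normal(size=n)                        # factor the model already uses
x2 = rng.normal(size=n)                        # genuinely new information
y = 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)   # outcome depends on both

x1_copy = x1 + rng.normal(scale=0.01, size=n)  # "more data": a near-duplicate of x1

def holdout_r2(*features):
    """Fit a linear model on a training split and score it on a held-out split."""
    X = np.column_stack(features)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = LinearRegression().fit(X_tr, y_tr)
    return r2_score(y_te, model.predict(X_te))

print("x1 alone:         R^2 = %.3f" % holdout_r2(x1))
print("x1 + near-copy:   R^2 = %.3f" % holdout_r2(x1, x1_copy))  # no real gain
print("x1 + new factor:  R^2 = %.3f" % holdout_r2(x1, x2))       # real gain
```

The redundant column leaves out-of-sample fit essentially unchanged; only the factor carrying new information moves it.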

Poor data quality also degrades analytics outputs: analytics models are subject to the "Garbage-in/Garbage-out" syndrome. I looked at five data quality attributes that influence the usefulness of analytics outputs. Neglecting data quality can land organizations in the "Lies, damn lies, and statistics" quadrant of Figure 2.
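To illustrate (again my own sketch, not the blog's; the sentinel-value defect is an assumed example of a quality problem, not one of the five attributes discussed previously), a small share of missing values left encoded as a sentinel code is enough to ruin an otherwise easy model:

```python
# Sketch: garbage in, garbage out (illustrative toy example).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 2_000
x = rng.normal(size=n)
y = 3.0 * x + rng.normal(scale=0.5, size=n)

def fit_and_score(x_train):
    """Fit on the (possibly dirty) feature; judge predictions against the true outcome."""
    model = LinearRegression().fit(x_train.reshape(-1, 1), y)
    return r2_score(y, model.predict(x.reshape(-1, 1)))

print("clean feature:        R^2 = %.3f" % fit_and_score(x))

x_dirty = x.copy()
bad = rng.choice(n, size=n // 20, replace=False)   # 5% of records affected
x_dirty[bad] = -999.0                              # missing values left as a sentinel code
print("5%% sentinel values:   R^2 = %.3f" % fit_and_score(x_dirty))
```

The handful of sentinel values dominates the feature's variance, flattens the fitted slope, and drives the model's explanatory power toward zero.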

Finally, analytics may at best yield information that is already known. Organizations may not need Big Data to improve operational optimization, for example: responding to competitive market forces may already have driven them to optimize their operations. Under such a scenario, analytics would simply validate that the organization has already achieved optimality.

What then do we do about our data?  We adopt an approach that applies the scientific method jointly to our business and our data. A disciplined, scientific method leads us to the truth about the value of opportunities hidden in our data. It also reveals new information-collection needs.

Cultural disinclinations to fact-based decision making.

Lavalle⁵ observes that "the adoption barriers that face most organizations are managerial and cultural rather than related to data and technology." Deploying technology is the investment side of the Big Data RoI equation; driving return often involves changing the business.

The Data Warehousing Institute (TDWI), an organization of technical professionals dedicated to Enterprise Information Management, surveyed its membership about the closely related imperative of aligning the management of data to strategic business priorities.¹³ The first technology-related issue did not appear until fifth place among the responses. The four highest-ranking responses were:
  • Data ownership and other politics;
  • Lack of governance or stewardship;
  • Lack of business sponsorship; and
  • Unclear business goals for data.
Effectively executed Big Data analytics, through joint scientific management of the business as well as the data, can provide the "What" part of answers to pressing business questions. I illustrated this previously by showing how customer analytics might provide the basis for a targeted marketing campaign.

The "So What" part of business questions' answers addresses how the organization responds. Information's economic value is based on the value of the opportunity arising from its optimum application. Optimally applying information often requires the people in an organization to change how they operate. They must change what they measure about their business, and what they do with those measurements. These changes are scary and hard!

Kotter¹⁴ may in the end tell us as much about driving RoI from Big Data as Hastie¹⁵ or any other technical text. Harvesting analytics' value depends as much on changing an organization's behavior as it does on tools and technology. 



References

¹ Gartner's 2013 installment of "Hype Cycle for Emerging Technologies," http://goo.gl/a4xlEY.
² ITIL™ Service Operation, U.K. Office of Government Commerce, 2007, Figure 5.1, p. 81.
³ J. W. Ross, P. Weill, and D. C. Robertson, Enterprise architecture as strategy, Boston: HBR Press, 2006, http://goo.gl/B7J5P8.
⁴ C. E. Helfat, et al, Dynamic capabilities, Wiley, 2007, http://goo.gl/Gn6cjC.
⁵ S. Lavalle, et al, "Big data, analytics, and the path from insights to value," MIT Sloan Management Review, Winter 2011, http://goo.gl/8RSn5H.
⁶ T. H. Davenport, J. G. Harris, and R. Morison, Analytics at work, Boston: HBR Press, 2010, http://goo.gl/olZkKm.
⁷ W. W. Gibbs, "Software's chronic crisis," Scientific American, September 1994, pp. 86 - 95.
⁸ IBM SPSS Modeler CRISP-DM Guide, IBM Corporation, http://goo.gl/4Gg7Pa.
⁹ Enterprise Miner™ SEMMA Method, SAS Corporation, http://goo.gl/8ig4RX.
¹⁰ "The KDD Process for Extracting Useful Knowledge from Volumes of Data," Communications of the ACM, November 1996, http://goo.gl/s4gvDd.
¹¹ T. H. Davenport, "Keeping up with your quants," Harvard Business Review, July-August 2013, http://goo.gl/BrWpD1.
¹² T. C. Redman, "Algorithms make better predictions — Except when they don't," HBR Blog Network, September 17, 2014, http://goo.gl/n0kPJd.
¹³ "TDWI Technology Survey: Enterprise Data Strategies," Business Intelligence Journal, Vol 18, No. 2, March 2013.
¹⁴ J. P. Kotter, Leading change, Boston: HBR Press, 1996, http://goo.gl/EqglOJ.
¹⁵ T. Hastie, R. Tibshirani, and J. Friedman, Elements of statistical learning, New York: Springer, 2009, http://goo.gl/23tclz.

© The Quant's Prism, 2014
