The Fallacies of Big Data

No software, not even Hadoop, can make sense out of anything

The biggest problem with software is that it doesn’t do us any good at all unless our wetware is working properly – and unfortunately, the wetware which resides between our ears is limited, fallible, and insists on a good Chianti every now and then.

Improving our information technology, alas, only exacerbates this problem. Case in point: Big Data. As we’re able to collect, store, and analyze data sets of ever increasing size, our ability to understand and process the results of such analysis putters along, occasionally falling into hidden traps that we never even see coming.

I’m talking about fallacies: widely held beliefs that are nevertheless quite false. While we like to think of ourselves as creatures of logic and reason, we all fall victim to misperceptions, misjudgments, and miscalculations far more often than we care to admit, often without even realizing we’ve lost touch with reality. Such is the human condition.

Combine our natural proclivity to succumb to popular fallacies with the challenge of getting our wetware around just how big Big Data can be, and you have a recipe for disaster. But the good news is that there is hope. The best way to avoid an unseen trap in your path is to know it’s there. Fallacies are easy to avoid if you recognize them for what they are before they mislead you.

The Lottery Paradox
The first fallacy to recognize – and thus avoid – is the lottery paradox. The lottery paradox states that people place an inordinate emphasis on improbable events. Nobody would ever buy a lottery ticket if they based their decision to purchase on the odds of winning. As the probability of winning drops to extraordinarily low values (for example, the chance of winning the Powerball jackpot is less than 1 in 175,000,000), people simply lose touch with the reality of the odds.

Furthermore, it’s important to note that the chance someone will win the jackpot is relatively high, simply because so many tickets are sold. People erroneously correlate these two probabilities as though they were somehow comparable: “someone has to win, so why not me?” we all like to say, as we shell out our $2 per ticket. Assuming tens of millions of people were to read this article (I should be so lucky!) then it would be somewhat likely that some member of this impressive audience will win the lottery. But sorry to say, it won’t be you.
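The two probabilities being conflated here are easy to compute directly. A minimal sketch, using the 1-in-175,000,000 figure from the text; the number of tickets sold is an illustrative assumption:

```python
# The lottery paradox in two numbers: the chance that *you* win is
# vanishingly small, while the chance that *someone* wins is not.

p_you_win = 1 / 175_000_000          # odds of a single ticket winning
tickets_sold = 50_000_000            # assumed number of tickets in play

# P(at least one winner) = 1 - P(no ticket wins)
p_someone_wins = 1 - (1 - p_you_win) ** tickets_sold

print(f"P(you win):      {p_you_win:.2e}")
print(f"P(someone wins): {p_someone_wins:.1%}")
```

With 50 million independent tickets, "someone wins" comes out around one in four, roughly ten million times more likely than "you win" – two probabilities that have nothing to do with each other.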

The same fallacy can crop up with Big Data. As the size of Big Data sets explodes, the chance of finding a particular analytical result, in other words, a specific “nugget of wisdom,” becomes increasingly small. However, the chance of finding some interesting result is quite high. Our natural tendency to conflate these two probabilities can lead to excess investment in the expectation of a particular result. And when we don’t get the result we’re looking for, we wonder whether we’ve wasted all the money we sank into our Big Data tools.

Another way of looking at the lottery paradox goes under the name the law of truly large numbers. Essentially, this law states that if your sample size is very large, then any outrageous thing is likely to happen. And with Big Data, our sample sizes can be truly enormous. With the lottery example, we have a single outrageous event (I win the lottery!) but in a broader context, any outrageous result will occur as long as your data sets are large enough. But just because we’re dealing with Big Data doesn’t mean that outrageous events are any more likely than before.
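The law of truly large numbers takes only a few lines of arithmetic to demonstrate. In this sketch, the "one in a million" event probability and the record counts are illustrative assumptions:

```python
# A "one in a million" event, applied to Big Data-sized record counts,
# stops being rare: the expected number of occurrences grows linearly
# with the number of records.

p_outrageous = 1e-6                  # a "one in a million" event
for n in (1_000, 1_000_000, 1_000_000_000):
    expected = n * p_outrageous
    p_at_least_one = 1 - (1 - p_outrageous) ** n
    print(f"n={n:>13,}: expect {expected:>8.3f} occurrences, "
          f"P(at least one) = {p_at_least_one:.1%}")
```

At a billion records, the same "outrageous" event is expected to show up about a thousand times, which is exactly why startling patterns in huge data sets are, by themselves, unremarkable.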

The Fallacy of Statistical Significance
Anybody who’s ever wondered how political pollsters can draw broad conclusions about popular opinion based upon a small handful of people knows that statistical sampling can lead to plenty of monkey business. Small sample sizes lead to large margins of error, which in turn can lead to statistically insignificant results. For example, if candidate A is leading candidate B by 2%, but the margin of error is 5%, then the 2% is insignificant – there’s a very good chance the 2% is the result of sampling error rather than a reflection of the population at large. For a lead to be significant, it has to be a bit more than the margin of error. So if candidate A is leading by, say, 7%, we can be reasonably sure that lead reflects the true opinion of the population.

So far so good, but if we add Big Data to the mix, we have a different problem. Let’s say we up the sample size from a few hundred to a few million. Now our margins of error are a fraction of a percent. Candidate A may have a statistically significant lead even if it’s 50.1% vs. 49.9%. But while a 7% lead might be difficult to overcome in the few weeks leading up to an election, a 0.2% lead could easily be reversed in a single day. Our outsized sample size has led us to place too much stock in the notion of statistical significance, because it no longer relates to how we define significance in a broader sense.
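The standard margin-of-error formula shows just how quickly sample size shrinks the error bars. A sketch, using the usual normal approximation for a proportion (the sample sizes are illustrative):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

small_poll = margin_of_error(1_000)      # a typical opinion poll
big_data   = margin_of_error(4_000_000)  # a Big Data-sized sample

print(f"n = 1,000:     ±{small_poll:.1%}")   # about ±3%
print(f"n = 4,000,000: ±{big_data:.3%}")     # a tiny fraction of a percent
```

At four million respondents the margin of error drops below 0.05%, so a 50.1% vs. 49.9% split is "statistically significant" even though it tells you nothing durable about the race.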

The way to avoid this fallacy is to make proper use of sampling theory: even when you have immense Big Data sets, you may want to take random samples of a manageable size in order to obtain useful results. In other words, fewer data can actually be better than more data. Note that this sampling approach flies in the face of exhaustive processing algorithms like the ones that Hadoop is particularly good at, which are likely to lead you directly into the fallacy of statistical significance.
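In practice this can be as simple as drawing a fixed-size random sample before analysis. A minimal sketch; the function name, sample size, and stand-in data set are illustrative:

```python
import random

def representative_sample(records, k=10_000, seed=1):
    """Draw a fixed-size random sample instead of exhaustively scanning every record."""
    rng = random.Random(seed)           # fixed seed for reproducible analysis
    if len(records) <= k:
        return list(records)
    return rng.sample(records, k)       # sampling without replacement

population = range(10_000_000)          # stand-in for a Big Data set
sample = representative_sample(population)
print(len(sample))  # 10000
```

The point is the inversion of the usual Hadoop reflex: rather than throwing every record at the cluster, you deliberately analyze a manageable subset whose margin of error you understand.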

Playing with Numbers
Just as people struggle to grok astronomically small probabilities, they also struggle to get their heads around very large numbers. Inevitably, they end up resorting to some wacky metaphor, usually an astronomical comparison involving stacks of pancakes to the moon or some such. Such metaphors can help people understand large numbers – or they can simply confuse or mislead. Add Big Data to the mix and you suddenly have the power to sow misinformation far and wide.

Take, for example, the NSA. In a document released August 9th, the NSA explained that:

According to the figures published by a major tech provider, the Internet carries 1,826 Petabytes of information per day. In its foreign intelligence mission, NSA touches about 1.6% of that. However, of the 1.6% of the data, only 0.025% is actually selected for review. The net effect is that NSA analysts look at 0.00004% of the world’s traffic in conducting their mission – that’s less than one part in a million. Put another way, if a standard basketball court represented the global communications environment, NSA’s total collection would be represented by an area smaller than a dime on that basketball court.

Confused yet? Let’s pick apart what this paragraph is actually saying, and you be the judge. The NSA claims to be analyzing 1.6% of 1,826 petabytes per day, which works out to roughly 29 petabytes per day – call it 30,000 terabytes. (29 petabytes per day also works out to over 10 exabytes per year. Talk about Big Data!)

When they say they select 0.025% (one fortieth of a percent) of this 30,000 terabytes per day for review, what they’re saying is that their automated Big Data analysis algorithms give them 7.5 terabytes of results to process manually, every day. To place this number in context, assume that those 7.5 terabytes consisted entirely of telephone call detail records, or CDRs. Now, we know that the NSA is analyzing far more than CDRs, but we can use CDRs to do a little counter-spin of our own. Since a rule of thumb is that an average CDR is 200 bytes long, 7.5 terabytes represents records of some 37.5 billion (37,500,000,000) phone calls, or about five phone calls per day for each person on earth.
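This arithmetic is easy to check with a short script. The 200-byte CDR size and a world population of roughly seven billion are the rule-of-thumb assumptions used here; everything else comes from the NSA's published percentages:

```python
# Walking the NSA's published figures through, step by step.

PB = 10 ** 15                              # one petabyte, in bytes

internet_per_day = 1_826 * PB              # total daily Internet traffic
touched  = internet_per_day * 0.016        # NSA "touches" 1.6%
reviewed = touched * 0.00025               # 0.025% selected for review

cdr_bytes = 200                            # rule-of-thumb size of one call record
world_pop = 7_000_000_000                  # rough 2013 world population

calls_reviewed = reviewed / cdr_bytes
calls_per_person = calls_reviewed / world_pop

print(f"Touched per day:  {touched / PB:.1f} PB")     # ~29 PB
print(f"Reviewed per day: {reviewed / 1e12:.1f} TB")  # ~7.3 TB
print(f"CDR-equivalents:  {calls_reviewed:.2e}")      # tens of billions
print(f"Per person/day:   {calls_per_person:.1f}")    # about 5
```

Same inputs, two very different framings: a dime on a basketball court, or several phone-call records per day for every person alive.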

So, which is a more accurate way of looking at the NSA data analysis: a dime on a basketball court, or five phone calls per day for each man, woman, and child on the planet? The answer is that both comparisons are skewed to prove a point. You should take any such explanation of Big Data with a Big Data-sized grain of salt.

The ZapThink Take
Perhaps the most pernicious fallacy to target Big Data is the “more is better” paradox: the false assumption that if a certain quantity of data is good, then more data are necessarily better. In reality, more data can actually be a bad thing. You may be encouraging the creation of duplicate or incorrect data. The chance your data are redundant goes way up. And worst of all, you may be collecting increasing quantities of irrelevant data.

In our old, “small data” world, we were careful what data we collected in the first place, because we knew we were using tools that could only deal with so much data. So if you wanted, say, to understand the pitching stats for the Boston Red Sox, you’d start with only Red Sox data, not data from all of baseball. But now it’s all about Big Data! Let’s collect everything and anything, and let Hadoop make sense of it all!

But no software, not even Hadoop, can make sense out of anything. Only our wetware can do that. As our Big Data sets grow and our tools improve, we must never lose sight of the fact that our ability to understand what the technology tells us is a skill set we must continue to hone. Otherwise, not only are the data fooling us, but we’re actually fooling ourselves.

Image credit: _rockinfree

 

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).
