Why Big Data Is More About Variety and Less About Volume

Analytics can dramatically transform enterprises and organizations so long as the focus is on quality rather than quantity

No matter where you turn today, it appears that everyone is speaking about Big Data and how it drives customer insight. Sites like this often have a dedicated section for "Big Data" news, and my inbox is full of whitepapers, webinar invitations, and advertisements for new and exciting Big Data tools from different IT companies.

Companies have been collecting and analyzing data since the 1950s, but until recently the data points collected were basic, capturing the who, when, and where of consumer behavior while missing the nuances of the how and why. People often mistakenly think that the concept of Big Data is all about how much data is collected, not how diverse the data is.

Broadly speaking, there have long been three basic kinds of analytics:

  • Predictive analytics uses data collected in the past to forecast what is likely to happen in the future.
  • Descriptive analytics reports on consumer activity that has already occurred.
  • Prescriptive analytics uses models to identify and recommend ideal activities and behaviors for the future.
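The distinction between the three types can be illustrated with a deliberately tiny sketch; the monthly sales figures and helper functions here are hypothetical, not drawn from any real system:

```python
# Toy monthly sales for one product (hypothetical numbers).
sales = [100, 110, 120, 130]  # units sold in months 1-4

# Descriptive analytics: report on what has already happened.
def describe(history):
    return {"total": sum(history), "average": sum(history) / len(history)}

# Predictive analytics: use the past to project the future
# (here, a naive linear trend over the last two months).
def predict_next(history):
    return history[-1] + (history[-1] - history[-2])

# Prescriptive analytics: use a model to recommend an action.
def prescribe(history, capacity):
    forecast = predict_next(history)
    return "increase stock" if forecast > capacity else "hold stock"

print(describe(sales))        # what happened: {'total': 460, 'average': 115.0}
print(predict_next(sales))    # what will likely happen: 140
print(prescribe(sales, 135))  # what to do about it: 'increase stock'
```

Real systems replace the naive trend with statistical or machine-learned models, but the division of labor between the three types stays the same.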

Today's analytics efforts draw on all three types, and prescriptive analytics is becoming the main focus of enterprises of every kind.

Each of these analytics models involves large-scale testing and optimization, and this is where Big Data comes in: not as quantity but as quality. Rich, varied data enables data scientists and analysts to determine how best to incorporate analysis into key processes, placing the priority on quality planning and execution as well as operational benefits.

Old-school thinking, on the other hand, is about crunching volumes of customer data to calculate the lifetime value of a customer, to understand purchase patterns and demand, and to identify key customer clusters so they can be addressed in relevant ways. That is all useful, and still relevant, but it is essentially what any good analytical CRM will do. Big Data, by contrast, allows companies to analyze a consumer's "digital footprint" regardless of whether the interaction, interest, or purchase takes place online, in a brick-and-mortar store, at a kiosk or ATM, or through social media commentary. Collecting data points from each of these interactions is what makes data "Big Data." The ability to store and assimilate this information is what makes it so valuable for behavioral analytics and targeting; such deeply layered information is of great interest to enterprises and their advertisers, as well as to governments and public service organizations.
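As a rough sketch of what assembling such a cross-channel footprint might look like, the snippet below groups interaction events from several channels into one profile per customer; the channels, event fields, and customer IDs are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical interaction events captured from several channels.
events = [
    {"customer": "c1", "channel": "web",    "action": "search",   "topic": "camping gear"},
    {"customer": "c1", "channel": "store",  "action": "purchase", "topic": "essential oils"},
    {"customer": "c2", "channel": "mobile", "action": "search",   "topic": "yoga clothes"},
    {"customer": "c1", "channel": "social", "action": "comment",  "topic": "home schooling"},
]

def build_footprints(events):
    """Merge every channel's events into a single profile per customer."""
    profiles = defaultdict(lambda: {"channels": set(), "topics": []})
    for e in events:
        profile = profiles[e["customer"]]
        profile["channels"].add(e["channel"])
        profile["topics"].append(e["topic"])
    return dict(profiles)

footprints = build_footprints(events)
print(footprints["c1"]["channels"])  # {'web', 'store', 'social'}
```

The point of the sketch is that the same customer appears across web, store, and social channels, and only the merged view reveals the full picture.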

The consumer who searches on her home computer for home-schooling curricula, organic products, essential oils, and family camping gear may be a better target for a zero-interest loan on a hybrid SUV than the consumer researching yoga clothes and compact electric cars on her mobile device, yet both may be equally good candidates for a low-cost eco-adventure vacation in Costa Rica, depending on other preferences and interests revealed by the collected data.
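One simple way to sketch that kind of interest-based targeting is to score each consumer's observed topics against an offer's keyword profile; the shoppers, offer, and keywords below are made up for illustration:

```python
def score_offer(topics, offer_keywords):
    """Count how many of a consumer's observed topics match an offer's keywords."""
    return sum(1 for topic in topics if topic in offer_keywords)

# Hypothetical interest profiles collected from two consumers' footprints.
shopper_a = ["home schooling", "organic products", "essential oils", "camping gear"]
shopper_b = ["yoga clothes", "compact electric cars"]

# Keyword profile for a hypothetical eco-adventure vacation offer.
eco_vacation = {"camping gear", "organic products", "yoga clothes", "essential oils"}

print(score_offer(shopper_a, eco_vacation))  # 3
print(score_offer(shopper_b, eco_vacation))  # 1
```

Both shoppers score above zero, so both are plausible candidates for the offer even though their individual interests barely overlap; production targeting systems use weighted models rather than raw counts, but the principle is the same.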

Likewise, in areas where consumer internet searches and interactions at stores, kiosks, and ATMs indicate higher levels of poverty, unemployment, or distress, public service organizations, governments, and discount retailers can use the data to assist and target accordingly.

Examples of the myriad uses for Big Data analytics, and of its power, are nearly endless, and enterprises are learning more every day about how to use these data points to best serve their customers.

The real value of Big Data lies not in the data itself, but in the processing and analysis that yield insight and, from that insight, new products and services.

The tremendous advances in Big Data technologies and approach to management must be accompanied by comparable shifts in how that data supports decisions around product and service innovation.

CIOs and CEOs of enterprises and organizations alike are looking to the future and seeking the capacity to integrate data storage, assessment, reporting, analytics, security, and reclamation functions on a single platform, thus removing the necessity for complicated programming and specialized skills to join legacy systems together.

As Big Data technologies advance, more and more tech companies strive to provide solutions to these issues.

One such solution to come to the forefront is the MongoDB NoSQL database, often used in conjunction with Hadoop to deliver a powerful Big Data platform for complex analytics and data processing. The MongoDB-Hadoop connector lets enterprises pull data from MongoDB efficiently and with ease.
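As a rough illustration of the extraction step such a pipeline performs, the function below issues a standard MongoDB-style `find` query against whatever collection object it is given. In a real deployment that object would be a `pymongo` collection (or the connector feeding documents to a Hadoop job); the in-memory stand-in here exists only so the sketch is self-contained:

```python
def recent_purchases(collection, min_amount):
    """Pull purchase documents above a threshold, as a downstream job might consume them."""
    # The filter document uses standard MongoDB query operators ($gte).
    return list(collection.find({"type": "purchase", "amount": {"$gte": min_amount}}))

# Minimal in-memory stand-in for a MongoDB collection (illustration only;
# it supports just enough of find() for the query above).
class FakeCollection:
    def __init__(self, docs):
        self.docs = docs

    def find(self, query):
        min_amount = query["amount"]["$gte"]
        return (d for d in self.docs
                if d["type"] == query["type"] and d["amount"] >= min_amount)

docs = [
    {"type": "purchase", "amount": 120},
    {"type": "search",   "amount": 0},
    {"type": "purchase", "amount": 15},
]
print(recent_purchases(FakeCollection(docs), 50))  # [{'type': 'purchase', 'amount': 120}]
```

Because the query syntax is the same either way, swapping the stand-in for a live `pymongo` collection would not change the calling code.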

There is no doubt that analytics can dramatically transform enterprises and organizations, so long as the focus is on quality rather than quantity and the richness of the data is explored and put to its best use.

More Stories By Francesca Krihely

Francesca Krihely is the Community Marketing Manager for MongoDB, the leading NoSQL database. In this role she supports MongoDB community leaders around the globe. She lives in New York City.


