
Better Analytics Equals Competitive Advantage

But Cloud and Big Data Complexity Are Big Challenges

In the best-selling book Competing on Analytics: The New Science of Winning, authors Thomas H. Davenport and Jeanne G. Harris "found a striking relationship between the use of analytics and business performance...High performers (those who outperformed their industry in terms of profit, shareholder return and revenue growth) were 50 percent more likely to use analytics strategically...and five times as likely as low performers."

Data is the Lifeblood of Analytics
Data is the lifeblood of analytics, and the more diverse the data, the better. In their best-selling book, Big Data: A Revolution That Will Transform How We Live, Work, and Think, Mayer-Schonberger and Cukier describe the synergy that occurs when previously unrelated and disparate data is brought together to uncover hidden insights. But this appetite for diverse data is a double-edged sword: the same variety of sources complicates data integration and constrains progress.

Different Data Shapes
Most data used to be tabular, and often relational. That has changed over the last five years with the rise of semi-structured data from web services and other non-relational, big data streams. Analysts must now work with data in multiple shapes, including tabular, XML, key-value pairs, and semi-structured log data.
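As a rough illustration, here is a minimal Python sketch showing the same fact arriving in each of these four shapes; all field names and values are invented for the example, not drawn from any particular system.

    # One customer fact, four shapes. All names and values are illustrative.
    import csv, io, json
    import xml.etree.ElementTree as ET

    tabular = io.StringIO("customer_id,region\n1001,EMEA\n")
    row = next(csv.DictReader(tabular))                            # tabular/relational

    doc = ET.fromstring("<customer id='1001'><region>EMEA</region></customer>")
    region_xml = doc.findtext("region")                            # XML from a web service

    kv = json.loads('{"customer_id": "1001", "region": "EMEA"}')   # key-value pairs

    log_line = "2013-05-01T12:00:00 customer=1001 region=EMEA"     # semi-structured log
    log_fields = dict(part.split("=") for part in log_line.split()[1:])

    assert row["region"] == region_xml == kv["region"] == log_fields["region"]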

Multiple Interfaces and Protocols
Accessing data has gotten more complicated. An analyst once simply used ODBC to access a database, or received a spreadsheet from a colleague via e-mail. Now analysts must reach data through a variety of protocols, including web services via SOAP or REST, Hadoop data via Hive, and other types of NoSQL data via proprietary APIs.
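A hedged sketch of what that juggling can look like in practice follows; the endpoint URL, ODBC DSN, host name, and table names are hypothetical placeholders, and the requests, pyodbc, and PyHive libraries stand in for whatever client stack an analyst actually has.

    # Three access paths, three protocols. All connection details are hypothetical.
    import requests            # REST web service
    import pyodbc              # ODBC to a relational database
    from pyhive import hive    # Hive access to Hadoop data

    # REST: fetch campaign metrics as JSON over HTTP
    metrics = requests.get("https://api.example.com/campaigns/42", timeout=30).json()

    # ODBC: the traditional relational path
    with pyodbc.connect("DSN=warehouse") as conn:
        revenue = conn.cursor().execute("SELECT SUM(amount) FROM orders").fetchone()[0]

    # Hive: the same query pattern, but a different protocol and SQL dialect
    hconn = hive.connect(host="hadoop.example.com")
    cur = hconn.cursor()
    cur.execute("SELECT COUNT(*) FROM web_clicks")
    click_count = cur.fetchone()[0]
    hconn.close()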

Big Data
Data sets have grown larger and larger during the last decade, and it is no longer reasonable to assume that all the data can be assembled in one place, especially if that place is your desktop. The rise of Hadoop is fueled by the tremendous amounts of data that can be easily and cheaply stored on that platform. Analysts must be able to work with data in place, intelligently subsetting it and combining it with data from multiple sources.
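One way to read "work with data in place" is to push the filter down to the source so only the needed subset crosses the wire. A minimal sketch, assuming an ODBC-reachable warehouse with hypothetical table and column names:

    # Subset at the source instead of copying everything to the desktop.
    import pyodbc

    conn = pyodbc.connect("DSN=warehouse")       # hypothetical DSN
    cursor = conn.cursor()

    # The WHERE clause executes where the data lives; only matching rows move.
    cursor.execute(
        "SELECT customer_id, amount FROM orders WHERE order_date >= ?",
        "2013-01-01",
    )
    subset = cursor.fetchall()                   # a manageable slice, not the whole table
    conn.close()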

Iterative, Exploratory Methodology
The analytic process is characterized by exploration and experimentation. Simply finding the data is the difficult first step, followed by gaining access. Then the analyst needs to pull the data together, with data sets iteratively assembled and updated as the exploration proceeds. Much of this occurs before the analytic model is built and its statistical significance analyzed. In other words, data agility is an important part of successful analytics.
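The loop is easy to picture in code. Below is a sketch of one iteration cycle using pandas, with all DataFrames and column names invented for the example:

    # Assemble, inspect coverage, fold in a new source, repeat.
    import pandas as pd

    leads = pd.DataFrame({"lead_id": [1, 2, 3], "campaign": ["a", "b", "a"]})
    revenue = pd.DataFrame({"lead_id": [1, 3], "amount": [500.0, 120.0]})

    # First pass: join what we have and check how much is still missing.
    working = leads.merge(revenue, on="lead_id", how="left")
    print("missing revenue:", working["amount"].isna().mean())

    # Later pass: a new source turns up; extend the working set and re-check.
    email = pd.DataFrame({"lead_id": [1, 2], "opens": [4, 0]})
    working = working.merge(email, on="lead_id", how="left")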

Consolidating Everything No Longer the Solution Every Time
Traditional data consolidation, where data is extracted from original sources and loaded into an analytics data store, remains valid as a core approach. But what happens when you need to integrate data from the wide array of modern sources to perform a wider, more far-reaching analysis?

For example, if you are trying to analyze marketing campaign effectiveness, your overall analysis requires analytics data from multiple cloud and on-premises data repositories, including:

  • Web site click data in Hadoop;
  • Email campaign metrics in on-premises Unica;
  • Nurture marketing metrics in cloud-based Manticore;
  • Lead and opportunity data in cloud-based salesforce.com; and
  • Revenue analysis in on-premises SAP BW.

Does it make sense to create yet another silo that physically consolidates these existing diverse data silos?

Or is it better to federate these silos using data virtualization?

Data Virtualization to the Rescue
Data virtualization offerings such as the Composite Data Virtualization Platform can help address these difficult analytic data challenges.

  • Rapid Data Gathering Accelerates Analytics Impact - Data virtualization's nimble data discovery and access tools make it faster and easier to gather the data sets each new analytic project requires.
  • Data Discovery Addresses Data Proliferation - Data virtualization's data discovery capabilities can automate entity and relationship identification and accelerate data modeling so your analysts can better understand and leverage your distributed data assets.
  • Query Optimization for Timely Business Insight - Data virtualization's optimization algorithms and techniques deliver the timely information your analytics require.
  • Data Federation Provides the Complete Picture - Data virtualization virtually integrates your data in memory to provide the complete picture without the cost and overhead of physical data consolidation (see the sketch after this list).
  • Data Abstraction Simplifies Complex Data - Composite's powerful data abstraction tools simplify your complex data, transforming it from native structures to common semantics for easier consumption.
  • Analytic Sandbox and Data Hub Options Provide Deployment Flexibility - Data virtualization can be configured to support your diverse analytic requirements, from ad hoc analyses via sandboxes to recurring analyses via data hubs.
  • Data Governance Maximizes Control - Data virtualization's built-in governance ensures data security, data quality and 7x24 operations, balancing business agility with needed controls.
  • Layered Data Virtualization Architecture Enables Rapid Change - A loosely coupled data virtualization architecture and rapid development tools provide the agility required to keep pace with your ever-changing analytic needs.
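To make the federation idea concrete, here is a toy Python sketch of one logical view composed over two physical sources at query time; the source functions and schemas are invented stand-ins, not Composite's actual API.

    # One virtual view over two silos, joined in memory, nothing persisted.
    import pandas as pd

    def fetch_crm_opportunities():    # stands in for a cloud CRM such as salesforce.com
        return pd.DataFrame({"account": ["acme", "globex"], "stage": ["won", "open"]})

    def fetch_revenue():              # stands in for an on-premises warehouse such as SAP BW
        return pd.DataFrame({"account": ["acme", "globex"], "revenue": [90.0, 0.0]})

    def opportunity_revenue_view():
        """The 'virtual' view: composed on demand from live sources."""
        return fetch_crm_opportunities().merge(fetch_revenue(), on="account")

    print(opportunity_revenue_view())

Consumers query the view as if it were one table, while each source keeps its data where it lives.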

Conclusion
The business value of analytics has never been greater. But enterprises are awash in data about their customers, prospects, business processes, suppliers, partners and competitors. Further, this data is spread across analyst desktops, big data stores, data warehouses and marts, transaction systems and the cloud.

Data virtualization helps overcome these complexity challenges and fulfills critical analytic data needs significantly faster with far fewer resources than other data integration techniques.

Better analytics equals competitive advantage. So take advantage of data virtualization.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
