Better Analytics Equals Competitive Advantage

But Cloud and Big Data Complexity Are Big Challenges

In the best-selling book Competing on Analytics: The New Science of Winning, authors Thomas H. Davenport and Jeanne G. Harris "found a striking relationship between the use of analytics and business performance...High performers (those who outperformed their industry in terms of profit, shareholder return and revenue growth) were 50 percent more likely to use analytics strategically...and five times as likely as low performers."

Data is the Lifeblood of Analytics
Data is the lifeblood of analytics, and the more diverse the data, the better. In their best-selling book, Big Data: A Revolution That Will Transform How We Live, Work, and Think, Viktor Mayer-Schönberger and Kenneth Cukier describe the synergy that occurs when previously unrelated, disparate data sets are brought together to uncover hidden insights. But these advanced analytics data requirements are a double-edged sword: the more diverse the sources, the more they complicate data integration and constrain progress.

Different Data Shapes
It used to be the case that most data was tabular, and even relational. That has changed during the last five years with the rise of semi-structured data from web services and other non-relational, big data sources. Analysts must now work with data in multiple shapes, including tabular, XML, key-value pairs, and semi-structured log data.
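
To make the contrast concrete, here is a minimal Python sketch, not any vendor's tooling, that normalizes each of these shapes into the same rows-of-dicts form. The file names, field layouts, and log format are hypothetical stand-ins.

```python
# Minimal sketch: normalizing four common data shapes into one tabular form
# (lists of dicts). File names and record layouts here are hypothetical.
import csv, json, re
import xml.etree.ElementTree as ET

def from_csv(path):                      # tabular / relational extract
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def from_xml(path):                      # semi-structured XML from a web service
    root = ET.parse(path).getroot()
    return [{child.tag: child.text for child in record} for record in root]

def from_kv(path):                       # key-value pairs, one JSON object per line
    with open(path) as f:
        return [json.loads(line) for line in f]

LOG = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)"')

def from_log(path):                      # semi-structured web server log lines
    with open(path) as f:
        return [m.groupdict() for m in map(LOG.match, f) if m]
```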

Multiple Interfaces and Protocols
Accessing data has gotten more complicated. An analyst could once simply use ODBC to access a database, or receive a spreadsheet from a colleague via e-mail. But now analysts must access data through a variety of protocols, including web services via SOAP or REST, Hadoop data through Hive, and other types of NoSQL data through proprietary APIs.
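
The sketch below reaches data over three of these paths. It is a rough illustration only: the DSN, credentials, endpoint URL, host, and table names are all hypothetical, and it assumes the third-party pyodbc, requests, and PyHive packages are installed.

```python
# Rough sketch of three access paths an analyst must now juggle.
import pyodbc                  # classic ODBC access to a relational database
import requests                # REST web service returning JSON
from pyhive import hive        # SQL-like access to Hadoop data via Hive

# 1. ODBC: the traditional route to a relational source.
odbc = pyodbc.connect("DSN=warehouse;UID=analyst;PWD=secret")
orders = odbc.cursor().execute("SELECT id, amount FROM orders").fetchall()

# 2. REST: semi-structured JSON from a cloud web service.
leads = requests.get("https://api.example.com/v1/leads", timeout=30).json()

# 3. Hive: a query over files stored in Hadoop.
cur = hive.Connection(host="hadoop.example.com", port=10000).cursor()
cur.execute("SELECT page, count(*) FROM weblogs GROUP BY page")
clicks = cur.fetchall()
```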

Big Data
Data sets have grown larger and larger during the last decade, and it is no longer reasonable to assume that all the data can be assembled in one place, especially if that place is your desktop. The rise of Hadoop is fueled by the tremendous amounts of data that can be easily and cheaply stored on that platform. Analysts must be able to work with data where it lives, intelligently subsetting it and combining it with data from multiple sources.
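
The sketch below illustrates the idea, hedged the same way as before: the DSN, host, and table names are hypothetical. The point is that each source evaluates its own WHERE clause or aggregation, so only small, relevant subsets ever travel to the analyst.

```python
# Sketch of source-side subsetting: the filter and the GROUP BY each run at
# the source, so only small result sets cross the network. Connection
# details and table names are hypothetical; needs pandas, pyodbc, PyHive.
import pandas as pd
import pyodbc
from pyhive import hive

# Only last quarter's rows leave the warehouse.
wh = pyodbc.connect("DSN=warehouse")
revenue = pd.read_sql(
    "SELECT account, revenue FROM sales WHERE quarter = '2013-Q4'", wh)

# Hive aggregates the raw clickstream in place before returning anything.
hv = hive.Connection(host="hadoop.example.com")
clicks = pd.read_sql(
    "SELECT account, count(*) AS visits FROM weblogs GROUP BY account", hv)

# The pre-filtered subsets are small enough to join locally.
picture = revenue.merge(clicks, on="account", how="left")
```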

Iterative, Exploratory Methodology
The analytic process is characterized by exploration and experimentation.  Simply finding data is the difficult first step, followed by gaining access.  Then the analyst needs to pull the data together.  This requires data sets to be iteratively assembled and updated as the exploration proceeds. Much of this occurs before building the analytic model and statistically analyzing the model's significance.  In other words, data agility is an important part of successful analytics.
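
A toy version of that loop is sketched below. All the frames are synthetic; in a real exploration each merge would pull a fresh subset from another silo, and the inspection after each pass would drive what to go find next.

```python
# Toy version of the iterative assembly loop, using synthetic pandas frames
# in place of real silos.
import pandas as pd

working = pd.DataFrame({"account": ["a", "b", "c"], "revenue": [120, 80, 200]})

# Each "source" discovered during exploration adds one more attribute.
new_sources = [
    pd.DataFrame({"account": ["a", "b", "c"], "visits": [10, 2, 25]}),
    pd.DataFrame({"account": ["a", "c"], "emails_opened": [4, 9]}),
]

for extra in new_sources:
    working = working.merge(extra, on="account", how="left")
    # Inspect after every pass; the result shapes the next data pull.
    print(working.corr(numeric_only=True)["revenue"])
```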

Consolidating Everything No Longer the Solution Every Time
Traditional data consolidation, where data is extracted from original sources and loaded into a dedicated analytics data store, remains valid as a core approach. But what happens when you need to integrate data from the wide array of modern sources to perform a broader, more far-reaching analysis?

For example, if you are trying to analyze marketing campaign effectiveness, your overall analysis requires data from multiple cloud and on-premises repositories, including:

  • Web site clicks in big data Hadoop;
  • Email campaign metrics in on-premises Unica;
  • Nurture marketing metrics in cloud-based Manticore;
  • Lead and opportunity data in cloud-based salesforce.com; and
  • Revenue analysis in on-premises SAP BW.

Does it make sense to create yet another silo that physically consolidates these existing diverse data silos?

Or is it better to federate these silos using data virtualization?

Data Virtualization to the Rescue
Data virtualization offerings such as the Composite Data Virtualization Platform can help address these difficult analytic data challenges:

  • Rapid Data Gathering Accelerates Analytics Impact - Data virtualization's nimble data discovery and access tools make it faster and easier to gather the data sets each new analytic project requires.
  • Data Discovery Addresses Data Proliferation - Data virtualization's data discovery capabilities can automate entity and relationship identification and accelerate data modeling so your analysts can better understand and leverage your distributed data assets.
  • Query Optimization for Timely Business Insight - Data virtualization's optimization algorithms and techniques deliver the timely information your analytics require.
  • Data Federation Provides the Complete Picture - Data virtualization virtually integrates your data in memory to provide the complete picture without the cost and overhead of physical data consolidation (see the sketch after this list).
  • Data Abstraction Simplifies Complex Data - Composite's powerful data abstraction tools simplify your complex data, transforming it from native structures to common semantics for easier consumption.
  • Analytic Sandbox and Data Hub Options Provide Deployment Flexibility - Data virtualization can be configured to support your diverse analytic requirements, from ad hoc analyses via sandboxes to recurring analyses via data hubs.
  • Data Governance Maximizes Control - Data virtualization's built-in governance ensures data security, data quality and 7x24 operations to balance business agility with needed controls.
  • Layered Data Virtualization Architecture Enables Rapid Change - Loosely coupled data virtualization architecture and rapid development tools provide the agility required to keep pace with your ever-changing analytic needs.
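
To make the federation idea concrete, here is a minimal, purely illustrative Python sketch of a virtual view; it is not Composite's product. The source names echo the campaign-effectiveness example above, but the callables simply return sample frames; in a real deployment each would be a live connection, and nothing would be copied into a new silo.

```python
# Purely illustrative federation: a virtual view queries its sources on
# demand and joins the results in memory, so no new physical silo is built.
# The source names and sample frames are hypothetical; needs pandas.
import pandas as pd

def virtual_view(sources):
    """Each source is a zero-argument callable returning a DataFrame."""
    frames = [fetch() for fetch in sources.values()]
    view = frames[0]
    for frame in frames[1:]:
        view = view.merge(frame, on="campaign", how="outer")
    return view

# Stand-ins for live connections to Hadoop, Unica, and salesforce.com;
# here they just return sample rows.
sources = {
    "hadoop_clicks": lambda: pd.DataFrame({"campaign": ["spring"], "clicks": [5400]}),
    "unica_emails":  lambda: pd.DataFrame({"campaign": ["spring"], "opens": [870]}),
    "sfdc_leads":    lambda: pd.DataFrame({"campaign": ["spring"], "leads": [42]}),
}

print(virtual_view(sources))
```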

Conclusion
The business value of analytics has never been greater. But enterprises are flooded with data about their customers, prospects, business processes, suppliers, partners and competitors. Further, this data is spread across analyst desktops, big data stores, data warehouses and marts, transaction systems and the cloud.

Data virtualization helps overcome these complexity challenges and fulfills critical analytic data needs significantly faster with far fewer resources than other data integration techniques.

Better analytics equals competitive advantage.  So take advantage of data virtualization.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
