By Robert Eve
February 15, 2013 08:00 AM EST
Last July, I wrote Data Virtualization Q&A: What's It All About, an ambitious article that attempted to address the topic of data virtualization from numerous angles including use cases, business benefits, and technology.
Since then, with the continued rapid expansion of big data and analytics, as well as data virtualization technology advances, my 360-degree view of data virtualization has evolved.
Data Rich, Information Poor
As I think about data virtualization today, the big data and analytics challenge that data virtualization best addresses is helping enterprises take advantage of their data.
In other words, enterprises today are data rich, with loads of enterprise, cloud, third-party and Big Data. But they remain information poor.
In this context, let's consider the role of data virtualization with ten, back-to-the-basics questions and answers.
What is Data Virtualization?
Data virtualization is an agile data integration approach organizations use to gain more insight from their data.
Unlike data consolidation or data replication, data virtualization integrates diverse data without costly extra copies and additional data management complexity.
With data virtualization, you respond faster to ever-changing analytics and BI needs, fast-track your data management evolution, and save 50-75% over data replication and consolidation.
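To make the contrast with replication concrete, here is a minimal sketch using Python's built-in SQLite as a stand-in for two separate source systems (the schemas and names are illustrative, not taken from any particular product). The key point: a virtual view is a stored query definition, not a second copy of the data.

```python
import sqlite3

# Two attached in-memory databases stand in for separate live
# source systems (say, a CRM and an ERP).
con = sqlite3.connect(":memory:")
con.execute("ATTACH DATABASE ':memory:' AS crm")
con.execute("ATTACH DATABASE ':memory:' AS erp")

con.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
con.execute("CREATE TABLE erp.orders (customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
con.executemany("INSERT INTO erp.orders VALUES (?, ?)",
                [(1, 250.0), (1, 100.0), (2, 75.0)])

# The "virtual" integration: a view that joins both sources on demand.
# No rows are extracted, transformed, or copied into a new store.
con.execute("""
    CREATE TEMP VIEW customer_revenue AS
    SELECT c.name, SUM(o.total) AS revenue
    FROM crm.customers c JOIN erp.orders o ON o.customer_id = c.id
    GROUP BY c.name
""")

# Consumers query the view; the data stays in the underlying sources.
print(con.execute("SELECT * FROM customer_revenue ORDER BY name").fetchall())
# → [('Acme', 350.0), ('Globex', 75.0)]
```

A consolidation approach would instead ETL both tables into a warehouse before any query could run; here the join is resolved at query time against the sources themselves.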
Why Use Data Virtualization?
With so much data today, the difference between business leaders and also-rans is often how well they leverage their data. Significant leverage equals significant business value, and that's a big advantage over the competition.
Data virtualization provides instant access to all the data you want, the way you want it.
Enterprise, cloud, Big Data, and more, no problem!
What Are the Benefits of Data Virtualization?
With data virtualization, you benefit in several important ways.
- Gain more business insights by leveraging all your data - Empower your people with instant access to all the data they want, the way they want it.
- Respond faster to your ever-changing analytics and BI needs - Five to ten times faster time to solution than traditional data integration.
- Fast-track your data management evolution - Start quickly and scale successfully with an easy-to-adopt overlay to existing infrastructure.
- Save 50-75% over data replication and consolidation - Data virtualization's streamlined approach reduces complexity and saves money.
Who Uses Data Virtualization?
Data virtualization is used by your business and IT organizations.
- Business Leaders - Data virtualization helps you drive business advantage from your data.
- Information Consumers - From spreadsheet user to data scientist, data virtualization provides instant access to all the data you want, the way you want it.
- CIOs and IT Leaders - Data virtualization's agile integration approach lets you respond faster to ever-changing analytics and BI needs and do it for less.
- CIOs and Architects - Data virtualization adds data integration flexibility so you can successfully evolve your data management strategy and architecture.
- Integration Developers - Easy to learn and highly productive to use, data virtualization lets you deliver more business value sooner.
How Does Data Virtualization Work?
Data virtualization's business views provide instant access to the data your business users require, while shielding them from IT's complexity.
- Develop - Your IT staff uses data virtualization's rich data analysis, design and development tools to build the business views (also known as data services).
- Run - When your business users run a report or refresh a dashboard, data virtualization's high-performance query engine accesses the data sources and delivers the exact information requested.
- Manage - Data virtualization's management, monitoring, security and governance functions ensure security, reliability and scalability.
Data virtualization vendor products such as the Composite Data Virtualization Platform provide all these capabilities in a complete and unified offering.
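The develop/run split above can be sketched in a few lines of Python. This is a toy model; the class and function names are hypothetical, not the API of the Composite platform or any other vendor's product.

```python
from typing import Callable, Dict, List

class VirtualizationLayer:
    """Maps published business views to queries against live sources."""

    def __init__(self) -> None:
        self._views: Dict[str, Callable[[], List[dict]]] = {}

    # "Develop": IT publishes a view definition (a resolver function,
    # not a copied data set).
    def publish(self, name: str, resolver: Callable[[], List[dict]]) -> None:
        self._views[name] = resolver

    # "Run": when a report or dashboard requests the view, the data is
    # fetched from the sources on demand.
    def query(self, name: str) -> List[dict]:
        return self._views[name]()

# Stand-ins for two live source systems.
def fetch_crm() -> List[dict]:
    return [{"id": 1, "name": "Acme"}]

def fetch_erp() -> List[dict]:
    return [{"customer_id": 1, "total": 350.0}]

layer = VirtualizationLayer()
layer.publish("customer_revenue", lambda: [
    {"name": c["name"], "revenue": o["total"]}
    for c in fetch_crm()
    for o in fetch_erp()
    if o["customer_id"] == c["id"]
])

print(layer.query("customer_revenue"))  # fetched live, not replicated
```

A production query engine adds what this sketch omits: query optimization, source pushdown, caching, security and governance. The point is the shape of the abstraction, not the machinery.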
When to Use Data Virtualization?
You can use data virtualization to enable a wide range of information solutions including:
- Agile Analytics and BI Solutions
- Data Warehouse Extension Solutions
- Logical Data Warehouse Solutions
- Data Virtualization Architecture Solutions
- Data Integration and Management Solutions
- Business Solutions
- Industry Solutions
When Not to Use Data Virtualization?
Data virtualization is not the answer to every data integration problem. Sometimes data consolidation in a warehouse or mart, along with ETL or ELT, is a better solution for a particular use case. And sometimes a hybrid mix is the right answer.
You can use a Data Integration Strategy Decision Tool to help you decide when to use data virtualization, data consolidation or perhaps a hybrid combination.
What is the Business Case for Data Virtualization?
Data virtualization has a compelling business case. The following drivers make data virtualization a "must have" for any large organization today.
- Profit Growth - Data virtualization delivers the information your organization requires to increase revenue and reduce costs.
- Risk Reduction - Data virtualization's up-to-the-minute business insights help you manage business risk and reduce compliance penalties. Plus data virtualization's rapid development and quick iterations lower your IT project risk.
- Technology Optimization - Data virtualization improves utilization of existing server and storage investments. And with less storage required, hardware and governance savings are substantial.
- Staff Productivity - Data virtualization's easy-to-use, high-productivity design and development environments improve your staff effectiveness and efficiency.
- Time-to-Solution Acceleration - Your data virtualization projects are completed faster so business benefits are derived sooner. Lower project costs are an additional agility benefit.
How to Deploy Data Virtualization?
You can start your data virtualization adoption with specific projects that address immediate information needs.
Which Vendor Should I Select?
If you are like most, you would prefer to go with the data virtualization market leader. But how do you define the market leader?
Is it the one with the most mature product? For example, one data virtualization vendor has spent a decade delivering nearly 400 man-years of R&D, six million lines of code and millions of hours of operational deployment.
Is it the one with the most installations? For example, the same vendor is used by nearly two hundred of the world's largest organizations.
Is it the one with the most domain knowledge? This same vendor's data virtualization thought leadership assets demonstrate the expertise they can bring to bear for you. These include:
- The first book on data virtualization, Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.
- Data virtualization's foremost microsite, the DV Café
- The Data Virtualization Leadership Series of analyst reports on data virtualization
- Data virtualization's only dedicated blog, the Data Virtualization Leadership Blog
- The Data Virtualization Channel on YouTube with users, analysts, chalk talks and more
- The Data Virtualization Leadership Awards honoring users
- Data Virtualization Day Resources, assets from the premier events in data virtualization
- Data virtualization's longest running newsletter, Enterprise Information Insight
With so many new opportunities from Big Data, analytics and more, today's challenge is how to take full advantage of them. This article suggests that data virtualization can be that path, and provides answers to key questions about data virtualization. The time is now.