What to Do About the Data Silo Challenge

Using Data Abstraction to Bring Agility to BI and Analytics

Organizations today understand that better access to information assets can improve their bottom line.

But they struggle with the variety of enterprise, cloud, and big data sources, along with all their associated access mechanisms, syntax, security requirements, and more. Further, few data sources are structured properly for business-user or application consumption, let alone reuse. And often the data is incomplete or duplicated.

Data Abstraction Addresses These Challenges
Data abstraction overcomes the incompatibility between data sources and data consumers by transforming data from its native structure and syntax into reusable views and data services that application developers and business analysts can easily understand and consume.
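
To make this concrete, here is a minimal sketch of consumption through an abstraction layer, assuming a hypothetical data virtualization server; the JDBC URL, credentials, view name, and columns are all invented for illustration. The consuming application queries a single reusable view and never touches the native source structures:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CustomerViewConsumer {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL for a data virtualization server;
        // host, port, and catalog names are hypothetical.
        String url = "jdbc:datavirt://dv-server:9401/sales";
        try (Connection conn = DriverManager.getConnection(url, "analyst", "secret");
             Statement stmt = conn.createStatement();
             // One reusable view stands in for the native structures of the
             // underlying CRM and billing systems.
             ResultSet rs = stmt.executeQuery(
                 "SELECT customer_id, customer_name, total_revenue " +
                 "FROM business.customer")) {
            while (rs.next()) {
                System.out.printf("%s: %s ($%,.2f)%n",
                    rs.getString("customer_id"),
                    rs.getString("customer_name"),
                    rs.getDouble("total_revenue"));
            }
        }
    }
}
```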

Data Abstraction Technology Options
Some data abstraction approaches used today work better than others.

For example, some organizations build data abstraction by hand in Java or use business process management (BPM) tools. Unfortunately, these approaches are often brittle and inefficient. They are also not effective for large data sets, since they lack the robust federation and query optimization functions required to meet data consumers' rigorous performance demands.
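
The sketch below illustrates why hand-coded federation struggles; the source URLs, credentials, and schemas are assumptions for illustration. Each source table is pulled in full and joined in application memory, so there is no predicate pushdown or cost-based optimization, and any schema change in either source breaks the code:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashMap;
import java.util.Map;

public class HandCodedFederation {
    public static void main(String[] args) throws Exception {
        Map<String, String> customerNames = new HashMap<>();

        // Source 1: read the ENTIRE customer table into memory --
        // nothing is pushed down to the database.
        try (Connection crm = DriverManager.getConnection(
                 "jdbc:oracle:thin:@crm-host:1521/CRM", "user", "pw");
             Statement s = crm.createStatement();
             ResultSet rs = s.executeQuery("SELECT cust_id, cust_nm FROM cust_mstr")) {
            while (rs.next()) {
                customerNames.put(rs.getString("cust_id"), rs.getString("cust_nm"));
            }
        }

        // Source 2: scan all orders and join by hand. Connection details and
        // column names are hard-wired, which is where the brittleness lives.
        try (Connection billing = DriverManager.getConnection(
                 "jdbc:postgresql://bill-host/billing", "user", "pw");
             Statement s = billing.createStatement();
             ResultSet rs = s.executeQuery("SELECT cust_id, order_amt FROM orders")) {
            while (rs.next()) {
                String name = customerNames.get(rs.getString("cust_id"));
                System.out.println(name + " ordered " + rs.getDouble("order_amt"));
            }
        }
    }
}
```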

Data warehouse schemas can also provide data abstraction. Data modeling strategies for dimensions, hierarchies, facts, and more are well documented. Also well understood are the high cost and lack of agility of the data warehousing approach. Further, data warehouse-based schemas don't include the many new classes of data (big data, cloud data, external data services, and more) that reside outside the data warehouse.

Data Virtualization Is a Superior Solution for Data Abstraction
Data virtualization is an optimal way to implement data abstraction at enterprise scale. From an enterprise architecture point of view, data virtualization provides a semantic abstraction or data services layer that supports multiple consuming applications. This middle layer of reusable services decouples the underlying source data from the consuming solution layers. This decoupling provides the flexibility required to deal with each layer in the most effective manner, as well as the agility to work quickly across layers as applications, schemas, or underlying data sources change.

Data Abstraction Reference Architecture
Figure 1 outlines the layers that form Composite Software's Data Abstraction Reference Architecture. Architects and analysts can use it as a guide when building a data abstraction layer with Composite's data virtualization platform. The various layers of this reference architecture are described below.

  • Data Consumers - Client applications want to retrieve data in various formats and over various protocols, and to receive it in a form they understand. The data abstraction layer formats data according to each consumer's specifications and delivers it over various transport protocols, including Web Services, REST, JDBC, and Java clients.
  • Application Layer - The "Application Layer" maps the Business Layer into the format that each consuming application expects. This might mean formatting data into XML for Web services or creating views with alias names that match the way consumers are used to seeing their data.
  • Business Layer - The "Business Layer" is predicated on the idea that the business has a standard, or canonical, way of describing key business entities such as customers and products. In the financial industry, for example, one often accesses information according to financial instruments and issuers, among many other entities. Typically, a data modeler works with business experts and data providers to define a set of "logical" or "canonical" views that represent these business entities. These views are reusable components that can, and should, be used across business lines by multiple consumers. A sketch of how the layered views fit together appears after this list.
  • Physical Layer - The "Physical Layer" provides access to underlying data sources and performs a physical to logical mapping.
    • The "Physical Metadata" is essentially imported from the physical data sources and used as way to onboard the metadata required by the data abstraction layer to perform its mapping functions. As an "as-is" layer, entity names and attributes are never changed in this layer.
    • The "Formatting Views" provide a way to map the physical metadata by aliasing the physical names to logical names. Additionally the formatting views can facilitate simple tasks such as value formatting, data type casting, derived columns and light data quality mapping. This layer is derived from the physical sources and performs a one-to-one mapping between the physical source attributes and their corresponding "logical/canonical" attribute name. This layer serves as a buffer between the physical source and the logical business layer views. Naming conventions are very important and introduced in this layer.
  • Data Sources - The data sources are the physical information assets that exist within and outside an organization. These assets may be databases, packaged applications such as SAP, Web services, Excel spreadsheets, and more.
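
The sketch below shows one way the layered mapping could look in practice, using generic SQL view definitions issued over JDBC to a hypothetical data virtualization server; every schema, table, and column name is an invented assumption, not Composite's actual syntax. Each layer builds only on the layer beneath it, which is what delivers the decoupling described above:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LayeredViewSetup {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials for the virtualization server.
        try (Connection dv = DriverManager.getConnection(
                 "jdbc:datavirt://dv-server:9401/admin", "admin", "secret");
             Statement stmt = dv.createStatement()) {

            // Physical Layer / Formatting View: a one-to-one mapping that only
            // aliases physical names to logical names and casts data types.
            stmt.executeUpdate(
                "CREATE VIEW formatting.customer AS " +
                "SELECT cust_id AS customer_id, " +
                "       CAST(cust_nm AS VARCHAR(100)) AS customer_name, " +
                "       UPPER(cntry_cd) AS country_code " +
                "FROM physical.crm.cust_mstr");

            // Business Layer: the canonical 'customer' entity, reusable across
            // business lines; it federates two formatting views.
            stmt.executeUpdate(
                "CREATE VIEW business.customer AS " +
                "SELECT c.customer_id, c.customer_name, c.country_code, " +
                "       SUM(o.order_amount) AS total_revenue " +
                "FROM formatting.customer c " +
                "JOIN formatting.orders o ON o.customer_id = c.customer_id " +
                "GROUP BY c.customer_id, c.customer_name, c.country_code");

            // Application Layer: reshape the canonical view with the alias
            // names one particular consumer expects.
            stmt.executeUpdate(
                "CREATE VIEW app_sales.cust_summary AS " +
                "SELECT customer_id AS CustID, customer_name AS CustName, " +
                "       total_revenue AS Revenue " +
                "FROM business.customer");
        }
    }
}
```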

Summary of Key Benefits
Data abstraction bridges the gap between business needs and source data's original form. This best-practice implementation of data virtualization provides the following benefits:

  • Simplify information access - Bridge business and IT terminology and technology so both can succeed.
  • Common business view of the data - Gain agility, efficiency and reuse across applications via an enterprise information model or "Canonical" model.
  • More accurate data - Consistently apply data quality and validation rules across all data sources.
  • More secure data - Consistently apply data security rules across all data sources and consumers via a unified security framework.
  • End-to-end control - Use a data virtualization platform to consistently manage data access and delivery across multiple sources and consumers.
  • Business and IT change insulation - Insulate consuming applications and users from changes in the sources, and vice versa. Business users and application developers work with a more stable view of the data, while IT can make ongoing changes to, and relocations of, physical data sources without impacting information users, as sketched below.
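
Continuing the hypothetical schemas from the earlier sketch, relocating a physical source touches only the one-to-one formatting view; the canonical business view and every consumer query remain unchanged. The CREATE OR REPLACE syntax here is generic SQL, not any specific vendor's dialect:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class SourceMigration {
    public static void main(String[] args) throws Exception {
        try (Connection dv = DriverManager.getConnection(
                 "jdbc:datavirt://dv-server:9401/admin", "admin", "secret");
             Statement stmt = dv.createStatement()) {
            // The customer master moved from the legacy CRM to a new cloud
            // source. Only this formatting view is redefined; business.customer
            // and all consuming applications are untouched.
            stmt.executeUpdate(
                "CREATE OR REPLACE VIEW formatting.customer AS " +
                "SELECT id AS customer_id, " +
                "       CAST(full_name AS VARCHAR(100)) AS customer_name, " +
                "       UPPER(country) AS country_code " +
                "FROM physical.cloud_crm.customers");
        }
    }
}
```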

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard, and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
