Challenges in Data Access for New Age Data Sources

SQL vs. NoSQL vs. NewSQL

The Big Data and Cloud "movements" have acted as catalysts for tremendous growth in fit-for-purpose databases. Along with this growth, we see a new set of challenges in how we access the data through our business-critical applications. Let's take a brief look at the evolution of these data access methods (and why we are in the mess we are in today).

The Evolution of Data Sources
Back in the '80s, the development of relational databases brought with it a standardized SQL language that could be easily implemented within mainframe applications to query and manipulate data. These relational database systems supported transactions in a very reliable fashion through what is called "ACID" compliance (Atomicity, Consistency, Isolation, and Durability). They provided a very structured method of dealing with data and were very reliable. But ACID compliance also brought a lot of overhead processing, and hence a downfall: they were not optimized to handle very large transaction requests, nor could they handle huge volumes of transactions. To counteract this, we made some significant performance and throughput enhancements within data connectivity drivers that lit a fire under SQL speeds and connectivity efficiencies.
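
To make the ACID guarantee concrete, here is a minimal JDBC sketch of the classic bank-transfer case; the connection URL, credentials, and accounts table are invented for illustration. Atomicity means either both updates commit together or neither survives:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class AcidTransferDemo {
        public static void main(String[] args) throws SQLException {
            // Illustrative URL and schema; substitute your own database and credentials.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/bank", "app", "secret")) {
                conn.setAutoCommit(false); // group both updates into one atomic unit
                try (PreparedStatement debit = conn.prepareStatement(
                         "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                     PreparedStatement credit = conn.prepareStatement(
                         "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
                    debit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                    debit.setInt(2, 1);
                    debit.executeUpdate();
                    credit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                    credit.setInt(2, 2);
                    credit.executeUpdate();
                    conn.commit();   // durability: both updates persist together
                } catch (SQLException e) {
                    conn.rollback(); // atomicity: neither update survives a failure
                    throw e;
                }
            }
        }
    }

It is exactly this bookkeeping - the locks, logs, and recovery work behind commit and rollback - that makes ACID databases so reliable yet so costly at very high transaction volumes.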

Now move to the '90s and early 2000s, when vendors began experimenting with more efficient ways of storing data - the advent of "NoSQL" (aka "Not Only SQL"). By then we had multiple applications trying to access a database, with new requirements for performance, scalability, and volume. These databases employ one of several different storage models (the first two are contrasted in the sketch after this list):

  • Key-Value stores, designed to handle massive amounts of data through simple, fast lookups by key
  • BigTable-style (wide-column) stores, column-oriented databases based on Google's Bigtable design
  • Document stores, similar to key-value except that the value is a structured document the database can query
  • Graph databases, which scale with the complexity of the data by modeling the structure of its relationships
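
To make the first two models concrete, here is a small Java sketch contrasting a key-value record (an opaque value behind a key) with the same record as a document (structure the store itself can index and query); the keys and fields are invented for illustration:

    import java.util.Map;

    public class StorageModelSketch {
        public static void main(String[] args) {
            // Key-value model: the store understands only the key; the value is opaque.
            Map<String, String> keyValueStore = Map.of(
                    "order:1001", "{\"customer\":\"Acme\",\"total\":99.50}");

            // Document model: the same record, but the store can index and
            // query the individual fields inside the document.
            String document = "{ \"_id\": \"order:1001\", "
                    + "\"customer\": \"Acme\", "
                    + "\"items\": [ { \"sku\": \"X1\", \"qty\": 2 } ], "
                    + "\"total\": 99.50 }";

            System.out.println(keyValueStore.get("order:1001")); // lookup by key only
            System.out.println(document);
        }
    }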

These databases sacrificed transaction-oriented access for speed and scalability. However, there was no standardized, optimized access method like SQL; in general, the best way to query them was through REST APIs and Web services. Each NoSQL database usually had its own proprietary access method, which meant frequent API changes in your applications when dealing with multiple databases. And with some packaged applications, those frequent modifications may not even be possible.
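
As an example of the REST style of access, the sketch below fetches a single document over HTTP with Java's built-in HttpClient. The endpoint follows CouchDB's GET /<database>/<doc-id> convention; the host, database, and document id are illustrative:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class RestDocumentFetch {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // CouchDB-style convention: GET http://host:port/<database>/<doc-id>
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:5984/orders/order-1001"))
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode()); // 200 if the document exists
            System.out.println(response.body());       // the JSON document itself
        }
    }

Note that every store shapes its URLs, query parameters, and result envelopes differently, which is exactly the per-database API churn described above.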

That brings us to the needs of today: multiple applications requiring access to multiple fit-for-purpose databases, each using alternate data storage models and needing different access methods. Enter NewSQL, which is supposed to fill the gap left by NoSQL with better support for ACID transactions while retaining NoSQL's performance and scalability characteristics. NewSQL fulfills today's need for highly scalable, high-performance, transaction-intensive data store options. Adoption of these NewSQL alternatives is slow, though; I would expect to see a rise once more tools support them. The challenge is having to rewrite how we access this data: the access method is a hybrid SQL, so it will take some effort before more vendor tools and middleware drivers support it. Plus, the impact on application development will have to be considered, given the new ways required to access the data.

All of these - SQL, NoSQL, NewSQL (and more, like in-memory databases) - have a distinguished place in today's world of data. Each is customized to fulfill the needs of megatrends like Big Data and cloud computing. They've opened up new markets for better access methods that have low impact on existing business-critical applications. And being able to connect to the world's data in a fast and consistent fashion will continue to be the key to the data castle.

Database Wars - Only This Time in the Cloud
If you've been in the technology business long enough, you remember the "database wars" of the 1990s. During that era, there were more than 15 different types of databases, all vying to house your company's data. There were so many database options that knowing where to house your data, and how to access it, became quite an issue. However, as Y2K rolled around, the field of database vendors dwindled back down to a much more manageable number.

So much content is generated these days (500 TB a day on Facebook alone) that accelerated processing power and faster disk storage access are required. Today, with offerings from major players like Oracle and Salesforce, along with open source technologies like Apache Hive on Hadoop, we are getting back up there in terms of database offerings. The industry is once again inundated with databases, which is causing data to be siloed.

We can thank two megatrends for the explosion of databases flooding the market today. The first is Big Data. Every day, we create 2.5 quintillion bytes of data - so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This data is BIG Data, and the volume of it that applications must manage is increasing dramatically. But it's not only a volume problem; the velocity and variety of data are increasing as well. For data at rest, like the petabytes managed by the world's largest Hadoop clusters, we need to be able to access it quickly, reliably, and securely. For data in motion, like your location, we need to analyze it and respond immediately, before the window on the fleeting opportunity or preventable threat closes. Big Data and the introduction of Apache Hadoop as a high-volume distributed file system have drawn a line in the sand for the first battle in the new database wars.

The second is cloud computing. Cloud is reshaping the way we as an industry build and deploy software. The economics and usability of cloud are clear: cloud is enabling the next generation of ISVs and applications to be built in less time, at lower cost, all while increasing the scalability and resiliency of the applications we produce. In fact, ISVs are ahead of the curve - according to Gartner, over 50% of ISVs will be building pure cloud applications within the next three years, and 20% of IT spending over the next three years will go to cloud- and SaaS-based services. The use of hybrid applications will exceed both pure on-premise and pure cloud applications in the near term as the market transitions from on-premise to cloud. Big Data and cloud are changing the rules for how we access and use data - and for how we can all uncover the "dark data" as we mine the new wave of databases.

Alternative Data Management Technologies Fuel the Fire
The database wars today are being fueled by key factors that drive the adoption of up-and-coming data management technologies. According to 451 Research, these factors include scalability, performance, relaxed consistency, agility, intricacy, and necessity. NoSQL projects were developed in response to the failure of existing suppliers to meet the performance, scalability and flexibility needs of large-scale data processing, particularly for Web and cloud applications. While the NoSQL offerings are closely associated with Web application providers, the same drivers have spurred the adoption of data-grid/caching products and the emergence of a new breed of relational database products and vendors. For the most part, these database alternatives are not designed to directly replace existing products, but to offer purpose-built alternatives for workloads that are unsuited to general-purpose relational databases. NewSQL and data-grid products have emerged to meet similar requirements among enterprises, a sector that is now also being targeted by NoSQL vendors. The list of new database players with alternative management methods is growing seemingly exponentially. In today's wars, the backdrop is no longer the on-premise databases of yesteryear; today's wars are happening in the cloud. The new rules of accessing cloud data cause new challenges in business-critical applications.

What does this mean for an enterprise that needs to access its data from a number of diverse cloud sources? What lightsaber exists in today's world to aid IT managers and application developers in these fierce wars? How can we keep up with this explosion of data sources in the cloud? One of the biggest weapons today's IT workers have at their disposal is a premium data connectivity service. Point-to-point connectivity might be available, but creating unique access calls that conform to every database API quickly becomes unwieldy and complex. There are too many different APIs, and too many versions of those APIs, making your application far too complicated to maintain. For on-premise applications, the changes across all of these cloud data sources are simply too frequent to manage.

Controlling the Chaos through a Connectivity Service
There is a much better way to connect to and access the multitude of cloud data sources: a single pipe into a connectivity management service that sits in the cloud. The call from your application is a standard SQL query, along with a quick selection of which cloud data source you need to connect to. The connectivity service executes the SQL query against the appropriate cloud data source, managing all of the complexity, APIs, and version control itself so that your application doesn't have to. This Connectivity as a Service provides standards-based SQL access and connectivity management to the cloud. The service lets you pay only for what you consume, or for the number of cloud data sources you need to reach. It enables you to focus on your application while the connectivity management service keeps up with versions and API changes.
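
As a sketch of the pattern (not any particular vendor's product), the application below speaks plain SQL to one endpoint while a connection property names the target cloud source. The jdbc:connectivity URL scheme, the datasource property, and the opportunities table are all hypothetical:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ConnectivityServiceSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical single-pipe endpoint: "datasource" tells the service
            // which cloud source to route the SQL to; vendor APIs, versions,
            // and pagination are all handled behind the service.
            String url = "jdbc:connectivity://service.example.com:443/bridge"
                       + ";datasource=salesforce;user=app;password=secret";

            try (Connection conn = DriverManager.getConnection(url);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT name, amount FROM opportunities WHERE stage = 'Closed Won'")) {
                while (rs.next()) {
                    System.out.println(rs.getString("name") + ": " + rs.getDouble("amount"));
                }
            }
            // Pointing the same query at a different cloud source is just a change
            // to the datasource property, not a change to application code.
        }
    }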

Data sources can be added continuously with no changes required in your application. This is the beauty of Connectivity as a Service, enabling you to access the cloud world through a single source. A service offering that leverages database drivers for cloud-based data sources is ideal. Database drivers come to the rescue by unlocking data and helping it move freely between various databases, which facilitates fast decision-making. SQL drivers such as ODBC and JDBC add tremendous value when it comes to database connectivity, especially when accessing high-volume, critical systems. These drivers work with essentially every mainstream database, offering strong performance, resource efficiency, and codeless configuration.
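
The "no application changes" promise rests on the driver abstraction: JDBC code is written against the java.sql interfaces, so the driver URL and credentials can live in configuration rather than in code. A minimal sketch, assuming a db.properties file on the classpath (the file name and keys are invented for illustration):

    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class ConfigDrivenQuery {
        public static void main(String[] args) throws Exception {
            // db.properties might contain, for example:
            //   jdbc.url=jdbc:postgresql://localhost:5432/sales
            //   jdbc.user=app
            //   jdbc.password=secret
            // Pointing at a different source means editing the file, not this code.
            Properties cfg = new Properties();
            try (InputStream in = ConfigDrivenQuery.class
                    .getResourceAsStream("/db.properties")) {
                cfg.load(in);
            }

            try (Connection conn = DriverManager.getConnection(
                         cfg.getProperty("jdbc.url"),
                         cfg.getProperty("jdbc.user"),
                         cfg.getProperty("jdbc.password"));
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM orders")) {
                if (rs.next()) {
                    System.out.println("orders: " + rs.getLong(1));
                }
            }
        }
    }

Since JDBC 4.0, drivers on the classpath register themselves automatically, so even the driver class name stays out of the application.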

Even as the database wars heat up, premium data connectivity solutions will help you cool down - accessing and analyzing your data no matter where it may live.

More Stories By Jeff Reser

Jeff Reser is the Senior Manager, Technical Marketing at Progress Software. Before that, he was responsible for Business Process Management Solutions Marketing. Prior to Progress, he spent more than 25 years at IBM where he held a number of technical and management positions and was instrumental in developing and product managing IBM’s WebSphere Application Server – from its inception to an expanding and very successful portfolio of products. With over 30 years of experience in software technologies, product management, and product marketing, Jeff’s areas of expertise include Big Data & Cloud connectivity, Business Process Management, Business Rules Management, Web application serving, and mobile computing.
