Challenges in Data Access for New Age Data Sources

SQL vs. NoSQL vs. NewSQL

The Big Data and Cloud "movements" have acted as catalysts for tremendous growth in fit-for-purpose databases. Along with this growth, we see a new set of challenges in how we access the data through our business-critical applications. Let's take a brief look at the evolution of these data access methods (and why we are in the mess we are in today).

The Evolution of Data Sources
Back in the '80s, the development of relational databases brought with it a standardized SQL protocol that could be easily implemented within mainframe applications to query and manipulate the data. These relational database systems supported transactions in a very reliable fashion through what was called "ACID" compliance (Atomicity, Consistency, Isolation, and Durability). These databases provided a very structured method of dealing with data and were very reliable. But ACID compliance also brought lots of overhead processing - hence a downfall: they were not optimized to handle large transaction requests, nor could they handle huge volumes of transactions. To counteract this, we made some significant performance and throughput enhancements within data connectivity drivers that lit a fire under SQL speeds and connectivity efficiencies.
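
To make the ACID point concrete, here is a minimal JDBC sketch of the kind of all-or-nothing transaction these systems standardized (JDBC itself came later than the mainframe era, but the semantics are the same; the connection URL, credentials, and accounts table are hypothetical):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TransferExample {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection URL; any ACID-compliant relational
        // database behaves the same way through JDBC.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/bank", "app", "secret")) {
            conn.setAutoCommit(false); // group both updates into one atomic transaction
            try (PreparedStatement debit = conn.prepareStatement(
                     "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                 PreparedStatement credit = conn.prepareStatement(
                     "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
                debit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                debit.setLong(2, 1);
                debit.executeUpdate();
                credit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                credit.setLong(2, 2);
                credit.executeUpdate();
                conn.commit();   // both updates become durable together...
            } catch (SQLException e) {
                conn.rollback(); // ...or neither does
                throw e;
            }
        }
    }
}
```

The reliability comes precisely from that commit/rollback boundary - and so does the overhead the next generation of databases set out to avoid.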

Now move to the '90s and early 2000s, when vendors began experimenting with more efficient ways of storing data, leading to the advent of "NoSQL" (aka Not Only SQL). We now had multiple applications trying to access a database with new requirements for performance, scalability, and volume. These databases employ one of several different storage models:

  • Key-Value, designed to handle massive amounts of data
  • BigTable, column-oriented stores based on Google's Bigtable design
  • Document, similar to key-value where the value is the document
  • Graph, scaling to the complexity of the data and modeling the structure of the data

These databases sacrificed transaction-oriented access for speed and scalability. However, there was no standardized, optimized access method like SQL; in general, the best way to query was through REST APIs and Web services. Each NoSQL database usually had its own proprietary access method, which meant frequent API changes to your applications when dealing with multiple databases. And with some packaged applications, those frequent modifications may not even be possible.
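
For example, fetching documents typically meant hand-building an HTTP request against whatever REST endpoint the product exposed and parsing the JSON yourself. A sketch of that pattern, with a purely hypothetical endpoint and query syntax:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DocumentQueryExample {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Hypothetical document-store endpoint; each NoSQL product defines
        // its own URL shape, query syntax, and JSON response format.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/db/customers/_query?city=Boston"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // The raw JSON comes back as a string; the application is responsible
        // for parsing it - there is no result-set abstraction like JDBC's.
        System.out.println(response.body());
    }
}
```

Swap in a different document store and both the URL shape and the response format change - which is exactly the portability problem described above.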

That brings us to the needs of today: multiple applications requiring access to multiple fit-for-purpose databases that use alternate data storage models and need different access methods. Here comes NewSQL, which is supposed to fill the gap left by NoSQL with better support for ACID transactions while retaining NoSQL's performance and scalability characteristics. NewSQL fulfills the needs of today's data markets with highly scalable, high-performance, transaction-intensive data store options. Adoption of these NewSQL alternatives has been slow, but I would expect to see a rise once more tools support it. The challenge here is having to rewrite how we access this data. The access method is a hybrid SQL, so it will take some effort before more vendor tools and middleware drivers support it. Plus, the impact on application development will have to be considered, given the new ways required to access the data.
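
A sketch of that promise: once a JDBC driver exists for a NewSQL engine, the application code is ordinary SQL access, and only the connection URL changes (the driver URL scheme below is invented for illustration, not any vendor's actual product):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class NewSqlExample {
    public static void main(String[] args) throws Exception {
        // Invented URL for a distributed NewSQL cluster; the point is that
        // once a JDBC driver exists, the rest is ordinary SQL access.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:newsqlvendor://node1:26257,node2:26257/shop", "app", "secret");
             Statement stmt = conn.createStatement()) {
            conn.setAutoCommit(false);
            stmt.executeUpdate("INSERT INTO orders (customer_id, total) VALUES (42, 99.50)");
            conn.commit(); // ACID semantics, but executed across a scale-out cluster
            try (ResultSet rs = stmt.executeQuery("SELECT count(*) FROM orders")) {
                while (rs.next()) System.out.println("orders: " + rs.getLong(1));
            }
        }
    }
}
```

Until drivers and tools catch up, though, each engine's SQL dialect and wire protocol is one more thing for the application team to absorb.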

All of these - SQL, NoSQL, NewSQL (and more, like in-memory) - have a distinguished place in today's world of data. Each is customized to fulfill the needs of megatrends like Big Data and cloud computing. They've opened up new markets for better access methods that have low impact on existing business-critical applications. And being able to connect to the world's data in a fast and consistent fashion will continue to be the key to the data castle.

Database Wars - Only This Time in the Cloud
If you've been in the technology business long enough, you remember the "database wars" of the 1990s. During that era, there were more than 15 different types of databases, all vying to house your company's data. There were so many database options that knowing where to house your data, and how to access it, became quite an issue. However, as Y2K rolled around, the database vendors dwindled back down to a much more manageable number.

So much content is generated these days (reportedly 500 TB a day on Facebook alone) that accelerated processing power and disk storage access are required. Today, with offerings from major players like Oracle and Salesforce, along with open source options like Apache Hive on Hadoop, we are getting back up there in terms of database offerings. The industry is once again inundated with databases, which is causing data to be siloed.

We can thank two megatrends for the explosion of databases flooding the market today. The first is Big Data. Every day, we create 2.5 quintillion bytes of data - so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This data is BIG data, and the volume of it that needs to be managed by applications is increasing dramatically. But it's not only a volume problem; the velocity and variety of data are increasing as well. For data at rest, like the petabytes managed by the world's largest Hadoop clusters, we need to be able to access it quickly, reliably, and securely. For data in motion, like your location, we need to analyze it and respond immediately, before the window on a fleeting opportunity or preventable threat closes. Big Data and the introduction of Apache Hadoop, with its high-volume distributed file system (HDFS), have drawn a line in the sand for the first battle in the new database wars.
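
For the data-at-rest case, SQL-on-Hadoop layers like Hive illustrate the access pattern. A minimal sketch using HiveServer2's standard JDBC URL scheme, with the host and table names assumed for illustration (the Hive JDBC driver jar must be on the classpath):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQueryExample {
    public static void main(String[] args) throws Exception {
        // Standard HiveServer2 JDBC URL scheme; host, port, and table are assumptions.
        String url = "jdbc:hive2://hadoop-master:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "analyst", "");
             Statement stmt = conn.createStatement();
             // An ordinary-looking SQL query that Hive turns into distributed
             // jobs over files stored in HDFS.
             ResultSet rs = stmt.executeQuery(
                     "SELECT sensor_id, avg(temperature) FROM climate_readings GROUP BY sensor_id")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + " -> " + rs.getDouble(2));
            }
        }
    }
}
```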

The second is cloud computing. Cloud is reshaping the way we as an industry build and deploy software. The economics and usability of cloud are clear: cloud is enabling the next generation of ISVs and applications to be built in less time and at lower cost, all while increasing the scalability and resiliency of the applications we produce. In fact, ISVs are ahead of the curve - according to Gartner, over 50% of ISVs will be building pure cloud applications within the next three years, and 20% of IT spending over the next three years will go to cloud- and SaaS-based services. The use of hybrid applications will exceed both on-premise and pure cloud in the near term as the market transitions from on-premise to pure cloud. Big Data and cloud are changing the rules for how we access and use data. They are changing the rules for how we can all uncover the "dark data" as we mine the new wave of databases.

Alternative Data Management Technologies Fuel the Fire
The database wars today are being fueled by key factors that drive the adoption of up-and-coming data management technologies. According to 451 Research, these factors include scalability, performance, relaxed consistency, agility, intricacy, and necessity. NoSQL projects were developed in response to the failure of existing suppliers to meet the performance, scalability and flexibility needs of large-scale data processing, particularly for Web and cloud applications. While the NoSQL offerings are closely associated with Web application providers, the same drivers have spurred the adoption of data-grid/caching products and the emergence of a new breed of relational database products and vendors. For the most part, these database alternatives are not designed to directly replace existing products, but to offer purpose-built alternatives for workloads that are unsuited to general-purpose relational databases. NewSQL and data-grid products have emerged to meet similar requirements among enterprises, a sector that is now also being targeted by NoSQL vendors. The list of new database players with alternative management methods is growing seemingly exponentially. In today's wars, the backdrop is no longer the on-premise databases of yesteryear; today's wars are happening in the cloud. The new rules of accessing cloud data cause new challenges in business-critical applications.

What does this mean for an enterprise that needs to access its data from a number of diverse cloud sources? What light saber exists in today's world to aid IT managers and application developers in these fierce wars? How can we keep up with this explosion of data sources in the cloud? One of the biggest weapons today's IT workers have at their disposal is a premium data connectivity service. Point-to-point connectivity might be available, but creating unique access calls that conform to every database API becomes unwieldy and complex. There are too many different APIs and too many different versions of those APIs, making your application far too complicated to maintain. For on-premise applications, the changes across all of these cloud data sources are simply too frequent to manage.
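
The maintenance burden shows up quickly in code. A hypothetical sketch of what point-to-point access tends to look like once a few sources are involved - every branch is tied to a vendor API and version that can change underneath you (all names and URLs here are invented):

```java
// A hypothetical sketch of point-to-point connectivity: every cloud source
// gets its own code path, client library, and versioned API to track.
public class PointToPointAccess {

    public String fetchCustomers(String source) {
        switch (source) {
            case "crm-v31":
                // Vendor A's REST API, version 31 - the payload changed from v30.
                return callRest("https://api.crm.example/v31/customers");
            case "warehouse":
                // Vendor B speaks a SQL dialect over its own wire protocol.
                return callSql("SELECT * FROM customers");
            case "docstore":
                // Vendor C wants a JSON query document.
                return callRest("https://docs.example/_find?{\"city\":\"Boston\"}");
            default:
                throw new IllegalArgumentException("No handler for " + source);
        }
    }

    // Stubs standing in for three different vendor client libraries.
    private String callRest(String url) { return "..."; }
    private String callSql(String query) { return "..."; }
}
```

Every new source, and every version bump to an existing one, means another branch to write, test, and ship.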

Controlling the Chaos through a Connectivity Service
There is a much better way to connect to and access the multitude of cloud data sources: a single pipe into a connectivity management service that sits in the cloud. The call from your application conforms to standard SQL, along with a quick selection of which cloud data source you need to connect to. The connectivity service executes the SQL query against the appropriate cloud data source, managing all of the complexity, APIs, and version control itself so that your application doesn't have to. This Connectivity as a Service provides standards-based SQL access and connectivity management to the cloud. The service allows you to pay for only what you consume or for however many cloud data sources you need to reach. It enables you to focus on your application while the connectivity management service keeps up with versions and API changes.
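
From the application's point of view, that can be as simple as one standards-based connection where the target source is just a parameter. A sketch of the idea - the driver URL scheme and the "datasource" property are invented for illustration, not any vendor's actual API:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class ConnectivityServiceExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "app");
        props.setProperty("password", "secret");
        // Invented property: tells the service which cloud source to route to.
        props.setProperty("datasource", "salesforce-production");

        // Invented URL scheme for a cloud connectivity service; the service,
        // not the application, tracks each source's API versions and dialects.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:cloudsvc://connect.example.com:443/broker", props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT name, amount FROM opportunities")) {
            while (rs.next()) {
                System.out.println(rs.getString("name") + ": " + rs.getDouble("amount"));
            }
        }
    }
}
```

Pointing the same code at a different cloud source would mean changing one property, not rewriting the access layer.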

Data sources can be added continuously with no changes required in your application. This is the beauty of Connectivity as a Service: it enables you to access the cloud world through a single source. A service offering that leverages database drivers for cloud-based data sources is ideal. Database drivers come to the rescue by unlocking data and helping to move it freely between various databases, which facilitates fast decision-making. SQL drivers such as ODBC and JDBC add tremendous value when it comes to database connectivity, especially when accessing high-volume, critical systems. These drivers are compatible with virtually every major database, offering superior performance, resource efficiency, and codeless configuration.

Even as the database wars heat up, premium data connectivity solutions will help you cool down - accessing and analyzing your data no matter where it may live.

More Stories By Jeff Reser

Jeff Reser is the Senior Manager, Technical Marketing at Progress Software. Before that, he was responsible for Business Process Management Solutions Marketing. Prior to Progress, he spent more than 25 years at IBM where he held a number of technical and management positions and was instrumental in developing and product managing IBM’s WebSphere Application Server – from its inception to an expanding and very successful portfolio of products. With over 30 years of experience in software technologies, product management, and product marketing, Jeff’s areas of expertise include Big Data & Cloud connectivity, Business Process Management, Business Rules Management, Web application serving, and mobile computing.
