Challenges in Data Access for New Age Data Sources

SQL vs. NoSQL vs. NewSQL

The Big Data and Cloud "movements" have acted as catalysts for tremendous growth in fit-for-purpose databases. Along with this growth, we see a new set of challenges in how we access the data through our business-critical applications. Let's take a brief look at the evolution of these data access methods (and why we are in the mess we are in today).

The Evolution of Data Sources
Back in the '80s, the development of relational databases brought with it a standardized SQL protocol that could be easily implemented within mainframe applications to query and manipulate the data. These relational database systems supported transactions in a very reliable fashion through what was called "ACID" compliance (Atomicity, Consistency, Isolation, and Durability). These databases provided a very structured method of dealing with data. But ACID compliance also brought along a lot of overhead processing. Hence a downfall: they were not optimized to handle large transaction requests, nor could they handle huge volumes of transactions. To counteract this, we made some significant performance and throughput enhancements within data connectivity drivers that lit a fire under SQL speeds and connectivity efficiencies.
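
To ground the ACID discussion, here is a minimal sketch of transactional access over JDBC. The connection URL, credentials, and accounts table are hypothetical placeholders; the pattern would look much the same against any JDBC-compliant relational database.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class AcidTransferDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical JDBC URL and credentials -- substitute your own database.
        String url = "jdbc:postgresql://localhost:5432/bank";
        try (Connection conn = DriverManager.getConnection(url, "app", "secret")) {
            conn.setAutoCommit(false); // group the two updates into one ACID transaction
            try (PreparedStatement debit = conn.prepareStatement(
                     "UPDATE accounts SET balance = balance - ? WHERE id = ?");
                 PreparedStatement credit = conn.prepareStatement(
                     "UPDATE accounts SET balance = balance + ? WHERE id = ?")) {
                debit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                debit.setInt(2, 1);
                debit.executeUpdate();
                credit.setBigDecimal(1, new java.math.BigDecimal("100.00"));
                credit.setInt(2, 2);
                credit.executeUpdate();
                conn.commit();   // both updates become durable together...
            } catch (SQLException e) {
                conn.rollback(); // ...or neither does (atomicity)
                throw e;
            }
        }
    }
}
```

The commit/rollback pairing is exactly the "overhead processing" the article describes: reliability bought at the cost of throughput.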

Now move to the '90s and early 2000s, when vendors were experimenting with more efficient ways of storing data, culminating in the advent of "NoSQL" (aka Not Only SQL). We now have multiple applications trying to access a database, with new requirements for performance, scalability, and volume. These databases employ one of several different storage models:

  • Key-Value, designed to handle massive amounts of data
  • Column-oriented (BigTable-style), wide-column stores based on Google's Bigtable design
  • Document, similar to key-value except that the value is a structured document
  • Graph, modeling the structure and relationships of the data and scaling with its complexity

These databases sacrificed transaction-oriented access for speed and scalability. However, there was no standardized, optimized access method like SQL. In general, the best way to query was through REST APIs and Web services. Each NoSQL database usually had its own proprietary access method, which forced frequent API changes in your applications when dealing with multiple databases. And with some packaged applications, those frequent modifications may not even be possible.
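
As an illustration of that REST-first access pattern, here is a hedged sketch of querying a hypothetical document store over HTTP using Java's built-in HttpClient. The endpoint, path, and query syntax are invented for illustration; every NoSQL product defines its own.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NoSqlRestQuery {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Hypothetical document-store endpoint; each NoSQL product exposes its own URL scheme.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://docstore.example.com/orders/_find?customer=42"))
                .header("Accept", "application/json")
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // The response body is vendor-specific JSON, not a SQL result set --
        // this is exactly the proprietary surface that keeps changing under your application.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```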

That brings us to the needs of today: multiple applications requiring access to multiple fit-for-purpose databases that use alternate data storage models and need different access methods. Here comes NewSQL, which is supposed to fill the gap left by NoSQL with better support for ACID transactions while retaining NoSQL's performance and scalability characteristics. NewSQL fulfills the needs of today's data markets with highly scalable, high-performance, transaction-intensive data store options. Adoption of these NewSQL alternatives is slow so far, but I would expect to see a rise once more tools support them. The challenge here is having to rewrite how we access this data. The access method is a hybrid SQL, so it will take some effort before more vendor tools and middleware drivers support it. Plus, the impact on application development will have to be considered, given the new ways required to access the data.
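
In practice, many NewSQL engines soften the rewrite problem by reusing an existing SQL wire protocol. The sketch below assumes a hypothetical NewSQL endpoint that speaks the PostgreSQL protocol (as, for example, CockroachDB does), so the stock JDBC driver and plain SQL work unchanged; the host, port, and schema are placeholders.

```java
import java.sql.*;

public class NewSqlProbe {
    public static void main(String[] args) throws SQLException {
        // Hypothetical URL: several NewSQL engines reuse the PostgreSQL wire protocol,
        // so the standard PostgreSQL JDBC driver and ordinary SQL often work unchanged.
        String url = "jdbc:postgresql://newsql.example.com:26257/app";
        try (Connection conn = DriverManager.getConnection(url, "app", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT id, balance FROM accounts ORDER BY id LIMIT 5")) {
            while (rs.next()) {
                System.out.printf("%d -> %s%n", rs.getInt("id"), rs.getBigDecimal("balance"));
            }
        }
    }
}
```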

All of these - SQL, NoSQL, NewSQL (and more, like in-memory) - have a distinct place in today's world of data. Each is tailored to fulfill the needs of megatrends like Big Data and cloud computing. They've opened up new markets for better access methods that have low impact on existing business-critical applications. And being able to connect to the world's data in a fast and consistent fashion will continue to be the key to the data castle.

Database Wars - Only This Time in the Cloud
If you've been in the technology business long enough, you remember the "database wars" of the 1990s. During that era, more than 15 different types of databases were vying to house your company's data. There were so many options that knowing where to house your data, and how to access it, became quite an issue. However, as Y2K rolled around, the field of database vendors dwindled back down to a much more manageable number.

So much content is generated these days (500 TB a day on Facebook alone) that accelerated processing power and disk storage access are required. Today, with offerings from major players like Oracle and Salesforce, along with open source options like Apache Hive on Hadoop, we are getting back up there in terms of database offerings. The industry is once again inundated with databases, which is causing data to be siloed.

We can thank two megatrends for the explosion of databases that are flooding the market today. The first is Big Data. Every day, we create 2.5 quintillion bytes of data - so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This data is BIG Data, and the volume of it that needs to be managed by applications is increasing dramatically. But it's not only a volume problem, because the velocity and variety of data are increasing as well. For data at rest, like the petabytes managed by the world's largest Hadoop clusters, we need to be able to access it quickly, reliably, and securely. For data in motion, like your location, we need to analyze it and respond immediately, before the window on the fleeting opportunity or preventable threat closes. Big Data and the introduction of Apache Hadoop as a high-volume distributed file system have drawn a line in the sand for the first battle in the new database wars.

The second is cloud computing. Cloud is reshaping the way we as an industry build and deploy software. The economics and usability of cloud are clear - cloud is enabling the next generation of ISVs and applications to be built in less time, at lower cost, all while increasing the scalability and resiliency of the applications we produce. In fact, ISVs are ahead of the curve - according to Gartner, over 50% of ISVs will be building pure cloud applications within the next three years, and 20% of IT spending over that period will go to cloud- and SaaS-based services. The use of hybrid applications will exceed both on-premise and pure cloud in the near term as the market transitions from on-premise to pure cloud. Big Data and cloud are changing the rules for how we access and use data. They are changing the rules for how we can all uncover the "dark data" as we mine the new wave of databases.

Alternative Data Management Technologies Fuel the Fire
The database wars today are being fueled by key factors that drive the adoption of up-and-coming data management technologies. According to 451 Research, these factors include scalability, performance, relaxed consistency, agility, intricacy, and necessity. NoSQL projects were developed in response to the failure of existing suppliers to meet the performance, scalability and flexibility needs of large-scale data processing, particularly for Web and cloud applications. While the NoSQL offerings are closely associated with Web application providers, the same drivers have spurred the adoption of data-grid/caching products and the emergence of a new breed of relational database products and vendors. For the most part, these database alternatives are not designed to directly replace existing products, but to offer purpose-built alternatives for workloads that are unsuited to general-purpose relational databases. NewSQL and data-grid products have emerged to meet similar requirements among enterprises, a sector that is now also being targeted by NoSQL vendors. The list of new database players with alternative management methods is growing seemingly exponentially. In today's wars, the backdrop is no longer the on-premise databases of yesteryear; today's wars are happening in the cloud. The new rules of accessing cloud data cause new challenges in business-critical applications.

What does this mean for an enterprise that needs to access its data from a number of diverse cloud sources? What lightsaber exists in today's world to aid IT managers and application developers in these fierce wars? How can we keep up with this explosion in cloud data sources? One of the biggest weapons today's IT workers have at their disposal is a premium data connectivity service. Point-to-point connectivity might be available, but creating unique access calls that conform to every database API becomes too unwieldy and complex. There are too many different APIs, and too many different versions of those APIs, making your application far too complicated to maintain. For on-premise applications, the changes across all of these cloud data sources are simply too frequent to manage.

Controlling the Chaos through a Connectivity Service
There is a much better way to connect to and access the multitude of cloud data sources - a single pipe into a connectivity management service that sits in the cloud. The call from your application conforms to standard SQL, along with a quick selection of which cloud data source you need to connect to. The connectivity service executes the SQL query against the appropriate cloud data source, managing all of the complexity, APIs, and version control itself so that your application doesn't have to. This Connectivity as a Service provides standards-based SQL access and connectivity management to the cloud. The service allows you to pay only for what you consume, or for how many cloud data sources you need to reach. It enables you to focus on your application, while the connectivity management service keeps up with versions and API changes.
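
Here is a hedged sketch of what that single pipe might look like from application code. The service endpoint, JDBC URL scheme, and "datasource" selector below are entirely hypothetical, invented to illustrate the routing idea rather than any particular vendor's driver.

```java
import java.sql.*;

public class ConnectivityServiceDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical: one cloud connectivity service multiplexes many cloud data
        // sources behind a single JDBC endpoint. The "datasource" property selects
        // which backing store the service should route the SQL to.
        String url = "jdbc:connectsvc://connect.example.com:443;datasource=salesforce";
        try (Connection conn = DriverManager.getConnection(url, "tenant-user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 "SELECT name, annual_revenue FROM account WHERE industry = ?")) {
            ps.setString(1, "Retail");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " -> " + rs.getBigDecimal(2));
                }
            }
        }
        // Switching to another cloud source is just a different "datasource" value;
        // the service keeps pace with each backend's API and version changes.
    }
}
```

The design point is that the application speaks one dialect (standard SQL over JDBC) and the service absorbs every backend-specific API behind it.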

Data sources can be added continuously with no changes required in your application. This is the beauty of Connectivity as a Service, enabling you to access the cloud world through a single source. A service offering that leverages database drivers for cloud-based data sources is ideal. Database drivers come to the rescue by unlocking data and helping to move it freely between various databases, which facilitates fast decision-making. SQL drivers such as ODBC and JDBC add tremendous value when it comes to database connectivity, especially when accessing high-volume, critical systems. These drivers are compatible with essentially every major database, offering strong performance, resource efficiency, and codeless configuration.
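
That driver-level portability is easy to see in a small sketch: the same JDBC code path can count rows in a conventional relational database and in a Hive warehouse, with only the connection URL changing. The hosts and table names are placeholders; only the jdbc:hive2 prefix reflects the actual Hive JDBC driver.

```java
import java.sql.*;

public class DriverPortabilityDemo {
    // The same JDBC code path works against any database with a conforming driver;
    // only the URL (and the driver jar on the classpath) changes.
    static void rowCount(String url, String table) throws SQLException {
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // NOTE: table comes from trusted config here; never splice user input into SQL.
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            System.out.println(url + " : " + table + " = " + rs.getLong(1));
        }
    }

    public static void main(String[] args) throws SQLException {
        // Hypothetical URLs for two very different backends -- substitute real ones.
        rowCount("jdbc:postgresql://localhost:5432/sales?user=app&password=secret", "orders");
        rowCount("jdbc:hive2://warehouse.example.com:10000/default", "clickstream");
    }
}
```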

Even as the database wars heat up, premium data connectivity solutions will help you cool down - accessing and analyzing your data no matter where it may live.

More Stories By Jeff Reser

Jeff Reser is the Senior Manager, Technical Marketing at Progress Software. Before that, he was responsible for Business Process Management Solutions Marketing. Prior to Progress, he spent more than 25 years at IBM where he held a number of technical and management positions and was instrumental in developing and product managing IBM’s WebSphere Application Server – from its inception to an expanding and very successful portfolio of products. With over 30 years of experience in software technologies, product management, and product marketing, Jeff’s areas of expertise include Big Data & Cloud connectivity, Business Process Management, Business Rules Management, Web application serving, and mobile computing.


