Achieving Scale and Performance in the Cloud

To achieve the full benefit of cloud computing, all assets need to be fully elastic

New breakthroughs in cloud-based data management give databases the elasticity they need to be truly responsive to the ebb and flow of supply and demand.

Cloud computing allows all capital assets - computing power, memory and storage for example - to be exchanged at the best price, giving everyone the best value for their money. Like any free market, it will only deliver its full benefits to buyers and sellers if the right conditions are available. There can be no barriers to entry, and assets in the cloud must be capable of free movement.

Unfortunately, the unsuitability of traditional relational database management systems (RDBMS) has created such a blockage. Their lack of elasticity or liquidity demobilizes computing resources. However, new developments in cloud database technology (like database bursting and hibernating functions) show how the database component can have the necessary fluidity to bring cloud computing closer to 'perfect market' conditions and begin to deliver its full benefits.

Consider the empirical evidence of how the fluidity of assets is changing. The limitations of a traditional relational database (RDB) meant that it ran on its own server. By contrast, in March 2013 HP demonstrated how 72,000 databases were able to share a single HP Moonshot server. Not only could they co-exist, they were able to ramp up and wind down their use of the computing capacity, so that trade-offs of processing power, storage and memory could be made as their own supply and demand changed.

Each database was constructed using a new, agile set of processes designed to give it the fluidity described above. If a database is inactive, it can effectively be switched off, and the capacity it was holding (memory, storage and processing power) re-allocated to the rest of the pool. This relatively new concept, known as hibernation, brings a new fluidity to cloud computing: it allows assets to be mobilized between different consumers, namely the databases sharing those servers.
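The bookkeeping behind hibernation can be sketched very simply. This is an illustrative model only: the `ResourcePool`, `DatabaseSlot`, `hibernate` and `wake` names are hypothetical, not any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    """Capacity currently free for any database on the server to claim."""
    free_memory_mb: int
    free_cpu_cores: int

@dataclass
class DatabaseSlot:
    """One database and the resources allocated to it."""
    name: str
    memory_mb: int
    cpu_cores: int
    hibernating: bool = False

def hibernate(db: DatabaseSlot, pool: ResourcePool) -> None:
    """Switch an idle database off and return its capacity to the shared pool."""
    if not db.hibernating:
        pool.free_memory_mb += db.memory_mb
        pool.free_cpu_cores += db.cpu_cores
        db.hibernating = True

def wake(db: DatabaseSlot, pool: ResourcePool) -> None:
    """Restart a hibernating database, claiming its capacity back from the pool."""
    if db.hibernating:
        pool.free_memory_mb -= db.memory_mb
        pool.free_cpu_cores -= db.cpu_cores
        db.hibernating = False
```

The point of the sketch is the symmetry: capacity released on hibernation is exactly what is reclaimed on wake, so idle databases cost the pool nothing.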

The active ingredient in the new cloud management operating systems is software that can monitor a server and understand the activity in each database. It can see whether its current workload justifies the amount of resources (such as memory, storage, and processing power) that are currently allocated, and make an intelligent decision on whether to adjust it. In terms of the free market, the management system enjoys perfect information and asset liquidity.

New technology enables server management software to spot when databases are not active and put them in hibernation until they next need to be allocated resources. On the other hand, if a SQL client needs to access the database, it can be restarted instantly.

Another mobility improvement stems from the fact that databases are not geographically restricted anymore. Unlike RDBs, they can spread onto multiple servers across multiple locations. Databases can both burst out and hibernate, expanding and contracting in immediate response to the demands made on them by users. They have complete elasticity in response to the unpredictable demands of their user base.
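Bursting across servers reduces, at its simplest, to matching node count to demand. The per-node throughput figure below is invented purely for illustration; real capacity planning would measure it.

```python
import math

NODE_CAPACITY_TPS = 10_000  # hypothetical throughput one node can sustain

def nodes_needed(demand_tps: int, minimum: int = 1) -> int:
    """Burst out (or shrink back) to just enough nodes for current demand."""
    return max(minimum, math.ceil(demand_tps / NODE_CAPACITY_TPS))
```

At 35,000 transactions per second this calls for four nodes; when demand falls back to near zero, the cluster contracts to its minimum footprint rather than idling paid-for capacity.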

Traditionally, it didn't matter that an RDB sat on a single server on one site, because the market conditions for computing resources were a lot simpler. The volume of demand was a tiny fraction of today's. When RDBs were conceived, the idea of computing 1,000 transactions per second was thought to be beyond the realm of possibility. These days Facebook handles a million transactions per second across its various data centers at peak times.

Nobody can afford to buy their IT capacity the way they did in the '80s. It is technically impossible and certainly unwise to buy a data processing environment that covers all possibilities, and then leave much of it unused for 90 percent of the time, because the upper limits of demand are so much more extreme. There needs to be a fluid market for computing capacity, which the cloud could provide.

Many of the ideal conditions needed for perfect cloud computing, as defined by analyst firm Gartner, are achievable now. Any cloud computing environment can fairly easily meet Gartner's stipulations that it be Internet-enabled, service-based, metered and shared. But one quality has eluded us: elasticity. Without elasticity, resources cannot be shared freely, and the exchange of data processing goods and services lacks the fluidity to react quickly to changing market conditions.

As a result, users of cloud computing risk either overprovisioning or underprovisioning their computing resources. In a world where markets fluctuate and customers are as mobile and quick to migrate as any other element in the economic equation, underestimating your IT resources is not an option.

That creates a massive challenge for the CIO. When online demand fluctuates as dramatically as it does today, how does one estimate the maximum limits of demand? In peak times, Amazon's trade rises by 500,000 transactions per second above its routine level of activity.

When RDBs were conceived, market conditions were very different. Databases were smaller, transactions were much simpler and didn't involve unstructured data, and the peaks and troughs of activity were less extreme. The limitations of the RDB, such as the rigidity it imposed on the computing model, were less telling, because enterprises could afford to overprovision their processing power, storage and memory, even if that meant it sat unused 90 percent of the time.

Today elasticity is an option, if you redesign the way the database works. But massively overprovisioning capacity is not, as the peak of activity is far higher - not just for Amazon but for all online traders.

The cost of these transactions can be managed with the efficiencies of the cloud. According to the independent Yahoo Cloud Serving Benchmark, the most cost-efficient elastic cloud database on the market can generate 4,368,421 transactions per second for every dollar invested in cloud computing.
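Taking the article's quoted figure at face value, the implied cost per transaction is easy to work out:

```python
# Figure quoted in the article (Yahoo Cloud Serving Benchmark claim),
# taken at face value for this back-of-envelope calculation.
TRANSACTIONS_PER_DOLLAR = 4_368_421

def cost_for_transactions(txns: int) -> float:
    """Dollars needed for a given number of transactions at the quoted rate."""
    return txns / TRANSACTIONS_PER_DOLLAR
```

At that rate a million transactions cost roughly 23 cents, which is the sense in which the cost per individual transaction becomes astonishingly low.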

This astonishingly low cost per transaction is achieved by creating a near-perfect market for computing assets. Aggregating databases has created huge economies of scale while taking advantage of the commoditization of hardware. Coupled with a far more efficient distribution model, these factors have driven the price per transaction down dramatically.

While other aspects of IT (such as the development of CPU, memory and software) have continued to improve exponentially every 18 months for two decades, database technology has remained relatively static. However, new cloud data management systems have released this handbrake and could create a sudden surge in mobility.

More Stories By Wiqar Chaudry

Wiqar Chaudry, Tech Evangelist & Director of Product Marketing, NuoDB, Inc., is an IT professional with over a decade of experience in database systems and web technologies. He has been responsible for designing large-scale data warehouses for the Fortune 500 and has played key roles in several data-centric start-ups as a solutions architect, sales engineer and product manager. Wiqar holds a BBA in Finance from Temple University and an MS in Computer Information Systems from Boston University. In his spare time he writes a blog on collaborative economics. You can follow him on Twitter @WiqarC.

