Cloud Backup & Disaster Recovery Predictions for 2014

Cloud backup adoption made great strides in 2013 but it was also a year of learning lessons

Author: Nick Mueller, Zetta.net

2013 was a banner year for cloud backup adoption. It was also a year of wake-up calls: simple cloud backup doesn't constitute disaster recovery (DR), transfer speeds are vital, and commodity cloud is a risky business to be in. With that in mind, let's look at the trends shaping cloud backup and cloud DR in 2014.

1. Cloud Backup Vendors with Slow Performance Will Fail
This trend showed up in the news as well-known cloud backup services were shut down. Symantec Backup Exec Cloud was the biggest casualty of slow performance, and other cloud backup products struggled or disappeared because they were not optimized for backup or recovery speed.

Just offering a backup-to-cloud option isn't enough anymore. Users appreciate the scalability and cost-effectiveness of the cloud, but they also want the same level of backup and restore performance they had on-premise. Native optimization for the Internet and high data transfer speeds are the only way to achieve that performance level.

2. MSPs and VARs Will Embrace Cloud Backup as a Profit Center
Many MSPs and VARs want to offer Disaster Recovery as a Service (DRaaS) because their customers want it. But margins are thin, and it's tough to bring in enough revenue to turn a profit, let alone invest in higher-priced services. MSPs know they need to raise recurring revenue and lower client backup management costs, and they need a cost-effective, high-performance service to make that happen. MSPs can build recurring revenue by offering features like these:

  • Optimized cloud backup and recovery, so the MSP's service is at least as fast as its customers' on-premise backup.
  • Verified backup that earns customer trust by taking the burden of continual integrity checking off customers' shoulders (a minimal sketch of what verification can look like follows this list).
  • A high-performance cloud without high cost to the MSP, meaning a solution that's natively optimized for the cloud without the expense of a hardware appliance.
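
As a rough illustration of what "verified backup" can mean in practice, here is a minimal sketch, assuming a generic REST endpoint that accepts PUT uploads and serves the object back with GET; the URL and helper names are hypothetical, not any specific vendor's API. It uploads a file and then confirms the stored copy matches the local original by comparing SHA-256 digests.

```python
# Minimal post-upload verification sketch. Assumption: a generic REST
# endpoint that accepts PUT uploads and serves the object back with GET
# (hypothetical; not any specific vendor's API).
import hashlib
import requests

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a local file in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def upload_and_verify(path, url):
    """Upload a file, then re-read the stored copy and compare digests."""
    local_digest = sha256_of(path)
    with open(path, "rb") as f:
        requests.put(url, data=f, timeout=300).raise_for_status()
    remote = requests.get(url, stream=True, timeout=300)
    remote.raise_for_status()
    remote_digest = hashlib.sha256()
    for chunk in remote.iter_content(chunk_size=1 << 20):
        remote_digest.update(chunk)
    # A mismatch means the backup cannot be trusted for recovery.
    return local_digest == remote_digest.hexdigest()
```

A service that runs this kind of check automatically after every backup job is what spares customers from doing manual test restores.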

Profit margins grow as backend costs shrink, and the MSP earns recurring revenue from happy customers. With those margins, MSPs and VARs can afford to develop premium services and get to market faster.

3. Appliance-Based Backup Solutions Show their Age
A lot of traditional backup vendors extended backup to the cloud using on-premise appliances. Early appliances were an innovative way to collect backup data and then transfer it to the cloud; they were also a good way for backup vendors with big installed bases to keep their customers.

The problem is that an on-premise appliance can be an expensive proposition and does little to accelerate cloud performance. This is particularly awkward if you are an MSP: customer site hardware appliances need repair and replacement, and software appliances need troubleshooting and upgrades.

You can replace appliances with a cloud-native backup offering that builds in features like WAN optimization and multi-channel data transfer over standard REST interfaces. This takes the burden of supporting appliances off the table and speeds up performance.
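
To make "multi-channel" concrete, here is a minimal sketch of parallel chunked uploads over plain HTTP/REST, assuming a hypothetical endpoint that accepts one PUT per chunk (the URL pattern and chunk size are illustrative, not any vendor's actual interface). Production WAN-optimized services layer compression, deduplication, and retry logic on top of the same idea.

```python
# Multi-channel transfer sketch: split a file into chunks and upload them
# over several concurrent HTTP connections so one slow TCP stream does not
# cap throughput. The chunk endpoint URL pattern is a made-up example.
import os
import requests
from concurrent.futures import ThreadPoolExecutor

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MB per chunk

def upload_chunk(path, index, base_url):
    """Read one chunk from disk and PUT it to its own URL."""
    with open(path, "rb") as f:
        f.seek(index * CHUNK_SIZE)
        data = f.read(CHUNK_SIZE)
    resp = requests.put(f"{base_url}/chunks/{index}", data=data, timeout=120)
    resp.raise_for_status()
    return index

def upload_file(path, base_url, channels=8):
    """Upload every chunk of a file using `channels` parallel connections."""
    n_chunks = (os.path.getsize(path) + CHUNK_SIZE - 1) // CHUNK_SIZE
    with ThreadPoolExecutor(max_workers=channels) as pool:
        futures = [pool.submit(upload_chunk, path, i, base_url)
                   for i in range(n_chunks)]
        return sorted(f.result() for f in futures)
```

Each worker opens its own file handle and reads only its chunk, so transfers proceed independently; the service is assumed to reassemble the chunks by index on the other end.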

4. Commodity Cloud Doesn't Work Unless Your Name Is Azure, Amazon, or Google
Big, established cloud hosts make money through huge economies of scale and the sheer volume of data they have under management. Otherwise commodity cloud is a losing game, as Nirvanix can attest. Nirvanix filed for Chapter 11 after betting on building its own cloud infrastructure; in spite of big clients like IBM, it could not generate enough revenue to offset its enormous expenditure.

Moving into 2014, the brightest opportunities in the cloud are specific solutions aimed at businesses that are willing to pay for them. Disaster recovery in the cloud is one of the most attractive offerings going forward. True cloud DR isn't just backup to the cloud; it's architected and purpose-built for enterprise-grade backup and recovery performance.

Zetta, for example, is built specifically for the Internet, with WAN optimization and accelerated cloud performance, and achieves speeds that are often faster than local backup.

5. Companies Realize that Cloud Backup Means Cloud Recovery Too
Recovering data over a slow pipe can be even worse than the initial backup: just when companies need to recover their data fast, slow data transfer speeds threaten the whole recovery process.

Commodity cloud storage behind legacy backup software won't work for companies with high-performance recovery needs. Recovery from the cloud is all about meeting the recovery time objective (RTO), and that's exactly where legacy backup to the cloud falls short. Some vendors get around the problem by copying customer recovery data to disk and trucking it to the customer site. That's better than an impossibly long recovery over the WAN, but hardly the stuff that dreams are made of.
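
A quick back-of-envelope calculation shows why RTO is where plain cloud backup breaks down; the figures below are illustrative assumptions, not benchmarks from any vendor.

```python
# Back-of-envelope recovery-time estimate: hours needed to pull a data set
# back over a WAN link at a given sustained throughput. Illustrative only.
def recovery_hours(data_tb, link_mbps, efficiency=0.7):
    """Hours to restore `data_tb` terabytes over a `link_mbps` link,
    assuming `efficiency` of the line rate is actually sustained."""
    bits = data_tb * 8 * 1000**4            # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 3600

# 2 TB over a 100 Mbps link at 70% efficiency: roughly 63 hours,
# far beyond most recovery time objectives.
print(round(recovery_hours(2, 100), 1))
```

Against numbers like these, the realistic options are an optimized, high-throughput restore path or shipping disks.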

Companies avoid this problem by sending to the cloud only the data they can afford to recover over a longer period of time. They can't apply the economics of the cloud to priority data that is constrained by recovery time unless they turn to an optimized cloud backup and DR offering. We'll see a lot more of this customer movement to cloud-native services in 2014.

These predictions for 2014 are built on the events of 2013 and on what Zetta customers tell us they need and want.

Nick is Zetta's Chief Content Officer, and has been working with writing and social media teams to create digital content since the days when the BBS reigned.

