Cloud Fabric Technology to Solve Challenges of Workload Portability

Meet Zerto at Cloud Expo New York

Zerto on Wednesday announced its strategy to create a new infrastructure layer called "Cloud Fabric" that allows organizations to seamlessly move and protect virtualized workloads between public, private and hybrid clouds across leading hypervisors and cloud providers.

Gil Levonai, VP of marketing and products at Zerto, stated, "The notion of a Cloud Fabric is a concept that will evolve over time, but certainly includes the ability to protect and mobilize production workloads between VMware, Microsoft, Amazon, OpenStack and, at a later stage, between any cloud or hypervisor. This is vital for both providers and customers to avoid lock-in and to retain their ability to choose any IT environment that fits their business needs."

Through hypervisor-based replication technology, Zerto enables the replication, orchestration, reporting and monitoring of BC/DR and migration operations across multiple sites for cloud service providers (CSPs) and enterprises. In delivering recovery, mobility and data protection solutions to more than 450 enterprises, including many Fortune 1000 companies, Zerto identified the key functionality required for production workloads to utilize any cloud:

  • A powerful transport layer for data and applications - one that is cross-hypervisor and hardware-agnostic
  • Orchestration of the mobility of complex applications
  • Encapsulation of all of an application's dependencies, such as boot order, IP configuration and more (a sketch of such a manifest follows this list)
  • Production-level tools for the highest service levels of data mobility and protection - so that mobility of workloads is easy to manage and report on
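
To make the encapsulation point concrete, below is a minimal, hypothetical sketch in Python of what such an application manifest might look like. The names (WorkloadManifest, VMSpec) and their fields are illustrative assumptions for this article, not Zerto's actual API or data format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical illustration only: models the kind of application
# "encapsulation" described above (boot order, IP configuration, etc.).

@dataclass
class VMSpec:
    name: str
    boot_order: int                       # position in the application's boot sequence
    failover_ip: Optional[str] = None     # re-IP address to apply at the target site
    failover_subnet: Optional[str] = None

@dataclass
class WorkloadManifest:
    application: str
    source_site: str                      # e.g. an on-premises VMware cluster
    target_site: str                      # e.g. a public-cloud region
    rpo_seconds: int                      # replication recovery point objective
    vms: List[VMSpec] = field(default_factory=list)

    def boot_sequence(self) -> List[str]:
        """Return VM names in the order they should be powered on at failover."""
        return [vm.name for vm in sorted(self.vms, key=lambda v: v.boot_order)]

# Example: a three-tier application whose dependencies travel with it between clouds.
manifest = WorkloadManifest(
    application="order-processing",
    source_site="onprem-vmware",
    target_site="aws-us-east-1",
    rpo_seconds=30,
    vms=[
        VMSpec("db01", boot_order=1, failover_ip="10.20.0.10"),
        VMSpec("app01", boot_order=2, failover_ip="10.20.0.20"),
        VMSpec("web01", boot_order=3, failover_ip="10.20.0.30"),
    ],
)
print(manifest.boot_sequence())   # ['db01', 'app01', 'web01']
```

Carrying this kind of metadata alongside the replicated data is what would allow a failover or migration to re-create the application itself, not just its disks, on a different hypervisor or cloud.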

Specific Zerto Cloud Fabric components will be released throughout this year.

Zerto currently helps more than 100 managed CSPs, including Terremark, Bluelock, Colt and Kelway, replicate workloads across VMware-based clouds.

