Bringing New Intelligence to Cloud Orchestration with Cloudify 3.0

Full product rewrite delivers software-defined orchestration for applications on the cloud

GigaSpaces Technologies on Monday announced it has completely re-architected its Cloudify offering to provide Intelligent Orchestration of applications on the cloud. With this product rewrite, the new Cloudify orchestration platform simplifies the application deployment, management and scaling experience on OpenStack, VMware vSphere and other clouds and environments.

"To deliver this next generation, intelligent orchestration, we needed to rethink Cloudify's design," said Yaron Parasol, VP of Product at GigaSpaces. "With a new language of code, adoption of industry standards and development of scalable and custom workflows, we created something that few are doing today - orchestration of the entire app lifecycle that encompasses both pre-deployment and post-deployment management with a single platform."

In current orchestration models, most tools focus primarily on application installation, while much of application management occurs after deployment. As a result, vast custom tool chains are often used to manually manage post-deployment processes such as monitoring and logging, leading to significant overhead, complexity and inconsistency across systems.

Cloudify's redesign provides a simple solution for managing the full application lifecycle. The new intelligent orchestration model introduces a feedback loop that automates fixes and updates without manual intervention, all with a single platform that integrates with any tool chain. Cloudify 3.0 reduces the complexity of cloud application management and ensures that managed applications meet their desired SLA.

Cloudify 3.0 Highlights:
New intelligence in orchestration: Cloudify 3.0 eliminates the boundaries between orchestration and monitoring, providing a mechanism that automatically reacts to monitored events with appropriate corrective measures. Version 3.0 includes the building blocks for this: custom workflows, a workflow engine and a modeling language that enable the automation of any process and any stack. The subsequent release (due in Q4 2014) will introduce monitoring and custom policies that automatically trigger such corrective measures, providing auto-healing and auto-scaling capabilities.
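
As a rough illustration of that feedback loop, the sketch below maps monitored event types to the corrective workflow that should handle them. Every name in it is a hypothetical stand-in for the policies and workflows described above, not Cloudify's actual API:

    # Illustrative only: a policy table maps monitored event types to the
    # corrective workflow that should handle them (auto-heal, auto-scale).
    CORRECTIVE_WORKFLOWS = {
        'instance_unreachable': 'heal',
        'cpu_above_threshold': 'scale_out',
    }


    def on_monitoring_event(event_type, node_id, trigger_workflow):
        """React to a monitored event by triggering the matching workflow."""
        workflow = CORRECTIVE_WORKFLOWS.get(event_type)
        if workflow is not None:
            trigger_workflow(workflow, node_id=node_id)


    # Example: a failed health check on web_server_1 triggers the 'heal' workflow.
    on_monitoring_event('instance_unreachable', 'web_server_1',
                        lambda name, **kw: print('running workflow', name, kw))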

Integrating the entire automation and monitoring tool chain: Cloudify 3.0 brings together a variety of tools used throughout the stages of the application lifecycle. In doing so, Cloudify promotes common industry best practices, making it easy to integrate and use best-of-breed tools to manage the environment. A new plug-in architecture enables easy integration of a wide range of tools for monitoring, configuration management and cloud infrastructure. Examples of such integrations include Chef, Puppet, Fabric and Docker for configuration management, OpenStack Heat for infrastructure orchestration, Logstash and Elasticsearch for logging and monitoring, and Riemann for real-time analytics.
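
As a rough sketch of what that plug-in model implies, a plug-in is essentially a set of Python operations the orchestrator invokes against nodes in the topology. The import path, decorator and context attributes below are assumptions based on the 3.x plug-in conventions rather than a verbatim copy of the documented API:

    from cloudify.decorators import operation  # assumed 3.x plug-in import path


    @operation
    def install_monitoring_agent(ctx, agent_package=None, **kwargs):
        """Install a monitoring agent on the node this operation targets."""
        # A real plug-in would shell out to Chef, Puppet or Fabric here;
        # this sketch only logs what it would do.
        ctx.logger.info('Installing monitoring agent: {0}'.format(agent_package))

A blueprint would then map a node type's lifecycle interface to operations like this one, which is how configuration management and monitoring tools get wired into the orchestration flow.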

Native integration with OpenStack: As OpenStack is fast becoming the de facto standard for private clouds, Cloudify 3.0 offers even tighter integration with OpenStack technology and core services, including Keystone, Neutron, Nova and Heat. The underlying design of Cloudify was re-architected to match the design principles of OpenStack services, including rewriting the core services in Python and leveraging common infrastructure building blocks such as RabbitMQ.

Support for VMware, CloudStack, SoftLayer and other clouds: Cloudify 3.0 contains built-in plug-ins for VMware vSphere and Apache CloudStack, with plug-ins for vCloud and SoftLayer coming soon. It also comes with an open plug-in architecture to support other clouds, including Amazon AWS, GCE and Linux containers such as Docker (plug-ins for which will be released in the coming weeks). With Cloudify 3.0, users can span the same application across multiple cloud environments without creating a new Cloudify setup per environment. This is useful for users transitioning from an existing environment to the cloud, and for bursting and hybrid deployments across OpenStack, VMware, Amazon and other clouds.

New topology-driven monitoring: Cloudify 3.0 introduces a new concept of topology-driven monitoring, in which the entire application management and tracking system is centered on the application topology rather than the infrastructure. This makes it possible at any given moment to track not only the state of the application, but also the status of deployment, update and scaling processes, through a single view. Because the monitoring system is integrated with the orchestration engine, the two are always in sync and up to date, eliminating the need to rely on external discovery services.

Support for the TOSCA specification: TOSCA (Topology and Orchestration Specification for Cloud Applications) allows users to describe any set of automation processes on cloud applications with an extendable set of hooks and component types. Cloudify 3.0 uses a YAML-based orchestration template driven by the TOSCA specification. The next release of Cloudify will include full syntax compatibility with the TOSCA specification as soon as it becomes officially supported.
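
To make the template idea concrete, here is a minimal sketch. The node types, properties and relationship names are simplified stand-ins for the Cloudify DSL rather than its exact syntax, and PyYAML (an assumed dependency) is used only to show that the topology is plain, parseable YAML:

    import yaml  # PyYAML, assumed to be available for this sketch

    # Simplified TOSCA-style blueprint fragment with illustrative names.
    BLUEPRINT = """
    node_templates:
      web_server_host:
        type: cloudify.openstack.nodes.Server
        properties:
          flavor: m1.small
      web_server:
        type: cloudify.nodes.WebServer
        relationships:
          - type: cloudify.relationships.contained_in
            target: web_server_host
    """

    topology = yaml.safe_load(BLUEPRINT)
    for name, node in topology['node_templates'].items():
        print(name, '->', node['type'])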

Multiple applications spanning thousands of nodes: Cloudify 3.0 manages and monitors large-scale applications, using a message broker to handle communication with its managed instances and a logging and analysis engine built for Big Data scale.
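
The broker-based design can be sketched in a few lines against RabbitMQ, which is named above as one of Cloudify's building blocks. The queue name and task payload are illustrative rather than Cloudify's actual wire format, and the pika client plus a broker on localhost are assumed:

    import json

    import pika  # RabbitMQ client library, assumed to be installed

    # Publish a task for the agent of one managed instance. Each managed
    # instance consumes from its own queue, so the manager never needs to
    # track connections per instance, which is what lets deployments scale.
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='agent.web_server_1', durable=True)

    task = {'operation': 'collect_metrics', 'node_id': 'web_server_1'}
    channel.basic_publish(exchange='', routing_key='agent.web_server_1',
                          body=json.dumps(task))
    connection.close()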

