Why the Cloud Is Disrupting Everything

Cloud is accelerating disruption by changing how data centers deploy, develop & consume everything from software & hardware

Is it just me, or has there been an explosion of buzzwords lately? Don't get me wrong, the IT industry normally innovates at a crazy pace, but it seems that things have been evolving faster than ever and that a fundamental change in the way things are done is underway. We can attribute this change to one thing: the cloud. Cloud computing is by no means new, but in 2014 it has come into its own.

Cloud computing is accelerating disruption by changing how data centers deploy, develop and consume software and hardware, and how they offer products and services to their customers.

Let's take a look at a few of these hot technologies and why you'll be adopting some of them, whether you realize it now or not.

Software-Defined Networking (SDN) - What Is It Anyway?
There are many different descriptions of SDN floating around, partly because this is a relatively new technology and it means different things to different vendors. Until the market matures, this confusion will probably persist. The following explanation provides a good foundation for understanding SDN.

SDN decouples the system that makes decisions about where traffic is sent (the control plane) from the underlying systems that forward traffic to the selected destination (the data plane). The inventors and vendors of these systems claim that this simplifies networking.[1] Through the controller, network administrators can quickly and easily make and push out decisions on how the underlying systems (switches, routers) of the forwarding plane will handle the traffic.

SDN requires some method for the control plane to communicate with the data plane. One such mechanism, OpenFlow, is often misunderstood to be equivalent to SDN, but other mechanisms could also fit into the concept.
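
As a concrete illustration of that separation, the sketch below shows how an administrator-facing script might hand a forwarding decision to an SDN controller's northbound REST API, leaving the controller to push the corresponding rules down to the switches. The controller address, URL path and JSON schema here are hypothetical placeholders; real controllers such as OpenDaylight or ONOS define their own REST endpoints and payload formats.

```python
# Minimal sketch: pushing a forwarding decision to an SDN controller.
# The endpoint and payload format are hypothetical, for illustration only.
import requests

CONTROLLER = "http://sdn-controller.example.com:8181"   # hypothetical controller address

flow_rule = {
    "switch": "openflow:1",                # which forwarding-plane device should apply the rule
    "priority": 100,
    "match": {"ipv4_dst": "10.0.0.5/32"},  # traffic destined for this host...
    "action": {"output_port": 3},          # ...should be forwarded out port 3
}

# The control plane (the controller) translates this intent into protocol
# messages (e.g., OpenFlow) and pushes them down to the data plane.
resp = requests.post(f"{CONTROLLER}/api/flows", json=flow_rule, timeout=5)
resp.raise_for_status()
print("Flow rule accepted by the controller")
```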

By separating the control plane from the forwarding plane, data centers can reduce costs and gain agility, and who wouldn't want or need that? SDN does this by:

  1. Reducing reliance on expensive, purpose-built, ASIC-based networking hardware and the associated pay-as-you-grow models that often result in costly overprovisioning. In other words, you can unlock more value from your network.
  2. Increasing programmability, which makes network scalability, system design and management easier.
  3. Improving agility and flexibility. Everybody needs them, everybody wants them, and SDN can deliver them: organizations can deploy new infrastructure, applications and services faster than a traditional network would allow.

OpenFlow
Often people use OpenFlow and SDN interchangeably, but they are not the same. OpenFlow is only one element in the overall SDN architecture: an open standard for a communications protocol that enables the control plane to interact with the forwarding plane. As an open standard, it is steered by the Open Networking Foundation. OpenFlow is not the only approach available or in development for SDN; the open source network operating system ONOS, led by the Open Networking Lab (ON.Lab), is another option.
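
For a flavor of what the southbound side looks like, here is a minimal sketch of an OpenFlow controller application written with the open source Ryu framework (Ryu is not mentioned in the article; it simply stands in for any OpenFlow-capable controller). When a switch connects, the app installs a low-priority "table-miss" rule that sends unmatched packets up to the controller, which is the typical starting point for programming the forwarding plane.

```python
# Minimal Ryu app: install a table-miss flow entry when a switch connects.
# Illustrative only; any OpenFlow 1.3-capable controller could play this role.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class SimpleFlowPusher(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        datapath = ev.msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser

        # Match everything; unmatched traffic goes to the controller for a decision.
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                          ofproto.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions)]

        # Priority 0 makes this the fallback ("table-miss") rule.
        mod = parser.OFPFlowMod(datapath=datapath, priority=0,
                                match=match, instructions=inst)
        datapath.send_msg(mod)
```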

Network Functions Virtualization (NFV)
This is another term that can mean different things to different people, depending on the industry. For our purposes, we'll focus on what it means to the telecom industry. To understand what has propelled the development of NFV, let's take a look at how the telecom industry has traditionally deployed its networks. For more than 30 years, telecoms have relied on purpose-built systems from vendors such as Cisco, F5 and Juniper, who developed their own ASICs and proprietary operating systems (Cisco IOS, for example) and built that technology into base stations, routers and Ethernet switches, all optimized for their use. The proprietary nature of all of this translates into very expensive systems and slower development cycles.

Fast forward to today's NFV initiative, which is spearheaded by several of the major telecommunications service providers. The value of NFV is in creating a standards-based approach to virtualizing key telecom applications, radically changing the way telecom networks are built and managed. By doing this, NFV enables those applications to run on industry-standard servers, which translates into big cost savings and more flexibility than was previously possible.

What has made NFV suitable for use with commercial off-the-shelf (COTS) equipment are advances in the underlying technology, including SDN, faster fabrics (40Gb Ethernet) and more powerful processors.

NFV can be implemented without SDN, although the two solutions can work together. NFV can support SDN by providing the infrastructure on which the SDN software runs. Both technologies share a common objective: running on lower-cost COTS servers and switches.

Source: ETSI white paper, "Network Functions Virtualization: An Introduction, Benefits, Enablers, Challenges & Call for Action," October 2012 (etsi.org).
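
To make the NFV idea concrete, here is a toy sketch of a network function implemented purely in software so it can run on a commodity (COTS) server rather than a purpose-built appliance. This is not from the article or the ETSI paper; the addresses, ports and the simple UDP-forwarder behavior are hypothetical stand-ins for a real virtualized function such as a firewall or load balancer.

```python
# Toy "virtualized network function": a UDP forwarder running on a standard server.
# Addresses and ports are made up; a real VNF would inspect, filter or transform traffic.
import socket

LISTEN_ADDR = ("0.0.0.0", 9000)       # where this software function receives traffic
UPSTREAM_ADDR = ("10.0.0.10", 9000)   # hypothetical next hop


def run_udp_forwarder():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    while True:
        data, src = sock.recvfrom(65535)
        # A real VNF would apply its function here (filtering, NAT, load balancing, ...).
        sock.sendto(data, UPSTREAM_ADDR)


if __name__ == "__main__":
    run_udp_forwarder()
```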

The OpenCompute Project (OCP)
The OCP is a Facebook-led initiative to build computing infrastructure that is energy efficient, easily scalable and low cost. The initiative was born out of the design and build of Facebook's massive data center in Prineville, Oregon. Following in the footsteps of open source software, the OpenCompute designs are open, shared and available for all to use. The OCP includes software, servers, storage, networking, and data center designs. The OCP claims that by using its open hardware designs, the Facebook Prineville data center delivered 38 percent better energy efficiency and was 24 percent less expensive to build and run than comparable state-of-the-art data centers built with proprietary components. Pretty compelling stuff.

As you can see, there are recurring themes spanning all the aforementioned technologies. In case you missed them: low-cost, energy efficient, non-proprietary, open, scalable, flexible, and agile. Even if you are not looking at redesigning your data center now, you may need to in order to stay competitive.

No matter what technology you choose to deploy, one thing is for sure: the cloud is stressing I/O, and I/O bottlenecks will shift from where they are today. The further data gets from where it is processed, the more latency becomes a challenge. To plan for the barrage of new technology coming your way, look for technologies that reduce latency, such as RDMA over Converged Ethernet (RoCE). Also seek out solutions that enable flexible use of resources and that don't lock you into long-term commitments such as specialized appliances, infrastructure and proprietary software, so that you are in a better position to take advantage of new innovations as they become available. Now strap yourself in and get ready for the ride.

Reference:

1. Open Networking Foundation: "Software-Defined Networking: The New Norm for Networks," April 13, 2012. Retrieved August 22, 2013.

More Stories By Barbara Porter

Barbara Porter is Senior Product Marketing Manager at Emulex. She has been with Emulex since 2009, bringing more than 15 years of experience to the company. Prior to Emulex, she was product line manager at Quantum, and software marketing manager at MSC Software. Barbara holds a Bachelor of Commerce degree from Griffith University in Australia.
