Network Automation or Automated Networks?

There is a difference... and it significantly impacts your strategy.

Do You Want SDN for Network Automation or Automated Networks?

When SDN made its mainstream debut at Interop in 2012, there was quite a bit of excitement, tempered by the reality (apparent to some folks) that technical limitations would impact its applicability above layers 2-3 and perhaps even at layers 2-3, depending on the network.

But even then, with all the hubbub over OpenFlow and the commoditization of "the network," there were some of us who saw benefits in what SDN was trying to do around network automation. The general theory, of course, was that the tight coupling between control and data planes was impeding the ability of networks to scale efficiently. Which is absolutely true.

But while an OpenFlow-style SDN focuses on laying the foundation for an automated network - one that can, based on pre-defined and well-understood rules, reroute traffic to meet application demands - the other, more operationally focused style seeks to enable network automation. The former focuses on speeds and feeds of packets, the latter on speeds and feeds of process.
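To make that contrast concrete, here is a minimal, hypothetical sketch of the kind of pre-defined reroute rule an OpenFlow-style controller might install. The match/action structure mirrors OpenFlow concepts, but the field names and helper function are illustrative only, not any particular controller's API.

```python
# Hypothetical sketch: a pre-defined rule an OpenFlow-style controller might
# push to reroute traffic when an application-facing link degrades.
# The structure mirrors OpenFlow match/action concepts; the names are
# illustrative, not a specific controller's API.

PRIMARY_PORT = 1
BACKUP_PORT = 2

def reroute_rule(dst_subnet: str, degraded: bool) -> dict:
    """Build a flow entry that steers traffic destined for dst_subnet."""
    out_port = BACKUP_PORT if degraded else PRIMARY_PORT
    return {
        "priority": 100,
        "match": {"eth_type": 0x0800, "ipv4_dst": dst_subnet},  # IPv4 traffic to the app
        "instructions": [{"apply_actions": [{"output": out_port}]}],
    }

if __name__ == "__main__":
    # The controller evaluates well-understood conditions and installs the rule.
    print(reroute_rule("10.0.20.0/24", degraded=True))
```

The intelligence here lives in packet-level rules: the network reacts on its own, but only within conditions someone already programmed.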

The complexity inherent in a device-by-device configuration model means that service velocity on the operational side of the network is simply not up to snuff; it cannot meet the demand for applications that the next generation of technologies will create. Basically, provisioning services in a traditional network model is slow, prone to error, and not at all focused on the application.

Network automation, through the use of open, standards-based APIs and other programmability features, enables automated provisioning of network services. An SDN strategy aimed at network automation addresses the operational factors that cause network downtime, increase costs, and generally suck up the days (and sometimes nights) of highly skilled engineers.
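As a rough illustration of that operational model, the sketch below provisions a load-balanced service with a single API call. The endpoint, payload schema, and credentials are hypothetical placeholders, not a specific product's interface.

```python
# Hypothetical sketch of API-driven service provisioning: the endpoint,
# payload schema, and token are placeholders, not a specific vendor's API.
import requests

CONTROLLER = "https://netops.example.com/api/v1"

def provision_vip(name: str, address: str, pool_members: list[str]) -> None:
    """Provision a load-balanced virtual service in one repeatable API call
    instead of a device-by-device CLI session."""
    payload = {
        "name": name,
        "virtual_address": address,
        "pool": [{"member": m, "port": 443} for m in pool_members],
        "monitor": "https",
    }
    resp = requests.post(
        f"{CONTROLLER}/services",
        json=payload,
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    provision_vip("app1-vip", "203.0.113.10", ["10.0.20.11", "10.0.20.12"])
```

Because the call is declarative and repeatable, it can be wired into an orchestration or CI/CD workflow rather than typed into each device by hand.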

Also complicating the matter is the reality that an OpenFlow-style SDN simply cannot scale to handle stateful network services: stateful firewalls, application load balancing, web application firewalls, remote access, identity, web performance optimization, and more. These services reside in the data path by necessity, because they must have complete visibility into the data to perform their given functions.

While an architectural model that addresses both stateless and stateful services can be realized through service chaining, the complexity created by doing so may be just as significant as the complexity that was being addressed in the first place.
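A small, hypothetical sketch of why service chaining gets complicated: each stateful service in the chain imposes its own steering and flow-pinning constraints, and the controller has to honor all of them at once. The class and function names here are illustrative only.

```python
# Hypothetical sketch of a service chain definition: traffic is steered through
# an ordered set of services, each flagged by whether it must hold per-flow
# state (and therefore sit in the data path). Names are illustrative.
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    stateful: bool  # stateful services must see every packet of a flow

CHAIN = [
    Service("stateful-firewall", stateful=True),
    Service("web-app-firewall", stateful=True),
    Service("load-balancer", stateful=True),
    Service("simple-forwarder", stateful=False),
]

def steering_constraints(chain: list[Service]) -> list[str]:
    """Every stateful hop adds a pinning constraint the controller must honor,
    which is where much of the operational complexity comes from."""
    return [f"pin all packets of a flow through {s.name} to one instance"
            for s in chain if s.stateful]

if __name__ == "__main__":
    for constraint in steering_constraints(CHAIN):
        print(constraint)
```

Add more stateful hops and the constraint set (and its failure modes) grows with the chain, which is the trade-off described above.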

So the question is, what do you really want out of SDN? What's the goal and how are you going to measure success?

The answer should give you a clearer idea of which SDN strategy you should be considering.

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
