


@DevOpsSummit: Blog Feed Post

The Intersection of Security and SDN

If you cross log analysis with infrastructure integration you get some interesting security capabilities

A lot of security-minded folks immediately pack up their bags and go home when you start talking about automating anything in the security infrastructure. Automating changes to data center firewalls, for example, seems to elicit a reaction not unlike the one you'd get by suggesting we put an unpatched Windows machine directly on the public Internet.

At RSA yesterday I happened to see a variety of booths with a focus on... logs. That isn't surprising, as log analysis is used across the data center and across domains for a variety of reasons. It's one of the ways databases are replicated, it's part of compiling access audit reports, and it's absolutely one of the ways in which intrusion attempts can be detected.

And that's cool. Log analysis for intrusion detection is a good thing. But what if it could be better?

What if we started considering operationalizing the process of acting on events raised by log analysis?

One of the promises of SDN is agility through programmability. The idea is that because the data path is "programmable" it can be modified at any time by the control plane using an API. In this way, SDN-enabled architectures can respond in real time to conditions on the network impacting applications. Usually this focuses on performance, but there's no reason it couldn't be applied to security, as well.

If you're using a log analysis tool capable of performing said analysis in near real-time, and that analysis surfaces suspicious activity, there's no reason it couldn't inform a controller of some kind on the network, which in turn could easily decide to enable infrastructure capabilities across the network. Perhaps to start capturing the flow, or to inject a more advanced inspection service (malware detection, perhaps) into the service chain for the application.
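That loop can be sketched in a few lines. This is a minimal, hypothetical sketch, not any vendor's API: the suspicious patterns, threshold, and controller call are all assumptions standing in for a real log-analysis engine and a real controller client.

```python
import re

# Hypothetical patterns; a real deployment would rely on a proper
# log-analysis engine rather than a handful of regexes.
SUSPICIOUS = [
    re.compile(r"Failed password for"),
    re.compile(r"SQL syntax.*error"),
]

def analyze(lines, threshold=3):
    """Return True once enough suspicious lines accumulate to act on."""
    hits = sum(1 for line in lines if any(p.search(line) for p in SUSPICIOUS))
    return hits >= threshold

def steer_flow(controller_notify, src_ip):
    """Ask the (hypothetical) SDN controller to steer this source's
    traffic through an additional inspection service."""
    # controller_notify stands in for whatever client wraps the
    # controller's API -- e.g. an HTTP POST in a real deployment.
    return controller_notify({"action": "insert-service",
                              "service": "deep-inspection",
                              "match": {"src_ip": src_ip}})

# Example: repeated failed logins from one host trip the threshold,
# and the "controller" (here, just a list) records the requested change.
log = ["Failed password for root from 203.0.113.9"] * 5
actions = []
if analyze(log):
    steer_flow(actions.append, "203.0.113.9")
```

The point isn't the detection logic, which is deliberately trivial; it's that the output of analysis becomes an API call rather than a ticket in someone's queue.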

In the service provider world, it's well understood that the requirement in traditional architectures to force flows through all services is inefficient. It increases the cost of the service and requires scaling every single service along with subscriber growth. Service providers are turning to service chaining and traffic steering as a means to more efficiently use only those services that are applicable, rather than the entire chain.

While enterprise organizations for the most part aren't going to adopt service provider architectures, they can learn from them the value inherent in more dynamic network and service topologies. Does every request and response need to go through every security service? Or are some only truly needed for deep inspection?
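The idea of invoking only the applicable services can be illustrated with a tiny chain-selection policy. Everything here is hypothetical for illustration, including the service names and flow attributes:

```python
# All the services a traditional architecture forces every flow through.
FULL_CHAIN = ["firewall", "ids", "waf", "malware-scan", "dlp"]

def build_chain(flow):
    """Select only the services applicable to this flow."""
    chain = ["firewall"]                  # everything still gets a firewall
    if flow.get("protocol") == "http":
        chain.append("waf")               # web traffic gets the WAF
    if flow.get("suspicious"):
        chain += ["ids", "malware-scan"]  # steered in only on demand
    if flow.get("data_class") == "sensitive":
        chain.append("dlp")
    return chain
```

A routine, benign HTTP request traverses two services instead of five; the expensive inspection services scale with suspicious traffic rather than with subscriber growth.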

It's about intelligence and integration. Real time analysis on what is traditionally data at rest (logs) can net actionable data if infrastructure is API-enabled. It's taking the notion of scalability domains to a more dynamic level by not only ensuring scale of services individually to reduce costs but further to improve performance and efficiency by only consuming resources when necessary, instead of all the time. The key is being able to determine when it's necessary and when it isn't.

More reading on infrastructure architecture patterns supporting scalability domains

In a service provider world, that's based on subscriber and traffic type. In the enterprise it's more behavioral: what someone is trying to do, and with what application or data.

But in the end, both environments need to be dynamic with policy enforcement and service invocation based on the unique combination of devices, networks and applications and enabled by the increasing prevalence of API-enabled infrastructure.
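One way to read "the unique combination of devices, networks and applications" is as a policy lookup keyed on exactly that tuple. A hypothetical sketch, with the table entries and posture names invented for illustration:

```python
# Hypothetical policy table keyed on (device, network, application).
POLICY = {
    ("managed-laptop", "corp-lan",   "crm"): "standard-chain",
    ("byod-phone",     "guest-wifi", "crm"): "deep-inspection",
}

def enforcement(device, network, app, default="deep-inspection"):
    """Resolve the service posture for a flow. Unknown combinations
    fall back to the stricter posture rather than a free pass."""
    return POLICY.get((device, network, app), default)
```

The table itself is the policy; the API-enabled infrastructure is what lets the result of the lookup actually change the path a flow takes.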

SDN is going to propel networks forward not just as a cost-savings vehicle for operations, but as part of the technology that ultimately unlocks the software-defined data center. And that includes security.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
