

How You Integrate Network Services Matters

This post is cross-posted at https://blogs.cisco.com/datacenter/how-you-integrate-network-services-matters

Polymorphism is a concept central to object-oriented programming. It is used to extend the capabilities of a basic object, like a mammal, into specific implementations, like cats or dogs or honey badgers (even though honey badgers don't care about such technical distinctions). Cats and dogs, for example, are both of type "mammal," yet each "speaks" in a different voice.
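
For the programmers in the room, a minimal Python sketch of the idea might look like this (illustrative only; the class names come straight from the analogy, not from any product):

```python
# Polymorphism in miniature: one base type, several implementations,
# each "speaking" in its own voice.
from abc import ABC, abstractmethod


class Mammal(ABC):
    @abstractmethod
    def speak(self) -> str:
        """Every mammal can speak; how it sounds is up to the subclass."""


class Dog(Mammal):
    def speak(self) -> str:
        return "Woof"


class Cat(Mammal):
    def speak(self) -> str:
        return "Meow"


class HoneyBadger(Mammal):
    def speak(self) -> str:
        return "Don't care"  # honey badgers famously don't


# Calling code works against the base type and never needs to know the species.
for animal in (Dog(), Cat(), HoneyBadger()):
    print(animal.speak())
```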

This becomes important as we consider the way in which the Cisco Application Policy Infrastructure Controller (APIC) extends automation across the network, particularly to the application layers (L4-7). APIC deviates from traditional protocol-based methods in order to facilitate and automate service insertion in a common way without limiting the robust capabilities of best-of-breed solutions. In other words, it doesn't require all mammals to speak in the same voice.


Traditional protocol-based methods rely on a common data model. A TCP packet, for example, contains a specific set of headers that describe a variety of options and characteristics of the flow. The format is prescribed by RFCs, and no deviation is allowed. Network integration has generally followed this model, and you can see the results in a variety of ongoing efforts to provide orchestration and automation across the network. All devices are treated like mammals. There are no dogs, there are no cats, and there are certainly no honey badgers. The result is a commoditized set of network capabilities that allows neither differentiation in services nor the per-application attention required to address application-specific challenges in security, performance and scalability.
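
To see why that commoditization happens, consider a hypothetical sketch (the record and its fields are invented for illustration) in which every L4-7 service must be squeezed into the same fixed model:

```python
# A rigid, lowest-common-denominator integration model: every service,
# regardless of vendor, is reduced to the same fixed set of fields.
from dataclasses import dataclass


@dataclass
class L4Service:
    virtual_ip: str
    port: int
    protocol: str  # e.g. "tcp" or "udp"


def provision(service: L4Service) -> None:
    # The orchestrator can only express what the common model allows;
    # vendor-specific security, acceleration or scaling features have nowhere to go.
    print(f"provision {service.protocol}://{service.virtual_ip}:{service.port}")


provision(L4Service(virtual_ip="192.0.2.10", port=443, protocol="tcp"))
```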

Which brings us back to Cisco APIC and its Application Centric Infrastructure (ACI) approach, which lets honey badgers be honey badgers and cats be cats while still both being mammals.

The Cisco ACI approach is very object-oriented. Its integration model requires the existence of a set of functions, but in no way prescribes how those functions behave. This means that a variety of solutions in the same market can all integrate with Cisco APIC without losing any capabilities that go above and beyond the lowest common denominator. Because the integration is dynamic - device packages can be loaded at any time - integrations can continue to be developed that offer even greater flexibility and choice for customers. In other words, you aren't stuck with just cats, dogs or honey badgers. You can also bring in guinea pigs, rabbits and horses, as long as they're mammals and each implements the basic set of functions required of a device package deployable on Cisco APIC.
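
A hypothetical sketch of that object-oriented contract might look like the following - the function names are invented for illustration and are not the actual Cisco APIC device package API:

```python
# Each "device package" must expose the same set of functions, but nothing
# prescribes how a given vendor implements them, so differentiated
# capabilities are not lost. (Interface invented for illustration.)
from abc import ABC, abstractmethod


class DevicePackage(ABC):
    @abstractmethod
    def attach(self, app_name: str) -> None:
        """Insert this service into the traffic path for an application."""

    @abstractmethod
    def configure(self, params: dict) -> None:
        """Apply vendor-specific configuration for the application."""


class BasicLoadBalancer(DevicePackage):
    def attach(self, app_name: str) -> None:
        print(f"wire load balancer into {app_name}")

    def configure(self, params: dict) -> None:
        print(f"round-robin pool: {params.get('members', [])}")


class AdvancedADC(DevicePackage):
    """Same contract, but free to add capabilities beyond the common minimum."""

    def attach(self, app_name: str) -> None:
        print(f"wire ADC into {app_name}")

    def configure(self, params: dict) -> None:
        # Differentiated features survive because the contract doesn't forbid them.
        print(f"WAF policy {params.get('waf_policy')}, TLS offload, per-app scaling")


def insert_service(pkg: DevicePackage, app_name: str, params: dict) -> None:
    # The controller treats every package as the same "mammal"...
    pkg.attach(app_name)
    pkg.configure(params)  # ...while each one still speaks in its own voice.


insert_service(AdvancedADC(), "web-app", {"waf_policy": "owasp-top10"})
```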

It is that extensibility that has enabled F5 to continue to expand the choices available for integrating the automation of L4-7 application service insertion with Cisco APIC. Initially our focus was on direct integration with BIG-IP, providing the means by which prescriptive provisioning and configuration of services was easily accomplished. But the reality is that the applications driving the application economy are not one-size-fits-all. An approach that enables more specific, per-application service provisioning is necessary to operationalize app deployments and relieve the increasing pressure faced by 9 out of 10 executives to release apps more quickly (CA and Vanson Bourne, Global Application Economy Study 2014).

Because of the approach Cisco has taken to enabling that provisioning via Cisco APIC, F5 Networks is able to provide another integration path through its orchestration and management solution, BIG-IQ. This new integration option uses per-app service templates, iApps, to ensure not only rapid deployment but custom and consistent configuration. Consistency is necessary for maintaining stability in an infrastructure ultimately responsible for delivering the hundreds of applications supported by the average enterprise, and it must be balanced against the need for faster, more frequent deployments. Customization is required by the very concept of application-centricity, as no two applications are alike in the services, and the characteristics of those services, required to meet business and customer expectations.

This application-focused approach to provisioning allows network and application operators alike to codify per-application service requirements along with common policy, such as base security, using an app template approach. These templates then become the core of a custom device package that can be loaded and executed via Cisco APIC, resulting in rapid, consistent deployment of the app services critical to ensuring the performance, security and scalability of the applications driving the application economy.
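
A hypothetical sketch of the per-app template idea (not actual iApp syntax; the policy names and parameters are invented for illustration) shows how common policy and app-specific settings combine into one consistent, customized deployment:

```python
# Common base policy applied to every application, merged with per-app
# overrides so each deployment is both consistent and customized.
BASE_POLICY = {
    "tls": "required",
    "security_profile": "baseline-waf",
    "health_monitor": "http",
}


def render_service_template(app_name: str, overrides: dict) -> dict:
    """Merge common policy with app-specific settings into one deployable config."""
    return {"app": app_name, **BASE_POLICY, **overrides}


# Two apps share the same base policy but keep their own characteristics.
print(render_service_template("intranet-portal", {"persistence": "cookie"}))
print(render_service_template("mobile-api", {"rate_limit": "1000rps", "tls": "mutual"}))
```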

We are as excited today about the introduction of our BIG-IQ integration with Cisco APIC as we were about our BIG-IP integration. We're particularly pleased with Cisco's model of integration precisely because it enables us to continue to protect our customers' investment in the technologies and capabilities that go above and beyond the basics when it comes to delivering application services.

We also have workshops for you to attend in case you happen to be in Toronto March 24, Montreal March 25 or Ottawa March 26. Register now for the Cisco and F5 Synthesis Workshop: Accelerating Application Deployments.

Related Links

www.cisco.com/go/aci

www.cisco.com/go/acif5

https://f5.com/solutions/enterprise/reference-architectures/cisco-ac


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
