Chicken or Egg: Abstraction and Automation

But will abstraction necessarily lead to automation?

For many SDN and DevOps enthusiasts, the natural outcome of this wave of technological change is a highly automated network that is well orchestrated with surrounding systems and applications. One of the prevailing thoughts is that the key to this level of automation is a well-formed abstraction layer. With the abstractions in place, the army of network engineers will be unencumbered by device configuration, and automation will ensue.

Or will it?

First off, let me say that abstraction is absolutely necessary. There is no doubt that networking will only advance if we can both remove unnecessary elements and simplify those that remain. We have to accomplish this in a way that is vendor (and ideally technology) agnostic. Abstraction is clearly the path forward.

But will abstraction necessarily lead to automation?
For the vast majority of network engineers who are designing and actively managing networks today, automation means writing shell or Perl scripts. The scripts themselves qualify as automation insofar as they remove keystrokes, but they basically execute the same serial logic that has dominated networking devices for decades.

When you want to make a switch or a router do something, you specify some configuration. Then you specify some other configuration. And again and again until you get through the litany of parameters that collectively make the device work. This workflow is so ingrained in our collective psyche that we inherently serialize the tasks required to make the network work.
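
As a rough illustration of that serialized workflow, here is a minimal sketch of keystroke-style automation: a script that pushes configuration lines to a device one after another, in exactly the order an engineer would have typed them. The device name, the commands, and the send_line helper are all hypothetical stand-ins, not any particular vendor's API.

```python
# A minimal sketch of serialized, keystroke-style automation.
# Everything here is hypothetical: the device, the commands, and
# send_line() are illustrative stand-ins, not a real vendor API.

COMMANDS = [
    "interface GigabitEthernet0/1",
    "  description uplink-to-core",
    "  ip address 10.0.0.1 255.255.255.252",
    "  no shutdown",
    "router ospf 1",
    "  network 10.0.0.0 0.0.0.3 area 0",
]

def send_line(device: str, line: str) -> None:
    """Pretend to push one configuration line to a device."""
    print(f"{device} <- {line}")

def configure(device: str) -> None:
    # The logic is purely sequential: one parameter after another,
    # mirroring the serialized workflow described above.
    for line in COMMANDS:
        send_line(device, line)

if __name__ == "__main__":
    configure("edge-router-1")
```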

There have actually been companies that have done a decent job of breaking the habit of serialized configuration. Juniper's flagship operating system, Junos, moved to a more code-like representation of configuration, making no assumptions about the ordering of specific tasks. But our training runs deep, and even the Juniper guys will tell you that the biggest barrier to entry is familiarity with the UI.

We are addicted to our serialized behavior.
One of the side effects of highly serialized configuration is that we tend to think extremely linearly and transactionally. There are a lot of network engineers for whom any kind of object-oriented approach is almost too foreign to really embrace. So when they try to automate tasks, they fall back into a sequencing of steps, repeated as many times as necessary. Automation without reuse is painfully difficult to propagate beyond the most repetitive tasks.
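
To make the contrast concrete, here is a small, hypothetical sketch of what reuse might look like: the intent is captured once in an object and then applied to any number of devices, instead of copy-pasting the same serialized steps into every script. The class and method names are illustrative only, not a real framework.

```python
# A hypothetical sketch of reuse: capture the intent once as an
# object, then apply it anywhere it is needed, rather than repeating
# the same serialized steps per device.

from dataclasses import dataclass

@dataclass
class VlanPolicy:
    vlan_id: int
    name: str

    def render(self) -> list[str]:
        """Turn the intent into vendor-style config lines (illustrative)."""
        return [
            f"vlan {self.vlan_id}",
            f"  name {self.name}",
        ]

def apply(policy: VlanPolicy, devices: list[str]) -> None:
    # One reusable building block applied across many devices.
    for device in devices:
        for line in policy.render():
            print(f"{device} <- {line}")

if __name__ == "__main__":
    web_vlan = VlanPolicy(vlan_id=110, name="web-tier")
    apply(web_vlan, ["access-switch-1", "access-switch-2", "access-switch-3"])
```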

And so we end up in a scenario where automation is basically synonymous with scripting, and where the value is largely applied only to the most frequently-executed tasks.

Where could automation take us?
If we think through where automation could take us, we should ideally aim a little higher. Automation could mean the automated exchange of data between collaborative systems in support of some task. For instance, you might want your servers to communicate with your network so that when a new application is spun up (or a VM moves), you get corresponding policy changes, firewall or load balancer changes, and potentially network capacity allocation.
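
As a thought experiment, an event-driven sketch of that kind of exchange might look like the following. Every event name and system call here is a hypothetical placeholder for whatever APIs the orchestration, firewall, and load-balancer systems actually expose; the point is that one compute event fans out into the corresponding network changes automatically.

```python
# A hypothetical sketch of systems exchanging data automatically:
# a compute event (new VM or VM move) triggers corresponding network,
# firewall, and load-balancer changes. All names are placeholders.

def update_network_policy(vm: dict) -> None:
    print(f"network: allocate capacity for {vm['name']} on segment {vm['segment']}")

def update_firewall(vm: dict) -> None:
    print(f"firewall: permit {vm['app_port']}/tcp to {vm['ip']}")

def update_load_balancer(vm: dict) -> None:
    print(f"load balancer: add {vm['ip']}:{vm['app_port']} to pool {vm['pool']}")

def on_vm_event(event: str, vm: dict) -> None:
    # Multilateral communication: each system reacts to the same
    # piece of data, rather than waiting for a keystroke sequence.
    if event in ("vm-created", "vm-moved"):
        update_network_policy(vm)
        update_firewall(vm)
        update_load_balancer(vm)

if __name__ == "__main__":
    on_vm_event("vm-created", {
        "name": "web-01", "ip": "10.1.20.15", "segment": "web-tier",
        "app_port": 443, "pool": "web-pool",
    })
```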

For most network engineers, the idea that infrastructure communicates and dynamically provides a service is science fiction. Our serialized mode of operation simply doesn’t support this kind of multilateral communication. Even if the abstractions remove some of the configuration complexity, the mental block is around sequencing.

If the current networking model has taught us anything, it should be that our network engineers are quite capable of managing tons of inputs and outputs. Now, whether that ought to be a requirement for the job is another question entirely. But as a group, network engineers are certainly capable of handling a lot of variables. The fact that abstraction reduces these variables to the most meaningful ones is very interesting, but input management doesn't seem to be the biggest bottleneck to automation.

The barrier to automation

Rather, the biggest barrier to automation is that workflow is so structured. First, it was the devices themselves that forced the structure. Then it was the processes (ITIL anyone?) that forced it. The end result is that we have built a discipline so dependent on structure that it actually impedes our own progress.

If we want to get to automation, we need to find a way to work around—or perhaps work within—this structure.

What we are really talking about is changing how we think about provisioning and managing a network. Why do you think there is so much angst when people talk about network engineers needing to learn to code? It’s because moving from a serialized set of steps to an object-oriented way of thinking about the problem is extremely difficult.

People aren’t pushing back because learning a new language is hard. Or at least they shouldn’t be. Look at any networking device configuration and tell me that you aren’t already a master coder. The biggest difference is that you are using an interpreted language called Cisco CLI (or Junos CLI or whatever CLI).

What next?
What we need to do is bite off small (dare I say tiny) workflow elements, automate those, and then string them together into larger workflows. This implies a couple of things. First, we need to think less about discrete capabilities and more about how they exist within some broader workflow context. Second, we need to understand how these building blocks fit together. It’s the connecting of individual workflows that forms the basis for automation, and those connections highlight the pieces of information that flow across workflow boundaries.
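
Here is a minimal sketch of that composition idea, with made-up workflow steps: each step is tiny and reusable on its own, and the data handed from one step to the next is exactly the kind of information that crosses workflow boundaries. The step names and data fields are illustrative assumptions, not a prescribed model.

```python
# A hypothetical sketch of composing tiny workflow elements into a
# larger workflow. The steps and data fields are illustrative; what
# matters is the data handed across each workflow boundary.

def allocate_vlan(app: str) -> dict:
    return {"app": app, "vlan": 120}

def reserve_addresses(ctx: dict) -> dict:
    ctx["subnet"] = "10.1.120.0/24"
    return ctx

def push_switch_config(ctx: dict) -> dict:
    print(f"configure vlan {ctx['vlan']} ({ctx['subnet']}) for {ctx['app']}")
    ctx["provisioned"] = True
    return ctx

def run_workflow(app: str) -> dict:
    # Stringing the small elements together; the dict flowing through
    # is the information that must survive each boundary.
    ctx = allocate_vlan(app)
    ctx = reserve_addresses(ctx)
    ctx = push_switch_config(ctx)
    return ctx

if __name__ == "__main__":
    result = run_workflow("billing-app")
    print(result)
```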

More bluntly, the data that stitches together workflows ends up being the stuff that needs to be in an abstraction. It very well could be that getting the automation parts right will help us get to better abstractions.

Obviously we have to work the process from both ends – abstraction down, and workflow up. I don’t think it’s as simple as one or the other, which is why abstraction and automation might be a networking incarnation of the age-old chicken-and-egg question.

[Today’s fun fact: A parrot’s vocabulary is generally no more than twenty words. Who knew parrots and politicians had so much in common?]

The post Chicken or egg: Abstraction and automation appeared first on Plexxi.

More Stories By Michael Bushong

The best marketing efforts combine deep technology understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing Michael Bushong has acquired these skills having spent 12 years at Juniper Networks, where he led product management, product strategy and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent the last several years at Juniper leading their SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase, and ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
