Changing the Way We Configure and Provision Our Networks

In enterprises we have never really made a big distinction between configuration and provisioning

Some people believe good or bad things always happen in threes. I believe you will always be able to find three (and probably more) things that are good or bad and somewhat related, but sometimes I am surprised by the apparently coincidental appearance of several closely related "things". Last week, the folks at networkheresy.com posted a second installment of their "policy in the datacenter" discussion, Cisco announced the acquisition of tail-f, and inside Plexxi we had several intense architectural discussions around Configuration, Provisioning and Policy management. Maybe we can declare June CP&P month for networking.

It is mostly accepted that configuration deals with the deployment of devices and applications within an infrastructure. For network devices, it covers creating the fabric, the protocols that maintain that fabric, access to and control of the device itself, management connectivity, etc. Once a network device is configured, it is a functioning element in a network.
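
To make that concrete, here is a minimal sketch, in Python and with purely illustrative field names (not any vendor's schema), of the kind of data configuration covers:

```python
# Hypothetical sketch of device-level configuration: the pieces that make a
# switch a functioning member of the network, independent of any
# customer-facing service. All names and values are illustrative only.
device_configuration = {
    "hostname": "leaf-01",
    "management": {                      # management connectivity and device access
        "ip": "10.0.0.11/24",
        "ssh_enabled": True,
        "admin_subnets": ["10.0.0.0/24"],
    },
    "fabric": {                          # what builds and maintains the fabric
        "protocol": "ospf",
        "router_id": "10.0.0.11",
        "uplinks": ["et-0/0/48", "et-0/0/49"],
    },
}
```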

Provisioning is more of a telco term, focused on creating the customer-facing end of a device or application. For network devices, this would cover the ports facing customers or end devices, the required edge protocols, VLANs, IP subnets, etc.
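
A matching sketch of provisioning data for the same hypothetical switch, again with invented field names, might look like this:

```python
# Hypothetical sketch of provisioning data: the customer- and end-device-facing
# side of the same switch. Port names, VLANs and subnets are made up.
port_provisioning = {
    "et-0/0/1": {
        "description": "customer A, server rack 12",
        "vlan": 110,
        "ip_subnet": "192.168.110.0/24",
        "edge_protocol": "lldp",         # edge protocol required on the port
    },
    "et-0/0/2": {
        "description": "customer B, hypervisor host",
        "vlan": 220,
        "ip_subnet": "192.168.220.0/24",
        "edge_protocol": "lldp",
    },
}
```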

Lastly, policy defines a level of communication service that is created across the configured infrastructure through the attached provisioned interfaces. It defines which communication is and is not allowed, and with what specific services and service levels.
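
And policy, in the same illustrative notation (not a real policy language), describes who may talk to whom and at what service level:

```python
# Hypothetical sketch of policy entries: which communication is allowed between
# provisioned endpoints, and with what service levels. Groups, ports and
# numbers are invented for illustration.
policies = [
    {
        "name": "web-to-db",
        "allow": True,
        "source_group": "web-tier",       # groups of provisioned endpoints
        "destination_group": "db-tier",
        "service": {"protocol": "tcp", "port": 5432},
        "service_level": {"max_latency_ms": 2, "min_bandwidth_mbps": 500},
    },
    {
        "name": "default-deny",
        "allow": False,
        "source_group": "any",
        "destination_group": "db-tier",
        "service": "any",
        "service_level": None,
    },
]
```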

In enterprises we have never really made a big distinction between configuration and provisioning. But with the evolution to a virtualized infrastructure and more rapid client-facing changes as a result of VM creation and movement, I believe the two have enough differences that it makes sense to adopt this separation more broadly.

I catch myself using them interchangeably at times, but there are very distinct differences between them, even if all of them end up as a set of instructions to a switch or set of switches, physical or virtual. The types of instructions are different for each type, the folks responsible for their functionality are different, and the rate of change is different.

More importantly though, the mechanism by which we instruct our network components is rapidly changing. The complexity of the configuration and policy components, and the sheer volume and rate of change of the provisioning component, are driving more automated methods of telling our network elements what to do. In most of our world we adopted centralized, database-driven CP&P systems long ago.

Except for networking. The very large majority of instructions for network devices is still hand crafted, or script-assisted hand crafted. The entire instruction set for these devices lives on the switches themselves, and we have then created systems that capture and archive it for backup purposes and forensics. When something goes wrong with a device, we go find the latest backup we have and attempt to restore the service.

But this is all changing. Finally. Newer network solutions use centralized systems with real databases behind them to hold the information that needs to be stored, shared, backed up, checkpointed, logged, replicated, and all the other good stuff real databases solved many years ago.
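
As a sketch of what a database-backed approach buys over archived flat files, assuming nothing beyond Python's standard library (the table layout and method names are hypothetical), a minimal versioned CP&P store might look like this:

```python
import json
import sqlite3
from datetime import datetime, timezone

class CPPStore:
    """Minimal sketch of a centralized, versioned store for configuration,
    provisioning and policy (CP&P) documents. A real controller would add
    replication, authentication and richer auditing; this only shows the
    shape of the idea."""

    def __init__(self, path="cpp.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            """CREATE TABLE IF NOT EXISTS revisions (
                   id INTEGER PRIMARY KEY AUTOINCREMENT,
                   element TEXT NOT NULL,      -- which switch or group of switches
                   kind TEXT NOT NULL,         -- 'configuration', 'provisioning' or 'policy'
                   body TEXT NOT NULL,         -- the document itself, as JSON
                   author TEXT NOT NULL,
                   created_at TEXT NOT NULL
               )"""
        )

    def commit(self, element, kind, body, author):
        """Record a new revision instead of overwriting the previous one."""
        self.db.execute(
            "INSERT INTO revisions (element, kind, body, author, created_at) "
            "VALUES (?, ?, ?, ?, ?)",
            (element, kind, json.dumps(body), author,
             datetime.now(timezone.utc).isoformat()),
        )
        self.db.commit()

    def latest(self, element, kind):
        """Return the most recent revision for an element, e.g. for a restore."""
        row = self.db.execute(
            "SELECT body FROM revisions WHERE element = ? AND kind = ? "
            "ORDER BY id DESC LIMIT 1",
            (element, kind),
        ).fetchone()
        return json.loads(row[0]) if row else None

store = CPPStore()
store.commit("leaf-01", "configuration", {"fabric": {"protocol": "ospf"}}, "marten")
print(store.latest("leaf-01", "configuration"))
```

Every change becomes a new revision with an author and a timestamp, so audit, rollback and replication come from the database rather than from scraping device configs after the fact.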

The biggest hump we need to get past is one of control. Regardless of where or how CP&P data is stored, the more important question is who controls the data. Even if this data is stored on a virtual or physical switch, that same switch should no longer be the master of that information. Portions of CP&P information have network-wide meaning and should be specified in a network-wide manner. Today we manually construct network-wide provisioning or policy semantics. We have to get to a point where we define these in network-wide terms and let our tools worry about what that means for the individual network elements that create the service.
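
One way to picture "define it in network-wide terms and let the tools worry about the elements" is a small compilation step. The inventory and rule format below are invented for illustration, not any controller's actual mechanism:

```python
# A network-wide statement is expanded into per-switch instructions by a
# central tool, so no individual switch is the master of the data.
inventory = {
    # switch -> which provisioned groups live behind which ports (hypothetical)
    "leaf-01": {"web-tier": ["et-0/0/1"], "db-tier": []},
    "leaf-02": {"web-tier": [], "db-tier": ["et-0/0/7"]},
}

network_wide_policy = {"allow": ("web-tier", "db-tier"), "tcp_port": 5432}

def compile_policy(policy, inventory):
    """Expand one network-wide policy into rules for only the elements that need them."""
    src_group, dst_group = policy["allow"]
    per_element = {}
    for switch, groups in inventory.items():
        rules = [
            {"interface": port,
             "permit_from_group": src_group,
             "tcp_port": policy["tcp_port"]}
            for port in groups.get(dst_group, [])
        ]
        if rules:                      # switches without the destination group get nothing
            per_element[switch] = rules
    return per_element

print(compile_policy(network_wide_policy, inventory))
# {'leaf-02': [{'interface': 'et-0/0/7', 'permit_from_group': 'web-tier', 'tcp_port': 5432}]}
```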

We started down this path a few years ago with abstracted policies we call Affinities: policies that are defined network-wide, without any specificity about location or which elements to apply the policy to. In Plexxi Control, similar concepts exist for some of the provisioning: certain types of data are considered global, used and applied network-wide without you worrying which elements they apply to.
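
As a purely hypothetical illustration (not Plexxi Control's actual data model), "global" provisioning data would be declared once, with no switch named anywhere, and rendered per element by the controller:

```python
# Declared once for the whole network; the operator never says which
# switches it applies to. Names and values are invented for illustration.
global_provisioning = {
    "vlans": {110: "web-tier", 220: "db-tier"},
    "dns_servers": ["10.0.0.53"],
}

def render_element_provisioning(element_name, global_data):
    """Merge the global data into one element's provisioning document."""
    return {
        "element": element_name,
        "vlans": global_data["vlans"],
        "dns_servers": global_data["dns_servers"],
    }

for switch in ("leaf-01", "leaf-02", "spine-01"):
    print(render_element_provisioning(switch, global_provisioning))
```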

Centralized or federated CP&P provides huge benefits and potential. All the backend tools exist to make this safe, replicated, audited and logged, better than any legacy network system does. We just have to get our minds past the change in control and stop expecting the element to be the master source of its own data. It is one of the many things we need to accept to allow the network to change.

The post Changing the way we configure and provision our networks appeared first on Plexxi.

Read the original blog entry...

More Stories By Marten Terpstra

Marten Terpstra is a Product Management Director at Plexxi Inc. Marten has extensive knowledge of the architecture, design, deployment and management of enterprise and carrier networks.
