Network Services, Abstracted and Consumable

The network has traditionally been very static and simplistic in its offerings

Perhaps not as popular as its brothers and sisters I, P and S, Network-as-a-Service, or NaaS, has slowly started to appear in industry press, articles and presentations. While sometimes associated with a hypervisor-based overlay solution, its definition is not very clear, which is not at all surprising; our industry does not do too well at defining new terms. I ran across this presentation from Usenix 2012 that details a NaaS solution adding a software forwarding engine to switches and routers to provide specific services for some well-known cloud computing workloads.

I have some serious reservations about the specific implementation of the network services provided in this presentation, but the overall idea of specific network services delivered to applications and workloads resonates well with me. Unless this is your first visit to our blog, your reaction is probably “duh, this is what Affinity Networking is all about”. Of course it is.

The network has traditionally been very static and simplistic in its offerings. The vast majority of networks run with an extremely small set of network services. Find me a network that uses more than some basic QoS based on queueing strategies, IP Multicast (and many understandably avoid it as much as they can), and perhaps some VRFs, and we will probably agree that it is the exception rather than the rule. And I deliberately exclude the actual underlying technologies used to accomplish this; those don’t change the service, they just enable it.

And it is not that networks are not capable of providing other services. Most hardware in use is capable of doing so much more, and in many cases the configuration to enable it is even available. Extremely elaborate protocols exist to manage additional services, with new ones being developed constantly. And you can find paper after paper showing that specific network services can greatly improve overall solution performance. Many of these examples are based on big-data-type solutions, but I am pretty sure that translates to just about every solution that has a significant dependence on the network.

So why then do we not have a much richer set of network services available to the consumer of the networks?

There are probably multiple answers, but the one that keeps bubbling to the top each time we look at this is abstraction. In simple terms, we have not made network services easy to create, easy to maintain, easy to debug, and most importantly, we have not made network services easy to consume. We talk about devops, and the fact is that the creation, debugging and maintenance of complex network services inside the core of a network is not at all trivial. Per the examples above, getting end-to-end QoS (consistent queueing, really) in place seems like a simple task, but it is not. And that is technology that has been around for well over a decade. Configuring each and every switch to ensure it has the same queueing configuration and behaviors, adjusting drop rates and queue lengths based on where a switch fits into the network, and defining which applications should fit into which queue is complex not because of the topic itself, but because of the number of touch points, the number of configuration steps, and the switch-by-switch, hop-by-hop mechanisms by which we deploy it. This is where devops will start.
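
To make the touch-point problem concrete, here is a minimal sketch (in Python, with entirely hypothetical switch roles, queue numbers and tuning values) of how a single intended queueing behavior ends up being rendered as separate configuration for every device on the path. Nothing here is a real switch CLI or vendor API; it only illustrates how the touch points multiply.

```python
# Minimal sketch: one intended QoS behavior rendered hop by hop.
# Switch roles, queue numbers and tuning values below are hypothetical.
from dataclasses import dataclass

@dataclass
class Switch:
    name: str
    role: str  # "access", "aggregation" or "core"

# The single intent: which traffic class lands in which queue.
CLASS_TO_QUEUE = {"voice": 5, "storage": 3, "bulk": 1, "default": 0}

# Per-role knobs that have to be tuned hop by hop, not per intent.
ROLE_TUNING = {
    "access":      {"queue_depth": 2000,  "drop_probability": 0.02},
    "aggregation": {"queue_depth": 8000,  "drop_probability": 0.05},
    "core":        {"queue_depth": 16000, "drop_probability": 0.10},
}

def render_qos_config(switch: Switch) -> list[str]:
    """Render the one intended policy into one switch's configuration lines."""
    tuning = ROLE_TUNING[switch.role]
    lines = [f"! {switch.name} ({switch.role})"]
    for traffic_class, queue in CLASS_TO_QUEUE.items():
        lines.append(
            f"class {traffic_class} -> queue {queue} "
            f"depth {tuning['queue_depth']} drop {tuning['drop_probability']}"
        )
    return lines

if __name__ == "__main__":
    fabric = [Switch("leaf-01", "access"), Switch("spine-01", "core"),
              Switch("leaf-02", "access")]
    # One intent, N devices, N separate chances to get it subtly wrong.
    for sw in fabric:
        print("\n".join(render_qos_config(sw)))
```

The policy itself is trivial; the work, and the risk, is in repeating and adjusting it for every hop.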

But you also have to look at it from the other side. In the first few slides of the above-mentioned presentation, the presenter shows that the network engineer and the application engineer have wildly different views of the network. As they should. The application engineer should not need to know any of the ins and outs of the network and its behavior. He or she should be presented with an entity that provides connectivity and a set of network services it offers. And it should be trivial to attach to any of these services without having to understand network terms. An application engineer should not need to know that DSCP bits must be set to get a certain priority behavior, or have to request from the network folks that a set of IP or Ethernet endpoints requires lossless connectivity and must therefore be placed onto network paths that support PFC and QCN to enable RDMA over Ethernet or even FCoE.
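
As a rough illustration of that boundary, the sketch below assumes a small, made-up catalog that maps application-facing service names to the network mechanisms that back them. The service names, DSCP values and flags are placeholders, not any real product's API.

```python
# Illustrative only: application-facing service names on one side,
# network mechanisms (DSCP marking, lossless/PFC requirement) on the other.
NETWORK_SERVICE_CATALOG = {
    "low-latency": {"dscp": 46, "lossless": False},  # EF-style priority
    "lossless":    {"dscp": 26, "lossless": True},   # needs PFC-capable paths
    "best-effort": {"dscp": 0,  "lossless": False},
}

def resolve_service(requested: str) -> dict:
    """Translate an application-level service name into network mechanisms."""
    try:
        return NETWORK_SERVICE_CATALOG[requested]
    except KeyError:
        raise ValueError(f"unknown network service: {requested!r}")

# The application engineer only ever deals with the name on the left:
print(resolve_service("lossless"))  # {'dscp': 26, 'lossless': True}
```

The application engineer asks for “lossless”; how that becomes DSCP markings and PFC-capable paths is the network's problem, which is exactly the point.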

These types of services need to become extremely easy to consume. The architect of a very large private cloud described his ideal model by which applications (and he supports thousands of them) would consume network services. He envisioned an application registration model (through a portal, for instance) where application developers could express in extremely simple, non-network terms what their application needed. Connectivity between components X and Y. The use of specific memory systems that have been predefined to use RDMA over Ethernet (and thus require lossless connectivity). This application consists of N components that need PCI compliance and therefore must be separated from the rest of the applications. You name it: application behavior expressed in terms as far removed as possible from the actual implementation of the tools used to enable that service in the network.
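
A minimal sketch of what such a registration could look like, assuming hypothetical field names and translations: the developer's declaration contains no DSCP, PFC or VLAN vocabulary, and the mapping into network work items happens entirely on the other side of the boundary.

```python
# Hypothetical registration model: the application declares its needs in
# non-network terms; a translation step turns them into network policy.
from dataclasses import dataclass, field

@dataclass
class AppRegistration:
    name: str
    # Pairs of components that simply need to reach each other.
    connectivity: list[tuple[str, str]] = field(default_factory=list)
    # Components using RDMA-over-Ethernet memory systems (implies lossless paths).
    lossless_components: list[str] = field(default_factory=list)
    # Components that must be isolated from other applications (e.g. PCI scope).
    isolated_components: list[str] = field(default_factory=list)

def translate(reg: AppRegistration) -> list[str]:
    """Turn the developer's declaration into network-side work items."""
    actions = [f"permit {reg.name}:{a} <-> {reg.name}:{b}"
               for a, b in reg.connectivity]
    if reg.lossless_components:
        actions.append(f"place {reg.lossless_components} on PFC/QCN-capable paths")
    if reg.isolated_components:
        actions.append(f"segment {reg.isolated_components} from other applications")
    return actions

reg = AppRegistration(
    name="billing",
    connectivity=[("web", "api"), ("api", "db")],
    lossless_components=["db"],
    isolated_components=["api", "db"],
)
print("\n".join(translate(reg)))
```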

There is lots of work to do on both ends of this consumable network service model. For the network engineer, it needs to become much easier to enable these network services in a controllable and maintainable manner. Easy to design, easy to deploy, easy to debug and maintain. For the application engineer, it needs to become easy to consume these network services: simple and scalable registration and request mechanisms without a lot of network terminology. My post office comparison from a few weeks ago was perhaps very simplistic, but you have to admit, using the USPS is pretty simple. You walk up to the counter, there is a menu of shipment options, each with a price and an expected result; you pick what you want, they charge you for it, and off your package goes. And you don’t really worry or care too much about how, just that it’s being delivered in accordance with the service you paid for…

[Today's fun fact: Stewardesses is the longest common word that is typed with only the left hand. As a result it has been banished in favor of flight attendant.]


More Stories By Marten Terpstra

Marten Terpstra is a Product Management Director at Plexxi Inc. Marten has extensive knowledge of the architecture, design, deployment and management of enterprise and carrier networks.
