By Marten Terpstra
January 10, 2014 10:00 AM EST
For as long as I can remember, networking has struggled with the balance between aggregated and individual traffic flows. Constrained by the capabilities of the technology components we use, we have been forced to aggregate, only to de-aggregate or skip aggregation when technology caught up with or surpassed the needs of the day.
The vast majority of networking equipment is driven by specialized hardware. For datacenter switches, speed and port density drive the requirements, and physics and our technology capabilities create trade-offs that ultimately lead to some form of aggregation. Higher speed and more ports are traded off against memory, table space and functionality. These trade-offs will always exist, no matter what we are trying to build. Server-based networking will have oodles of memory and table space to do very specific things for many, many flows, making it extremely flexible, but those same servers cannot touch the packet processing speeds of the specialized packet processing hardware from Broadcom, Intel or Marvell, or the custom ASICs from Cisco, Juniper, or most anyone else.
So like it or not, we will want to do more than our hardware is capable of, and as a result we create aggregation points in the network where we lump a bunch of flows together into an aggregate flow and start making decisions on that aggregate. Nothing new here: even good ole IP forwarding works this way, making a single decision for all flows destined to the same prefix.
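As a toy illustration of that aggregate decision-making, here is a longest-prefix lookup in a few lines of Python. The prefixes and next-hop names are invented for this sketch; the point is that one entry answers for every flow behind it.

```python
import ipaddress

# A toy forwarding table: decisions are made per prefix (an aggregate
# of flows), not per individual flow. Prefixes and next-hop names are
# illustrative, not taken from any real network.
FIB = {
    ipaddress.ip_network("10.0.0.0/8"): "core-uplink",
    ipaddress.ip_network("10.1.0.0/16"): "pod-1-spine",
    ipaddress.ip_network("10.1.2.0/24"): "rack-42-tor",
}

def lookup(dst: str) -> str:
    """Longest-prefix match: the most specific aggregate wins."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in FIB if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return FIB[best]
```

Every flow to 10.1.2.7, regardless of port or protocol, gets the same answer ("rack-42-tor" here); any detail below the matching prefix is invisible to the decision.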
Network tunnels are the most obvious examples of aggregation; their purpose is to hide information from intermediate networking equipment. In some cases we hide it to keep our table sizes under control; in others we hide it because we do not want the intermediate equipment to see what we are transporting (IPsec, SSL, etc.). And even when the intermediate systems can see everything that is there, managing the complexity of that visibility simply becomes too expensive. This is why networks that are entirely managed and controlled per flow do not really exist at any reasonable scale, and probably never will.
For the exact same reason we aggregate, we lose the ability to act on specifics. When our tables are not large enough to track each and every flow, we can only make decisions based on what we have decided the flows keep in common. With tunnels, the tunnel endpoints put new headers onto the original packets, and intermediate systems can only act (with minor exceptions) on the information provided in these new headers. The original detail is still there and often visible to the intermediate system, but the intermediate system does not have the capacity to act on the sheer volume of that detail.
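A minimal sketch of that loss of detail, with invented header fields: once the endpoint wraps a packet, an intermediate system has only the outer header to act on, so two very different inner flows between the same endpoints look identical to it.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Header:
    src: str
    dst: str

@dataclass
class Packet:
    header: Header
    payload: Union["Packet", bytes]  # an inner packet, or raw data

def encapsulate(inner: Packet, tun_src: str, tun_dst: str) -> Packet:
    # Tunnel endpoint: put a new header on the original packet.
    # The inner header is still there, but it is now just payload.
    return Packet(Header(tun_src, tun_dst), inner)

def intermediate_forward(pkt: Packet) -> str:
    # An intermediate system acts only on the outer header, so every
    # tunneled flow between the same endpoints gets the same decision.
    return f"next hop for {pkt.header.dst}"
```

Wrapping two packets with different inner sources and destinations between the same tunnel endpoints yields the same forwarding decision for both, even though the inner detail is still physically present in the payload.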
And there is the struggle. If I have more information, I can make better decisions. But when I aggregate because I cannot handle that extra information (due to sheer size or management complexity), my decisions by definition become coarser and, as a result, less accurate. But we want it all. We want the power to make decisions based on the most specific information we can get, but we want to aggregate for operational simplicity or because our hardware dictates it. And this is where we get creative and start to turn what used to be black and white into gray.
There is nothing wrong with attempting to act on specifics for aggregate flows, but in so many cases it's done as an afterthought and becomes hard to manage, control or specify. Some of the techniques we use are fairly clean, like taking the DSCP value from a packet and replicating it in the outer header of that same packet in a tunnel. Others are far more obscure, like calculating a hash over a packet's headers and using it as the UDP source port of the VXLAN-encapsulated version of that packet. In still other cases, the original internals may be completely invisible to intermediate systems. STT, for instance, re-uses the format of TCP packets for its own purposes, but a side effect of using it as a streaming-like protocol is that the original packet headers may not actually appear in an IP packet on the wire. The STT header provides a 64-bit Context ID that can carry some bits of information from the original packet, but that header only appears in the first of what could be many individual packets that are re-assembled in the receiving NIC. Over the Christmas break I spent some time looking at each of the overlay formats and the tools modern-day packet processors give you to act on these headers. I will share some of this in this forum next week.
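The hash-to-source-port technique can be sketched in a few lines. This is a toy version, not any vendor's implementation: the choice of CRC32 as the hash is an assumption, though the port range follows the VXLAN RFC's recommendation to draw the outer UDP source port from the dynamic range. The effect is that intermediate ECMP hashing on the outer header still spreads the inner flows across paths, without the intermediate system ever parsing the inner packet.

```python
import struct
import zlib

# RFC 7348 recommends the dynamic/private port range for the outer
# UDP source port: 49152-65535.
PORT_MIN, PORT_RANGE = 49152, 16384

def entropy_source_port(src_ip: str, dst_ip: str, proto: int,
                        src_port: int, dst_port: int) -> int:
    """Fold the inner 5-tuple into the outer UDP source port.

    zlib.crc32 stands in for whatever hash a real encapsulator uses;
    only the IPv4 case is handled in this sketch.
    """
    key = struct.pack("!4s4sBHH",
                      bytes(int(o) for o in src_ip.split(".")),
                      bytes(int(o) for o in dst_ip.split(".")),
                      proto, src_port, dst_port)
    return PORT_MIN + (zlib.crc32(key) % PORT_RANGE)
```

The mapping is deterministic per flow (packets of one flow stay on one path, avoiding reordering) while different inner flows between the same tunnel endpoints will generally land on different outer source ports, and therefore on different ECMP paths.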
Ultimately, overlay networks are creating a renewed emphasis on the choice between aggregation and individuality. They are designed specifically to allow for more complex and scaled networks by hiding a lot of the detail from the dedicated network hardware, but that comes at the price of less granular decisions by that hardware, which can certainly lead to less-than-optimal use of the available network.
[Today's fun fact: In the Netherlands, there is a 40% higher chance of homeowner insurance claims on the home owner's birthday. Those are some good parties.]