By Marten Terpstra
January 10, 2014 10:00 AM EST
For as long as I can remember, networking has struggled with the balance between aggregated and individual traffic flows. Constrained by the capabilities of the technology components we use, we have been forced to aggregate, only to de-aggregate or skip aggregation altogether once technology caught up with, or surpassed, the needs of the day.
The vast majority of networking equipment is driven by specialized hardware. For datacenter switches, speed and port density drive the requirements, and physics and our technology capabilities create trade-offs that ultimately lead to some form of aggregation. Higher speed and more ports are traded off against memory, table space and functionality. These trade-offs will always exist, no matter what we are trying to build. Server-based networking will have oodles of memory and table space to do very specific things for many, many flows, making it extremely flexible, but those same servers cannot touch the packet processing speeds of the specialized packet processing hardware from Broadcom, Intel or Marvell, or the custom ASICs from Cisco, Juniper and almost anyone else.
So like it or not, we will want to do more than our hardware is capable of, and as a result we create aggregation points in the network where we lump a bunch of flows together into an aggregate flow and start making decisions on that aggregate. This is nothing new: even good ol' IP forwarding works on aggregate sets of flows, making a single decision for all flows destined to a specific prefix.
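To make that concrete, here is a minimal sketch of longest-prefix-match forwarding (in Python, with a made-up three-entry table; no vendor's actual implementation): thousands of distinct flows, yet the device makes only as many decisions as it has prefixes.

```python
# Illustrative sketch: a forwarding table makes one decision for an
# aggregate of flows. Every destination under the same prefix shares
# the same next hop. Table contents are hypothetical.
import ipaddress

FIB = {
    ipaddress.ip_network("10.0.0.0/8"): "eth0",
    ipaddress.ip_network("10.1.0.0/16"): "eth1",
    ipaddress.ip_network("0.0.0.0/0"): "eth2",   # default route
}

def lookup(dst: str) -> str:
    """Longest-prefix match: the most specific aggregate wins."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in FIB if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return FIB[best]

# Countless distinct flows, three table entries, three possible decisions:
print(lookup("10.1.2.3"))    # eth1 (covered by the 10.1.0.0/16 entry)
print(lookup("10.200.0.9"))  # eth0 (falls back to the /8 aggregate)
```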
Network tunnels are the most obvious examples of aggregation; their purpose is to hide information from intermediate networking equipment. In some cases we hide it to keep our table sizes under control, in others we hide it because we do not want the intermediate equipment to be able to see what we are transporting (IPsec, SSL, etc.). And even when the intermediate systems can see everything that is there, managing the complexity of that visibility simply becomes too expensive. This is why networks that are entirely managed and controlled per flow do not really exist at any reasonable scale, and probably never will.
For the exact same reasons we aggregate, we lose the ability to act on specifics. When our tables are not large enough to track each and every flow, we can only make decisions based on what we have decided to keep in common. In the case of tunnels, the tunnel endpoints put new headers onto the original packets, and intermediate systems can only act (with minor exceptions) on the information provided in those new headers. The original detail is still there, and often visible to the intermediate system, but the intermediate system does not have the capacity to act on the sheer volume of that detail.
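A toy example makes the point. In the sketch below (illustrative only, not any real encapsulation format), the tunnel endpoint wraps the original packet in a new outer header, and a transit device forwards purely on that outer header; the inner detail is just along for the ride.

```python
# Toy illustration of why transit devices lose the detail: the tunnel
# endpoint turns the entire original packet, headers and all, into an
# opaque payload behind a new outer header. Names and format are made up.

def encapsulate(inner_packet: bytes, outer_src: str, outer_dst: str) -> dict:
    # The original headers become invisible payload to anyone
    # forwarding on the outer header alone.
    return {"src": outer_src, "dst": outer_dst, "payload": inner_packet}

def transit_forward(packet: dict) -> str:
    # A transit router bases its decision purely on the outer header;
    # the inner five-tuple inside "payload" plays no role here.
    return f"forwarding {packet['src']} -> {packet['dst']}"

original = b"original IP header and payload"
tunneled = encapsulate(original, "192.0.2.1", "198.51.100.2")
print(transit_forward(tunneled))  # one decision for every flow in the tunnel
```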
And there is the struggle. If I have more information, I can make better decisions. But when I aggregate because I cannot handle that extra information (due to sheer size or management complexity), my decisions by definition become coarser and, as a result, less accurate. But we want it all. We want the power to make decisions based on the most specific information we can get, yet we want to aggregate for operational simplicity or because our hardware dictates it. And this is where we get creative and start to turn what used to be black and white into gray.
There is nothing wrong with attempting to act on specifics for aggregate flows, but in so many cases it is done as an afterthought and becomes hard to manage, control or specify. Some of the techniques we use are fairly clean, like taking the DSCP value from a packet and replicating it in the outer header of that same packet in a tunnel. Others are far more obscure, like calculating a hash over a packet's headers and using it as the UDP source port of the VXLAN-encapsulated version of that packet. In still others, the original internals may be completely invisible to intermediate systems. STT, for instance, re-uses the format of TCP packets for its own purposes, but a side effect of using it as a streaming-like protocol is that the original packet headers may not actually appear in an IP packet on the wire. The STT header provides a 64-bit Context ID that can carry some bits of information from the original packet, but that header only appears in the first of what could be many individual packets that are re-assembled in the receiving NIC. Over the Christmas break I spent some time looking at each of the overlay formats and the tools modern-day packet processors give you to act on these headers. I will share some of this in this forum next week.
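In the meantime, the VXLAN source-port trick mentioned above is easy to sketch. The snippet below is a simplified illustration, not any particular implementation's hash: it computes a hash over the inner flow's five-tuple and folds it into the outer UDP source port, which is what lets ECMP hashing in the core spread tunneled flows it cannot otherwise tell apart.

```python
# Hedged sketch of VXLAN outer-header entropy: hash the inner five-tuple
# and carry the result as the outer UDP source port. The hash function
# and field encoding here are illustrative choices, not from the spec.
import zlib

VXLAN_DST_PORT = 4789  # IANA-assigned VXLAN destination port

def entropy_source_port(src_ip: str, dst_ip: str,
                        proto: int, sport: int, dport: int) -> int:
    five_tuple = f"{src_ip}|{dst_ip}|{proto}|{sport}|{dport}".encode()
    h = zlib.crc32(five_tuple)
    # RFC 7348 recommends keeping the outer source port in the ephemeral
    # range (49152-65535); fold the hash into that range.
    return 49152 + (h % 16384)

# Two inner flows between the same endpoints land on different outer
# ports, so a transit router's ECMP hash can place them on different paths:
print(entropy_source_port("10.0.0.1", "10.0.0.2", 6, 33000, 80))
print(entropy_source_port("10.0.0.1", "10.0.0.2", 6, 33001, 443))
```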
Ultimately, overlay networks are creating a renewed emphasis on the choice between aggregation and individuality. Designed specifically to allow for more complex and scaled-out networks, they hide a lot of detail from the dedicated network hardware, and that comes at the price of less granular decisions by that hardware, which can certainly lead to less-than-optimal use of the available network.
[Today's fun fact: In the Netherlands, there is a 40% higher chance of homeowner insurance claims on the homeowner's birthday. Those are some good parties.]