
Square Pegs and Round Holes (Network and Applications)

One size does not fit all

Shannon Poulin, Intel's VP of the Data Center and Connected Systems Group and General Manager of the Data Center Marketing Group, gave the keynote address at Data Center World Fall on transforming the data center for a services-oriented world. Now, that was interesting enough in itself, and of course it touched on SDN and cloud. But what really grabbed my attention was the focus on processor design and how the decomposition of applications and the network into services and functions is changing the way processors and board-level components are designed.

You see, it turns out that one size does not fit all, and the varying resource and processing models of different types of "things" have an impact on how you put a machine together.

I mean, it's all well and good to say commoditized x86 is the future and that white box machines are going to be the basis for our virtualized data center, but the reality is that this isn't a sound design assumption. Why? Because applications are not switches - and vice versa.

I/O versus COMPUTE
Switches are, by their nature, highly dependent on I/O. They need a lot of it. Like Gbps of it. Because what they do is push a lot of data across the network. Applications, on the other hand, need lots and lots of memory and processing power, because what they do is mostly process lots of user requests, each of which eats up memory. What Poulin discussed in his keynote was that this diversity has not gone unnoticed, and that Intel is working on processor and board designs that specifically address the unique needs of networks and applications.
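
To make that contrast concrete, here's a minimal sketch (in Python, with invented numbers and names - nothing here comes from Poulin's keynote) of how you might characterize the two workload types by their dominant resource axes:

```python
from dataclasses import dataclass

@dataclass
class ResourceProfile:
    """Rough resource demands of a workload class (illustrative units only)."""
    io_gbps: float    # sustained network I/O
    cpu_cores: int    # processing parallelism
    memory_gb: int    # working set for in-flight requests and state

# Switches: dominated by packet-pushing I/O, modest compute and memory.
SWITCH = ResourceProfile(io_gbps=100.0, cpu_cores=4, memory_gb=8)

# Applications: dominated by per-request compute and memory, modest I/O.
APPLICATION = ResourceProfile(io_gbps=5.0, cpu_cores=32, memory_gb=256)
```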

What's important for folks to recognize - and take into consideration - in the meantime is that one size does not fit all, and that the pipe dream of a commoditized x86-based "fabric" of resources isn't necessarily going to work. A single "resource fabric" can't serve both network and applications, because network functions and applications have vastly different compute and I/O needs.

Which means no matter what you do, you can't have a homogenized resource fabric in the data center from which to provision willy-nilly for both network and applications. You need specific sets of resources designated for high-I/O functions and others for high processing and memory usage.
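
Continuing the illustrative sketch above, profile-aware pool selection might look something like this - the threshold is arbitrary, just there to make the point:

```python
def select_pool(profile: ResourceProfile) -> str:
    """Route a workload to the pool matching its dominant resource need.
    The 40 Gbps cutoff is invented, purely for illustration."""
    if profile.io_gbps >= 40.0:
        return "network-pool"        # high-I/O hardware
    return "application-pool"        # compute/memory-rich hardware

print(select_pool(SWITCH))           # network-pool
print(select_pool(APPLICATION))      # application-pool
```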

And, just to throw a wrench into the works, you've also got to consider that many "network" services aren't as "networky" as they are "application." They're in the middle: layer 4-7 services like application acceleration, load balancing, firewalling and application security. These are high I/O, yes, but they're also compute intensive, performing a variety of processing on data traversing the network.

This was somewhat glossed over in Poulin's keynote. The focus on network versus compute is easier, after all, because there's a clear delineation between the two. Layer 4-7 services, though key to modern data centers, are more difficult to bucketize in terms of the compute and I/O they require.

Depending on what the application service (layer 4-7) is focused on - an application delivery firewall needs a lot of I/O to defend against network and application DDoS, while a web application firewall needs lots of processing power to scan and evaluate data for threats - each may have different needs in terms of compute and network resources as well.
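
Extending the same hypothetical profiles (the numbers are invented, not measurements), you can see how these layer 4-7 services land awkwardly between the two pools:

```python
# An application delivery firewall: high I/O to absorb volumetric DDoS,
# but with real compute behind it too.
APP_DELIVERY_FIREWALL = ResourceProfile(io_gbps=80.0, cpu_cores=16, memory_gb=64)

# A web application firewall: deep payload inspection makes it compute-bound,
# though it still pushes meaningful traffic.
WEB_APP_FIREWALL = ResourceProfile(io_gbps=20.0, cpu_cores=32, memory_gb=128)

# The naive two-pool selector scatters these "middle" services across pools,
# even though both are layer 4-7 application services - which is the problem.
print(select_pool(APP_DELIVERY_FIREWALL))  # network-pool
print(select_pool(WEB_APP_FIREWALL))       # application-pool
```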

What I heard from Poulin is that Intel recognizes this and is focusing resources on developing board-level components, like processors, that are specifically designed to address the unique processing and network needs of both application services and network functions.

What I didn't hear was how such processors and components would address the unique needs of all the services and functions that fall in the middle, for which "general purpose" is not a good fit, but neither is a network-heavy or compute-heavy system. Indeed, the changing landscape in application architecture - the decomposition into services and an API-over-data approach - is changing applications, too. An API focused on data access is much more network (I/O) heavy than a traditional web application.
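
In the same illustrative terms (invented numbers again), that shift shows up on the application side too:

```python
# A data-access API: mostly moving bytes, comparatively little per-request work.
DATA_API = ResourceProfile(io_gbps=30.0, cpu_cores=8, memory_gb=32)

# A traditional web application: per-request processing and session state dominate.
WEB_APP = ResourceProfile(io_gbps=5.0, cpu_cores=32, memory_gb=256)
```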

I'm all for specialization as a means to overcome limitations inherent in general purpose compute when tasked with specialized functions, but let's not overlook that one size does not fit all. We're going to need (for the foreseeable future, anyway) pools (and/or fabrics) made of resources appropriate to the workloads for which they will be primarily responsible.

Read the original blog entry...

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
