Plexxi Pulse—SDN Edges into Federal Agencies

As SDN gains traction in the private sector, we are also seeing federal agencies adopt it as they identify the need for network infrastructure changes. While it may be instinctual to say ‘yes’ to every new feature offered when upgrading the network, especially when managing large amounts of traffic, focusing on simplicity and intuitiveness is often the better option. We revisit this in more depth below – let us know if you agree with our points. Enjoy!

In this week’s PlexxiTube of the Week, Dan Bachman explains how Plexxi’s big data fabric solution is managed compared with more traditional tiered architectures.

Federal IT Networks: Simpler Is Better

In an article for InformationWeek, Elena Malykhina cites network complexity as a challenge among federal agencies. The biggest reason complexity grows unchecked in situations like this is that people typically add to their IT infrastructure far more often than they subtract from it. When something doesn’t work quite right, you add a workaround rather than digging into the actual problem and identifying the best solution. When you need a new capability, it’s instinctive to simply add a new feature. This incremental growth leads to IT sprawl. We need to remove things as frequently as we add them (and arguably more frequently, given the architectural debt we have accumulated).

With additional complexity, interoperability suffers. For example, if your deployment relies on 600 features, two solutions that each support 599 of them can still fail to interoperate seamlessly because of feature number 600. Part of the hope for SDN is that it levels the architectural playing field by removing the reliance on these features. SDN does, however, require a rethinking of architecture and procurement practices.
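To make that concrete, here is a minimal sketch in Python (the feature names and counts are purely hypothetical): interoperability only holds when every required feature is supported on both sides, so a single gap out of hundreds is enough to break it.

    # Hypothetical illustration: interoperability requires every required
    # feature to be supported by both vendors' devices.
    required = {f"feature-{i}" for i in range(1, 601)}

    vendor_a = set(required)                    # supports all 600 features
    vendor_b = set(required) - {"feature-600"}  # supports 599 of the 600

    missing = required - (vendor_a & vendor_b)
    print("Interoperable:", not missing)   # False
    print("Blocked by:", sorted(missing))  # ['feature-600']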

So, what’s a good first step? Consider starting from scratch rather than just adding SDN as a new section to an already-overloaded document.

Refresh Cycles Lag As Enterprises Retain More Legacy Network Gear

In an article this week for SearchNetworking, Jessica Scarpati reported that nearly half of existing network devices are aging or obsolete. Her point isn’t terribly surprising. In my opinion, failures happen when things change, not necessarily just because of age. When people find a configuration or architecture that works, they don’t touch it. Interestingly, this is why the most stable time of year is Christmas – when employees go home and don’t touch anything.

Some of the refresh cycle delays that Jessica comments on are tied at least in part to SDN, just not in the way that people think. In my opinion, these decisions get delayed when there are more options. This is why fast food restaurants typically go with more limited menus; people move faster when they have fewer choices. SDN brought a lot of new players into the space, which actually increases choice. This will necessarily lengthen the evaluation cycle, even if people are only considering one or two additional players beyond their incumbent.

Internet of Overwhelming Things

In an opinion piece for Network World, Pete Bartolik referenced SDN as a vehicle to generate openness in the data center. While I agree that SDN can generate openness, I think there are two other major things SDN provides: improved intelligence through central control, and automated workflows.

The former is actually only interesting in cases where the underlying shuffling of packets is different from what it is today. Having more intelligence but using the same basic forwarding constructs does not result in significant change. Part of the appeal of fabrics (and why Brocade sponsored this post, no doubt) is that you can do intelligent things within the fabric to better deal with how traffic is shunted. Legacy networks are built around protocols that are decades old. The future cannot be built entirely of the current set of building blocks.

Second, the question for users will be what to automate. Too many people view automation as scripting and keystroke removal. Automation really ought to be about smoothing out the boundaries between systems. The biggest gains will not come if automation is confined to the network alone; in time, it needs to reach into compute, storage, and applications.
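As a rough sketch of what that looks like (every function, name, and data structure here is a hypothetical placeholder, not any particular product’s API), cross-domain automation coordinates compute, storage, and network from a single application intent rather than replaying CLI keystrokes on one device:

    # Hypothetical sketch of cross-domain automation: the value comes from
    # coordinating systems, not from scripting keystrokes on one of them.
    def provision_application(app_name: str, tier_count: int) -> None:
        # 1. Compute: request one VM per application tier (placeholder names).
        vms = [f"{app_name}-tier{i}" for i in range(1, tier_count + 1)]

        # 2. Storage: attach a volume to each VM (placeholder naming scheme).
        volumes = {vm: f"{vm}-vol" for vm in vms}

        # 3. Network: derive segments and allowed flows from the application's
        #    tier structure instead of hand-configuring each switch.
        network_policy = {
            "segment": f"{app_name}-segment",
            "allowed_flows": [(vms[i], vms[i + 1]) for i in range(tier_count - 1)],
        }

        # 4. Hand the combined intent to an orchestrator in a single step.
        deployment = {"vms": vms, "volumes": volumes, "network": network_policy}
        print(f"Deploying {app_name}: {deployment}")

    provision_application("web-app", tier_count=3)

The point of the sketch is the shape of the workflow: one intent flows across system boundaries, which is where the real gains live.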

Agencies Set Building Blocks of the Software-Defined Enterprise

In a recent article for GCN, John Moore notes that government agencies are experimenting with SDN and converged infrastructure systems. The software part of software-defined anything is interesting, but not as interesting as the capabilities that come as a result. It is these capabilities that are driving interest among federal agencies.

Specifically, software-defined networking provides superior intelligence. To offer a metaphor, imagine that you are driving across a crowded metro area. You see brake lights and you roll to a stop. You might try a surface road, but the reality is that you don’t actually know if it will be any faster. You face a decision with unknown variables, and maybe you play the odds. Now imagine that your best friend is in a helicopter and can tell you where to go. That’s what an SDN controller can do. It provides a global view of the network, and that’s what these agencies are looking for. That intelligence is what will be the difference-maker.
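As a rough sketch of that helicopter view (the topology and link costs below are made up for illustration), a controller that holds the whole graph can compute the best end-to-end path, which no single hop can do from local information alone:

    # Hypothetical sketch: a controller with a global topology view picks the
    # least-congested end-to-end path instead of each hop guessing locally.
    import heapq

    # Link costs stand in for current congestion (made-up numbers).
    topology = {
        "A": {"B": 5, "C": 1},
        "B": {"A": 5, "D": 1},
        "C": {"A": 1, "D": 4},
        "D": {"B": 1, "C": 4},
    }

    def best_path(graph, src, dst):
        # Dijkstra's algorithm over the controller's global view of the network.
        queue = [(0, src, [src])]
        visited = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == dst:
                return cost, path
            if node in visited:
                continue
            visited.add(node)
            for neighbor, weight in graph[node].items():
                if neighbor not in visited:
                    heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
        return float("inf"), []

    print(best_path(topology, "A", "D"))  # (5, ['A', 'C', 'D'])

Here the controller steers traffic onto A-C-D because it can see that the direct A-B link is congested – the helicopter advantage in miniature.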


More Stories By Mat Mathews

Visionary solutions are built by visionary leaders. Plexxi co-founder and Vice President of Product Management Mat Mathews has spent 20 years in the networking industry observing, experimenting and ultimately honing his technology vision. The resulting product — a combination of traditional networking, software-defined networking and photonic switching — represents the best of Mat's career experiences. Prior to Plexxi, Mat held VP of Product Management roles at Arbor Networks and Crossbeam Systems. Mat began his career as a software engineer for Wellfleet Communications, building high-speed Frame Relay switches for the carrier market. Mat holds a Bachelor of Science in Computer Systems Engineering from the University of Massachusetts at Amherst.
