
The Relevance of OpenDaylight’s Hydrogen Release

OpenDaylight’s first release, appropriately named Hydrogen, is about to become available. It marks the initial release of the networking industry’s largest open source SDN initiative. While most of the attention will naturally flow toward the newly minted controller’s capabilities, the relevance of the Hydrogen release extends well beyond mere features and lines of code.

OpenDaylight’s first release of functional code is most notable because it is a release of functional code. SDN’s promise to unify heterogeneous environments under a well-orchestrated fabric of automated workflows has made it synonymous with Open Networking, and the requirement for widespread interoperability has placed much of SDN’s early architectural emphasis on industry standards.

Indeed, the most prominent industry body during SDN’s formative stages has been the Open Networking Foundation (ONF). This body, which includes a rich collection of both customers and vendors, has been integral in providing specifications for the OpenFlow protocol. The general idea is that wide adoption of a specified standard will allow companies to build towards a common blueprint, necessary if the end goal relies on any sort of interoperability.

One of the challenges with standards is that they have historically moved at a glacial pace. Consensus-based groups (even those that settle for rough consensus) don’t move forward until a majority agrees on a direction. When a technology is mature, its ideas well vetted, and its code in deployment for some time, reaching consensus is markedly easier. But when a technology is still nascent and the outcomes undetermined, how does a group reach consensus?

OpenDaylight is notable because, as a body, it is chartered not with defining standards but with delivering code. As such, it stands out among its organizational peers because its emphasis is on putting nascent technology into practice. That alone makes the seminal OpenDaylight release meaningful.

That is not to say that releasing code is necessarily more important than identifying standards and ensuring interoperability; there is, in fact, a place for both. But the path to widely adopted standards will be made meaningfully easier if the industry can collaborate on the code that supports those standards. It creates a common sandbox, forged in an open source community and supported across an array of vendor devices, in which new technologies can be developed, tested, and ultimately adopted or discarded. Standardization requires experimentation, and OpenDaylight is providing the most fertile laboratory in the industry.

The Hydrogen release’s relevance extends beyond the various industry bodies, though. Its commercial impact could be even more significant.

In the networking industry, the time from the identification of a concept to the first shipping product typically runs on the order of three years. OpenDaylight was formed in April 2013, which puts the time to its version 1.0 product at right around 10 months. Evaluated as a commercial entity, OpenDaylight has gotten a first-generation product out in a little less than one-third the time of most funded startups. And that includes all of the peripheral effort required to set up an open source community and reconcile a wide-ranging set of individual member interests. By any measure, shipping a first version this quickly is a feat in and of itself.

The most impactful point here is not the time to first release but rather what this means about the overall product trajectory. If the path from inception to first release is so fast, what does this foretell for future releases? It would appear that OpenDaylight is capable of covering more ground faster than its startup counterparts in the commercial SDN controller space.

For would-be OpenDaylight users, this means two comparisons need to be made: first, what are the various standalone controllers capable of today, and second, what will they be capable of one or two years from now?

Long-term commercial success is rarely a function of a static set of capabilities. Real success, especially in a rapidly evolving technology space, depends on the ability to innovate continuously. This would seem to favor OpenDaylight, which can not only draw ideas from all of its member companies but also pull development resources from a broader open source community, one not hamstrung by the internal corporate politics that dictate feature prioritization and R&D funding.

So while the initial wave of attention that OpenDaylight’s Hydrogen release will garner will undoubtedly focus on individual capabilities, it is likely the fact that there is working code at all, along with the trajectory for that code’s development, that will have the most profound impacts on the networking industry at large.

[Today's fun fact: The mask used by Michael Myers in the original "Halloween" movie was actually a Captain Kirk mask painted white. So if you are freaked out by William Shatner, now you know why.]

The post The Relevance of OpenDaylight’s Hydrogen Release appeared first on Plexxi.


More Stories By Michael Bushong

The best marketing efforts pair deep technical understanding with a highly approachable style of communicating. Plexxi’s Vice President of Marketing Michael Bushong acquired these skills over 12 years at Juniper Networks, where he led the product management, product strategy, and product marketing organizations for Juniper’s flagship operating system, Junos, and spent his last several years there leading Juniper’s SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase and at ASIC design tool companies Synopsys and Magma Design Automation. His undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
