
SDN Journal: Article

Bringing Software-Defined to the Data Center

Lower costs and increase control

"Software-defined," like any new trend that technology companies rush to attach to has suffered from marketing hype. Starting in mid-2012 with the acquisition of Nicera by VMware, most traditional infrastructure technology vendors across compute, networking, and storage have some messaging around how software-defined fits into their product strategy.

But software-defined is not a traditional concept. Since the transition from mainframe to distributed computing, which coincided with the rise of networking, most data center technologies have been hardware-specific. For many years, specialized hardware was the way to put the right amount of intelligence in the right place to execute a given function.

Software-defined is fairly self-explanatory: the value is in the software. One of the biggest benefits is the use of standard hardware, which costs a fraction of the vendor-specific hardware that has been the norm in data centers since the beginning of the 21st century. Standard hardware, from servers to networking devices, now offers so many resources that specialized hardware no longer provides the differentiation it once did.

Freedom from hardware also opens the door to extending the software-defined data center (SDDC) beyond the walls of the data center itself. SDDC helps organizations deliver a modern private cloud using the same model that large operators like Amazon and Rackspace use to deliver public cloud. What's more, choosing the right software-defined technologies enables hybrid cloud, the panacea for most enterprises. This is one of the main reasons that open source technologies, and those backed by strong standards, are emerging in the software-defined space.

That said, software-defined has the biggest opportunity to benefit private data centers, where the majority of applications simply cannot run in the public cloud due to policy or preference. Here are some best practices for applying software-defined principles to your private data center architecture:

Take advantage of your already efficient procurement: Odds are your organization has a go-to vendor or reseller of servers. Whether from HP, Dell, or a white box vendor like Supermicro, the premium paid on server infrastructure is much smaller than with storage. You may even have a volume purchase agreement, making hardware purchases for software-defined storage even more affordable. Buying infrastructure for SDDC is no different from buying through your existing channels for standard servers and networking equipment.

Insist on standards compliance to avoid future lock-in: SDDC is a real opportunity to reduce or eliminate lock-in among the technologies used in your data center. The best software-defined solutions are based on industry standards so freedom to change is retained. This is more than basic interoperability with a standard API, as that case still relies on the software vendor to keep up with changes. Avoid software-defined solutions that are vendor specific and limit your flexibility to integrate and innovate as this market continues to evolve.
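The lock-in point above can be made concrete in code. The sketch below is illustrative, not from the article: it shows an application coding against a small storage interface rather than a vendor SDK, so the backend can be swapped without touching application logic. The `ObjectStore` interface and `InMemoryStore` backend are hypothetical names invented for this example.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Hypothetical standard storage interface. Application code depends
    only on this contract, never on a vendor-specific client library."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for demonstration. A Swift- or S3-backed
    implementation would plug in behind the same interface."""

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]

# Application code sees only the interface, so swapping the backend
# later requires changing one line, not a rewrite.
store: ObjectStore = InMemoryStore()
store.put("report.csv", b"a,b\n1,2\n")
print(store.get("report.csv"))
```

The design choice here is the point of the best practice: as long as every backend honors the same standard contract, the freedom to change vendors is retained.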

Have a preference for open source technologies: Once dismissed by their proprietary competitors as immature, open source operating systems, middleware, application frameworks, and databases are now standards in the data center. The same trend will hold true for software-defined solutions. Using an open source technology does not preclude organizations from working with commercial vendors to support the success of open source in the data center architecture.

Where are the biggest opportunities to use software-defined? We believe it is storage, a solution area where very large premiums have been paid for many years based on the perception that specialized hardware was the only way to keep data safe and available. The biggest operators have proven the opposite: they can serve millions of concurrent users without downtime or data loss while using standard hardware and intelligent software.
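A rough back-of-the-envelope sketch, not from the article, suggests why replication in software lets commodity disks compete with specialized hardware on durability. The 4% annual failure rate is an assumed figure for illustration, and the model deliberately simplifies by treating failures as independent (real systems must also account for correlated failures and repair windows).

```python
def loss_probability(annual_failure_rate: float, replicas: int) -> float:
    """Probability that every replica of an object is lost in a year,
    assuming independent disk failures. A simplification: correlated
    failures and rebuild times matter in real deployments."""
    return annual_failure_rate ** replicas

# With an assumed 4% annual disk failure rate, each added replica
# cuts the loss probability by a factor of 25.
for n in (1, 2, 3):
    print(f"{n} replica(s): {loss_probability(0.04, n):.2e}")
```

The takeaway matches the argument above: the intelligence moves into software (placing and repairing replicas), so the individual disks no longer need to be exceptional.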

The reality is data is simply growing too fast and must be retained too long, at a cost that in many cases must be as close to zero as possible. Plus, unstructured data is the fastest growing, fueled by SaaS applications and the transition to mobile devices. Any private data center today is in direct competition with the operators of large public clouds. Internal users demand the flexibility and operating costs the big operators have proven are possible, so private operators must use the same software-defined strategy to remain competitive.

"Utility Computing" was a hyped trend at the start of this century that was before its time. Some might classify SDDC in the same category, all hype. Objectively, SDDC is a natural extension of cloud, and should prove to be equally as disruptive to the data center architecture as cloud has been. Software used for compute, networking, and storage will need to be as standard as the servers you buy today in order to fit into your data center architecture of the future.

More Stories By Joe Arnold

Joe Arnold founded SwiftStack to deploy high-scale, open-source cloud storage systems using OpenStack. He managed the first public OpenStack Swift launch independent of Rackspace, and has subsequently deployed multiple large-scale cloud storage systems. He is currently building tools to deploy and manage OpenStack Swift with his firm, SwiftStack.

