Future Proofing the Data Center

How to become agile in a rapidly changing storage world

More and more of our lives are lived online. Our music collections, bookshelves, vacation memories and more are increasingly digitized and uploaded into the cloud, the vast network of server farms that provide the bulk of online storage today. Research firm Gartner projects that by 2016, 36 percent of consumer content will be stored on the cloud, up from a mere seven percent in 2011.

Service providers, watching these trends with a wary eye, will need to accommodate ever-increasing demands for storage as consumer appetites for cloud content storage continue to grow. To adapt, many service providers are exploring new options in data center architecture that permit greater flexibility and control over hardware costs.

One such option is software-defined storage. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server "appliances" with software inextricably baked in.
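To make the idea concrete, here is a minimal sketch - purely hypothetical, not the design of any particular product - of a storage feature such as replication living entirely in a software layer on top of plain commodity servers. All class and method names are illustrative assumptions.

    # Hypothetical sketch: a storage feature (replication) implemented in
    # software rather than baked into an appliance. Names are illustrative.

    class CommodityNode:
        """A plain server with local disks and no proprietary appliance features."""

        def __init__(self, name):
            self.name = name
            self.blocks = {}          # block_id -> bytes

        def write(self, block_id, data):
            self.blocks[block_id] = data

        def read(self, block_id):
            return self.blocks.get(block_id)


    class SoftwareDefinedVolume:
        """Replication lives in this software layer, so it can run on any hardware."""

        def __init__(self, nodes, replicas=2):
            self.nodes = nodes
            self.replicas = replicas

        def write(self, block_id, data):
            # Place copies on N distinct nodes, chosen by a simple hash policy.
            targets = sorted(self.nodes, key=lambda n: hash((n.name, block_id)))
            for node in targets[:self.replicas]:
                node.write(block_id, data)

        def read(self, block_id):
            # Any surviving replica can serve the read.
            for node in self.nodes:
                data = node.read(block_id)
                if data is not None:
                    return data
            raise KeyError(block_id)


    if __name__ == "__main__":
        volume = SoftwareDefinedVolume([CommodityNode(f"node{i}") for i in range(4)])
        volume.write("block-42", b"hello")
        print(volume.read("block-42"))   # b'hello'

Because the replication logic sits in software, swapping the underlying servers for cheaper or newer hardware leaves this layer untouched - which is the essence of the software-defined argument.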

Software-Defined Storage: A Primer
Although the term "software-defined" may seem like a recent buzzword, many everyday electronic devices - such as personal computers - have been "software-defined" for years. On a PC, software can be installed on any hardware platform, allowing users to custom-tailor both the hardware and the software to their needs; the average PC can run Linux as its operating system if the owner so chooses. This gives the user greater freedom to allocate a budget precisely as the task at hand requires - whether toward a high-powered graphic design workstation or a lightweight machine for web browsing.

Despite these clear benefits in flexibility, data centers are one of the last frontiers for software-defined technologies. The reluctance to embrace the trend can be traced - as it so often is - to the initial expenditures required to make the switch. Given the sheer scale of infrastructure in service providers' data centers - giant warehouses in multiple locations across the globe - the outlays required to switch systems represent a very high investment indeed. Yet there is no denying the consumer trend toward ever-higher rates of online content storage.

The Status Quo
Existing data center architecture consists mainly of appliances. In industry parlance, an appliance is server hardware with proprietary, mandatory software baked in. The software is designed for the hardware and vice versa, and the two come tightly wedded together as a package. This can be a benefit for data centers that lack staff who specialize in server technology and therefore lack the expertise to configure a custom server deployment in-house. Yet since hardware inevitably fails (at any number of points within the machine), traditional appliances typically include multiple copies of expensive components to anticipate and prevent failure. These extra layers of identical hardware drive up energy costs and add complexity to each appliance. Because the actual cost per appliance is quite high compared with commodity servers, cost estimates often skyrocket when companies begin examining how to scale out their data centers.

It is due in large part to these problems with appliances that data center administrators are beginning to consider software-defined storage. "Software-defined" is far from a new concept, but to a data center ecosystem accustomed to appliances with mandatory, pre-set software, the approach is almost revolutionary.

Data center administrators interested in learning more about software-defined solutions are typically attracted initially to one or more of the following benefits:

1. Cost Reduction
As with many solutions, convenience comes with a price. Traditional appliances offer convenience in the form of a baked-in, standardized package, but the added costs associated with several layers of complicated, mandatory software coupled with high-powered hardware can lead to significant cost outlays for a data center that needs to scale rapidly.

Conversely, software-defined storage liberates the software from the hardware, allowing administrators to choose inexpensive commodity servers. When coupled with lightweight, efficient software solutions, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users' growing demand for storage.
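As a rough illustration of the arithmetic, the snippet below compares the two approaches using deliberately hypothetical prices. The figures are illustrative assumptions, not vendor quotes, and real costs will vary widely.

    # Back-of-the-envelope cost comparison; all prices are hypothetical
    # illustrations, not real quotes.

    required_capacity_tb = 1000          # usable capacity the data center must provide

    appliance_cost_per_tb = 900          # assumed bundled hardware-plus-software price
    commodity_cost_per_tb = 250          # assumed commodity server and disk price
    sds_license_per_tb = 150             # assumed software-defined storage license
    replication_overhead = 2.0           # two copies of every block, in both designs

    raw_tb = required_capacity_tb * replication_overhead

    appliance_total = raw_tb * appliance_cost_per_tb
    sds_total = raw_tb * (commodity_cost_per_tb + sds_license_per_tb)

    print(f"Appliance-based build:       ${appliance_total:,.0f}")
    print(f"Software-defined build:      ${sds_total:,.0f}")
    print(f"Savings under these figures: ${appliance_total - sds_total:,.0f}")

Under these assumed numbers the software-defined build comes in well under half the appliance cost; the point is not the specific figures but that the hardware and software line items can be optimized independently.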

2. Flexibility
Not every data center is created equal. A telco serving one particular area will have different storage needs than a major bank with branches in several countries, and a cloud services provider will have different needs still. While appliances might be good enough for many of these needs, fully uncoupling the software from the hardware can yield substantial gains in economies of scale.

Software-defined storage gives administrators the freedom to examine the needs of their business and to hand-select the specific components and software that best support their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company's needs.

3. Future Proofing
Budgets, network environments and corporate priorities all change in response to market demands. Having an expansive, rigid network environment locked into configurations determined by an outside vendor severely curtails the ability of the organization to react nimbly to market demands, much less anticipate them in a proactive manner.

The future of storage is already here. Clear trends point to ever-increasing demand for cheap storage, and if companies continue to rely on expensive, inflexible appliances in their data centers, they will be forced to lay out significant funds to build the capacity they need to meet customer demand.

Software-defined solutions offer an attractive alternative to companies looking to "future proof" their data centers. Since the hardware and the software are separate investments, either may be switched out to a better, more appropriate option as the market dictates, at minimal cost.

Software-Defined Storage and Globalization
Software-defined storage can also benefit companies with data centers all over the globe in novel - sometimes unexpected - ways.

Since cloud services are accessed from locations all over the world, service providers must operate data centers across the globe to minimize load times. With global availability, however, come a number of challenges. Users are typically served by the data center in their own region, yet the data stored in every location must remain in sync. Companies are also often required to keep certain data from leaving particular countries (such as Germany) or from being stored in others (such as Iran). In addition, global data centers must be resilient to localized disasters - such as a power outage - that take a local server farm offline. Finally, if a local data center or server goes down, the remaining data centers must reroute requests quickly to available servers to minimize downtime.

While there are certainly solutions today that solve these problems, they do so at the application layer. Attempting to solve these issues that high up in the hierarchy of data center infrastructure - instead of solving them at the storage level - presents significant cost and complexity disadvantages. Solving these issues directly at the storage level can reap dividends in efficiency, time and cost savings.
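As a thought experiment, the sketch below shows what a geo-aware placement policy at the storage layer might look like. It is a hypothetical illustration, not the behavior of any specific product, and it assumes each data center carries a region tag and each dataset a policy listing permitted regions.

    # Hypothetical sketch of geo-aware replica placement at the storage layer.
    # Region names and policy fields are illustrative assumptions.

    from dataclasses import dataclass


    @dataclass
    class DataCenter:
        name: str
        region: str
        online: bool = True


    @dataclass
    class PlacementPolicy:
        replicas: int
        allowed_regions: set        # e.g. data that must stay inside certain countries
        denied_regions: set         # e.g. regions where storage is prohibited


    def place_replicas(datacenters, policy):
        """Pick healthy data centers that satisfy the policy, or fail loudly."""
        candidates = [
            dc for dc in datacenters
            if dc.online
            and dc.region in policy.allowed_regions
            and dc.region not in policy.denied_regions
        ]
        if len(candidates) < policy.replicas:
            raise RuntimeError("Not enough compliant, healthy data centers")

        # Prefer one replica per distinct region for disaster resilience,
        # then fill any remaining slots from the other compliant sites.
        by_region = {}
        for dc in candidates:
            by_region.setdefault(dc.region, dc)
        chosen = list(by_region.values())[:policy.replicas]
        for dc in candidates:
            if len(chosen) >= policy.replicas:
                break
            if dc not in chosen:
                chosen.append(dc)
        return chosen


    if __name__ == "__main__":
        dcs = [
            DataCenter("fra-1", "de"),
            DataCenter("ber-1", "de", online=False),   # simulated local outage
            DataCenter("par-1", "fr"),
            DataCenter("nyc-1", "us"),
        ]
        policy = PlacementPolicy(replicas=2, allowed_regions={"de", "fr"},
                                 denied_regions=set())
        print([dc.name for dc in place_replicas(dcs, policy)])
        # ['fra-1', 'par-1'] - the offline German site is skipped automatically

Handled this way, compliance rules and failover decisions are enforced once, at the layer that actually owns the data, rather than re-implemented in every application that sits above it.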

Conclusion
These capabilities are just the beginning. Perceiving the potential of software-defined storage, many organizations are beginning to explore the next phase of data center implementation. For data center administrators facing these kinds of challenges, a software-defined approach to storage is worth a serious look.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions that are cost-effective for storing huge data sets. From 2004 to 2010 Stefan worked in this field for Storegate, a wide-reaching Internet-based storage service for consumer and business markets with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture for several projects at Ericsson, the Swedish telecommunications giant and world-leading provider of equipment and services to mobile and fixed network operators.
