Future Proofing the Data Center

How to become agile in a rapidly changing storage world

More and more of our lives are lived online. Our music collections, bookshelves, vacation memories and more are increasingly digitized and uploaded into the cloud, the vast network of server farms that provide the bulk of online storage today. Research firm Gartner projects that by 2016, 36 percent of consumer content will be stored on the cloud, up from a mere seven percent in 2011.

Service providers, watching these trends with a wary eye, will be required to accommodate ever-increasing demands for storage as consumer appetites for cloud content storage continue to grow. To adapt, many service providers are exploring new options in data center architecture that permit greater flexibility and control over hardware costs.

One such option is software-defined storage. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server "appliances" with software inextricably baked in.

Software-Defined Storage: A Primer
Although the term "software-defined" may seem like a recent buzzword, many everyday electronic devices - such as personal computers - have been "software-defined" for years. In the case of a PC, software can be installed on any hardware platform, allowing the user to tailor both the hardware and the software to his or her needs. The average PC can run Linux as its operating system if the owner so chooses, for example. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand - whether toward a high-powered graphic design workstation or a lightweight machine for web browsing.

Despite these clear benefits in flexibility, data centers are one of the last frontiers for software-defined technologies. The reluctance to embrace the trend can be traced - as it so often is - to the initial expenditures required to make the switch. Given the sheer scale of infrastructure in service providers' data centers - giant warehouses in multiple locations across the globe - the outlays required to switch systems represent a very high investment indeed. Yet there is no denying the consumer trend toward ever-higher rates of online content storage.

The Status Quo
Existing data center architecture consists mainly of appliances. In industry parlance, an appliance is server hardware with proprietary, mandatory software baked in. The software is designed for the hardware and vice versa, and the two come tightly wedded together as a package. This can be a benefit for data centers that lack staff specializing in server technology and therefore lack the tools necessary to configure a custom server deployment in-house. Yet because hardware inevitably fails (at a number of points within the machine), traditional appliances typically include multiple copies of expensive components to anticipate and prevent failure. These extra layers of identical hardware exact higher energy costs and add complexity to a single appliance. Because the cost per appliance is quite high compared with commodity servers, cost estimates often skyrocket when companies begin examining how to scale out their data centers.

It is due in large part to these problems with appliances that data center administrators are beginning to consider software-defined storage approaches. "Software-defined" is far from a new concept. To the data center ecosystem, however - accustomed to appliances with mandatory, pre-set software - a software-defined approach is almost revolutionary.

Data center administrators interested in learning more about software-defined solutions are typically attracted initially to one or more of the following benefits:

1. Cost Reduction
As with many solutions, convenience comes with a price. Traditional appliances offer convenience in the form of a baked-in, standardized package, but the added costs associated with several layers of complicated, mandatory software coupled with high-powered hardware can lead to significant cost outlays for a data center that needs to scale rapidly.

Conversely, software-defined storage liberates the software from the hardware, allowing administrators to choose inexpensive commodity servers. When coupled with lightweight, efficient software solutions, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users' growing demand for storage.
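To make the cost argument concrete, the back-of-the-envelope model below compares a small appliance-based cluster with a larger commodity scale-out cluster over a multi-year horizon. Every figure in it - node counts, prices, power draw, electricity rate - is a hypothetical placeholder chosen for illustration, not vendor pricing or a claim from the article.

```python
# Rough cost comparison: appliance-based vs. commodity scale-out storage.
# All numbers are hypothetical placeholders for illustration only.

def cluster_cost(nodes: int, cost_per_node: float, watts_per_node: float,
                 years: int = 3, usd_per_kwh: float = 0.10) -> float:
    """Estimate hardware cost plus electricity over the planning horizon."""
    hardware = nodes * cost_per_node
    energy_kwh = nodes * watts_per_node / 1000 * 24 * 365 * years
    return hardware + energy_kwh * usd_per_kwh

# Hypothetical clusters sized for the same usable capacity.
appliance = cluster_cost(nodes=10, cost_per_node=50_000, watts_per_node=800)
commodity = cluster_cost(nodes=40, cost_per_node=6_000, watts_per_node=350)

print(f"Appliance-based estimate:     ${appliance:,.0f}")
print(f"Commodity scale-out estimate: ${commodity:,.0f}")
```

The point of the sketch is not the specific totals but the structure of the decision: once the software is decoupled from the hardware, node price and power draw become knobs the administrator can tune independently of the storage software.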

2. Flexibility
Not every data center is created equal. A telco serving one particular area will have different storage needs than a major bank with branches in several countries, and a cloud services provider will have different needs still. While appliances might be good enough for most of these needs, fully uncoupling the software from the hardware can yield substantial economies of scale.

Software-defined storage gives administrators the freedom to examine the needs of their business and to hand-select the specific components and software that best support their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company's needs.

3. Future Proofing
Budgets, network environments and corporate priorities all change in response to market demands. Having an expansive, rigid network environment locked into configurations determined by an outside vendor severely curtails the ability of the organization to react nimbly to market demands, much less anticipate them in a proactive manner.

The future in storage is here. There are clear trends pointing to ever-increasing demand for cheap storage, and if companies continue to rely on expensive, inflexible appliances in their data centers, they will be forced to lay out significant funds to build the storage capacity they need to meet customer demand.

Software-defined solutions offer an attractive alternative to companies looking to "future proof" their data centers. Since the hardware and the software are separate investments, either may be switched out to a better, more appropriate option as the market dictates, at minimal cost.

Software-Defined Storage and Globalization
Software-defined storage can also benefit companies with data centers all over the globe in novel - sometimes unexpected - ways.

Since cloud services need to be accessed from locations all over the world, service providers must offer data centers located across the globe to minimize load times. With global availability, however, come a number of challenges. Users are typically served by the data center nearest to them, which creates a problem: the data stored at every location must be kept in sync. Companies are also often required to prevent data from leaving certain countries (such as Germany) or from being stored in others (such as Iran). In addition, global data centers must be resilient to localized disasters - such as a power outage - that take a local server farm offline. Finally, if a local data center or server goes down, traffic must be rerouted quickly to available servers to minimize downtime.

While there are certainly solutions today that solve these problems, they do so at the application layer. Attempting to solve these issues that high up in the hierarchy of data center infrastructure - instead of solving them at the storage level - presents significant cost and complexity disadvantages. Solving these issues directly at the storage level can reap dividends in efficiency, time and cost savings.
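A minimal sketch of how such rules might be expressed at the storage level follows. The country codes, restriction sets and replica count are illustrative assumptions, not a description of any particular product; the idea is simply that placement policy and failover can be decided where the data lives rather than separately in each application.

```python
# Sketch of a storage-level replica-placement policy with residency rules.
# Site names, restrictions and replica counts are illustrative assumptions.

RESTRICTED_DESTINATIONS = {"IR"}   # never place replicas in these countries
PINNED_ORIGINS = {"DE"}            # data originating here must stay in-country

def place_replicas(origin_country: str, sites: dict, copies: int = 3) -> list:
    """Choose up to `copies` online sites for an object, honoring residency rules.

    `sites` maps a site's country code to whether it is currently online;
    offline sites are skipped, so traffic reroutes automatically on failure.
    """
    if origin_country in PINNED_ORIGINS:
        candidates = [c for c, up in sites.items() if up and c == origin_country]
    else:
        candidates = [c for c, up in sites.items()
                      if up and c not in RESTRICTED_DESTINATIONS]
    return candidates[:copies]

# Example: a German object stays in Germany; others spread across online sites.
sites = {"DE": True, "US": True, "SE": False, "SG": True}
print(place_replicas("DE", sites))   # ['DE']
print(place_replicas("US", sites))   # ['DE', 'US', 'SG'] (SE is offline)
```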

Conclusion
These capabilities are just the beginning. Perceiving the potential of software-defined storage approaches, many organizations are beginning to explore the next phase of data center implementation. For data center administrators facing these types of challenges, a software-defined approach to storage is worth a serious look.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010 Stefan worked in this field for Storegate, the wide-reaching Internet-based storage solution for consumer and business markets, with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture on several projects with Swedish giant Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.
