Who Will Be Storage Leaders of 2016? | @CloudExpo #Cloud #SDN #Linux

The accelerated pace of moving data to the cloud suggests that hybrid cloud and hyper-scale cloud providers will make share gains

A year ago, I wrote a two-part series on how lower-cost, higher-performance on-premise storage and nearly free cloud-based storage were driving both innovation and disruption in the storage industry. Applying Clayton Christensen's theory of innovation and disruption to the storage industry, my premise was that the flashy startups (e.g., Pure, Nimble, VMem, Tegile and Tintri) that were first to introduce credible data reduction to flash arrays were innovators but not disruptors in the space and would therefore disappoint investors; storage incumbents (e.g., HDS, EMC and IBM) that added data reduction would continue to survive; but the real disruptors would be the cloud players (e.g., Amazon, Google and Microsoft). The combination of struggling share prices, weak earnings reports, recent acquisitions, and raging cloud revenue witnessed throughout 2015 and into 2016 continues to point to Christensen's theory as the explanation for an ongoing economic transformation that will forever change the storage industry as we know it.

As we look at the future of storage in 2016 and beyond, what does it hold for the industry? Today, it's apparent that the last of the flashy startups and the remaining storage incumbents are fully engaged in a race to solve what is essentially yesterday's problem. They are building better, faster and cheaper on-premise branded arrays... that will serve a declining share of the market. Dependence upon branded solutions will prove to be the Achilles' heel of the once-innovative array companies. Meanwhile, disruption continues to be driven by cloud adoption. A recent StorageNewsletter article cited an IDC report suggesting that cloud environments accounted for roughly one-third of worldwide IT infrastructure spending in 2015 and predicting that cloud's share of enterprise IT infrastructure spending would jump from 28% to 32.5% over the next year.

The accelerated pace of moving data to the cloud suggests that hybrid cloud and hyper-scale cloud providers will make massive share gains. I believe the impact of these gains will drive the adoption of white box hardware and of open source storage, compute and networking software. Offerings that are "good enough" today will keep improving, and the software-driven cloud business model will prevail.

Hybrid Cloud Changes Everything
Is it premature to say that branded storage companies and their businesses are in a permanent state of decline and that hybrid cloud changes everything? I don't think so. Here's why:

Business Model Change
Moving data to the cloud shifts compute, networking and storage costs from CAPEX to OPEX. Storage, networking and compute are all built from unbranded commodity components, which makes performance metrics and cost of goods easy to compare. Differentiation and margins decline as a direct result of this commoditization, and marketing of branded solutions loses its power to influence decision-making. And, as interoperability between vendors improves, enterprises gain the agility to switch vendors fluidly.
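To make the CAPEX-to-OPEX shift concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (array price, usable capacity, refresh cycle, maintenance percentage, cloud rate) is a hypothetical assumption chosen for illustration, not vendor pricing.

```python
# Illustrative CAPEX vs. OPEX comparison (all figures are hypothetical assumptions,
# not vendor pricing). Amortizes an up-front array purchase and compares it with
# a pay-as-you-go cloud storage rate on a cost-per-TB-per-month basis.

def on_prem_cost_per_tb_month(purchase_price, usable_tb, years, annual_maint_pct):
    """Amortized monthly cost per usable TB for a branded on-premise array."""
    months = years * 12
    capex_per_month = purchase_price / months
    maint_per_month = purchase_price * annual_maint_pct / 12  # support contract
    return (capex_per_month + maint_per_month) / usable_tb

def cloud_cost_per_tb_month(price_per_gb_month):
    """Monthly cost per TB of cloud object storage at a flat $/GB-month rate."""
    return price_per_gb_month * 1000

if __name__ == "__main__":
    array = on_prem_cost_per_tb_month(
        purchase_price=250_000,   # hypothetical list price
        usable_tb=100,            # hypothetical usable capacity
        years=4,                  # assumed refresh cycle
        annual_maint_pct=0.15,    # assumed 15% annual maintenance
    )
    cloud = cloud_cost_per_tb_month(price_per_gb_month=0.025)  # assumed rate
    print(f"On-prem array: ${array:,.0f} per TB-month (amortized CAPEX + maintenance)")
    print(f"Cloud object storage: ${cloud:,.0f} per TB-month (pure OPEX)")
```

With these assumed inputs the comparison is lopsided, but the point of the exercise is the model itself: once both options reduce to a $/TB-month number built from commodity parts, branded differentiation has little room left to work.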

Buyer Influence Change
Data center buyers emerge and gain influence relative to storage-specific admins and buyers. As purchasing relationships change, branded vendors lose influence, which accelerates the movement from branded to commodity-based hardware. As data center buyers consolidate compute, networking and storage purchasing into one entity to drive down costs and margins, their buying power becomes more concentrated. White box vendors operate at roughly 10% margin versus the 55% margin of branded vendors, so incumbents will have a hard time ever competing in this environment.

Emergence of White Box
The hardware platform of choice becomes standards-based, commodity-class white boxes for compute, networking and storage. The proprietary reliability of branded vendors is instead delivered through commodity component redundancy; "refresh" becomes a passé term as hardware is replaced continuously to improve efficiency. Flash devices deliver high performance, negating the requirement for specialized platforms.
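As a rough illustration of reliability delivered through commodity redundancy rather than proprietary hardware, the sketch below estimates the chance of losing an object under simple n-way replication. The 4% annualized failure rate and the independence assumption are illustrative simplifications, not measured data; real systems also depend on rebuild times and correlated failures.

```python
# Back-of-the-envelope sketch of "reliability through commodity redundancy".
# Assumes independent component failures with a hypothetical annual failure rate.

def replica_loss_probability(annual_failure_rate: float, replicas: int) -> float:
    """Probability that all replicas of an object fail within a year,
    assuming failures are independent (a simplifying assumption)."""
    return annual_failure_rate ** replicas

if __name__ == "__main__":
    afr = 0.04  # assumed 4% annualized failure rate for a commodity drive
    for replicas in (1, 2, 3):
        p_loss = replica_loss_probability(afr, replicas)
        print(f"{replicas} replica(s): ~{p_loss:.6%} chance of losing an object per year")
```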

Open Source Software with Enterprise Support Is Embraced
Standardization drives the migration of value to open source software. As these offerings pass the "good enough" quality and performance bar, maintenance and support dollars move from hardware to open source-based options for software-defined compute, networking and storage. Because more mission-critical applications are migrating to the hybrid cloud, and software is disaggregated from hardware, cloud data centers opt for commercial support to deliver an enterprise level of QoS across their cloud infrastructure.

Emergence of Hyper-Converged as an Architecture Option
The hybrid cloud can leverage hyper-converged architectures supported by increasingly capable, feature-rich open source software. Distributed storage becomes a function of commodity servers, not special-purpose storage arrays.
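A minimal sketch of the idea that distributed storage can be a function of plain commodity servers: the hypothetical placement function below ranks nodes by a hash, in the spirit of (but far simpler than) placement algorithms such as Ceph's CRUSH, so any client can compute where an object's replicas live without a special-purpose array or central lookup. The node names and replica count are assumptions for illustration.

```python
# Minimal sketch of software-defined, distributed data placement across
# commodity servers using deterministic hashing (illustrative only).

import hashlib

NODES = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical commodity servers
REPLICAS = 3

def place(object_name: str, nodes=NODES, replicas=REPLICAS):
    """Deterministically pick `replicas` distinct nodes for an object by ranking
    nodes on a hash of (object, node) -- no central metadata server required."""
    ranked = sorted(
        nodes,
        key=lambda n: hashlib.sha256(f"{object_name}:{n}".encode()).hexdigest(),
    )
    return ranked[:replicas]

if __name__ == "__main__":
    for obj in ("vm-image-001", "db-backup-2016-03", "photos/archive.tar"):
        print(obj, "->", place(obj))
```

Because every node and client can evaluate the same function, adding another commodity server is a configuration change rather than an array purchase, which is exactly the property hyper-converged designs lean on.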

Conclusion: Which Cloud Players Will Ride Disruption to Leadership in 2016?
So, who will the real disruptors be in the coming months? My prediction is that Christensen's theory of innovation and disruption plays out to a "T" and that we'll see three classes of companies benefit from the secular trend and disruption of hybrid cloud adoption. If I'm right, all three will benefit at the expense of the branded incumbents.

  1. White box players and ODMs (e.g., Super Micro, Quanta, Wistron, Inventec). This class of competitor will gain share by selling directly to all hybrid clouds - both hyper-scale providers and the larger group of enterprises that build clouds. To establish leadership in this category, white box players will have to overcome the pressures of highly concentrated buyers, tight margins and codified pricing. These are big challenges, but ones their margin structures are prepared to address.
  2. Hyper-scale cloud service providers - By 2020, 50% of storage dollars will be controlled by Amazon, Microsoft, IBM and Google. These companies will leverage core competencies such as applications and analytics to subsidize free or near-free storage, which will accelerate the migration of critical apps and data to the hybrid cloud and quicken market share gains.
  3. Commercial open source software vendors (e.g., Red Hat, SUSE) - These vendors deliver worldwide support for enterprise compute, networking and storage software. Hybrid cloud data centers already run Linux at a higher rate than any other OS, and because it houses mission-critical information, operators will look for single-source support. Red Hat provides roughly 75% of paid Linux support. The halo effect of Linux adoption benefits other open source ecosystems, such as OpenStack, Ceph, KVM and containers.

The hybrid cloud is disrupting the IT infrastructure supply chain. Just as Clayton Christensen's theory of innovation and disruption predicts, new business models and products offered by hybrid players are disrupting old models, changing the rules for how data is stored, and winning the battle for storage market share. In this case, it is not just one sector that benefits, but a whole ecosystem of hardware vendors, open source support organizations and hyper-scale providers that forms a cohesive, well-capitalized, capable, and disruptive market force. In the long run, hybrid cloud disruption delivers low cost, scale and agility - all things that benefit the enterprise and the end user.

More Stories By Tom Cook

Tom Cook, Chief Executive Officer and President, is responsible for guiding the company’s strategy and vision. Cook has more than 20 years of experience leading growth-stage technology companies. Prior to joining Permabit, Cook led and completed the sale of web application developer Curl Corporation to a division of the Sumitomo Corporation. Prior to Curl, Cook was President and CEO of audio tool maker Cakewalk Software (acquired by Roland Corporation), which he led from start-up to worldwide market leadership. He has also served as a director of or advisor to more than a dozen companies and organizations.

Cook has a Master’s degree in Business Administration from the Amos Tuck School of Business Administration, Dartmouth College, and a Bachelor’s degree in Economics from Harvard University.
