SDN, SDS and Agility | @CloudExpo #BigData #SDN #DataCenter #Storage

Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project

IT planning is an imprecise science: at its best, it lets IT experts increase the flexibility and agility of their environments while cutting costs. In an ideal world, where time and budget are not limiting factors, upgrading an organization's infrastructure happens on an ongoing, as-needed basis.

In the real world, IT administrators have to make decisions about the hardware they put in place and how to maintain acceptable service levels over the course of the equipment's expected life. Most businesses do not have the luxury of replacing their storage systems the moment IT demands outpace the existing infrastructure.

Even with the best IT planning, applications continue to advance and place additional demands on storage network resources, leaving organizations with a dilemma: operate at suboptimal levels or replace the infrastructure at great cost. An advanced SDS solution is needed to handle current and future data storage demands while preserving the security and performance of applications.

Plan with Ease
Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project. Built onto the existing infrastructure, an SDS solution would combine any additional devices with the SAN and present them as a single, combined storage solution. When IT needs to add new storage media, such as flash for performance, inexpensive hard drives for capacity, or cloud storage for overflow, the new media integrates seamlessly with existing storage. All systems and applications, both new and old, remain fully available and centrally manageable.

The SDS solution would automatically maintain application Quality of Service levels by monitoring performance, optimizing capacity, and managing data placement and protection. By constantly reviewing storage activity, the software is able to adapt in real-time as demands and workloads change.

This all means IT no longer has to worry about finding the budget to replace its current storage systems wholesale; instead, it can plan and add what it needs, when it needs it.

If the desired performance, capacity, or protection policies fall out of alignment, the software automatically brings them back into compliance. For performance, data is simply and transparently migrated to faster devices or closer to the workload to reduce network latency. If a volume's capacity runs low, it can overflow to the cloud so that admins only need to buy and maintain minimal physical storage. For protection, the software can increase the number of fault domains or replicas for easy backup.
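To make the remediation idea concrete, here is a minimal sketch in Python of how such a policy check might be structured. The class names and the migrate_to_faster_tier, overflow_to_cloud, and add_replicas calls are hypothetical illustrations of the behavior described above, not the API of any particular SDS product.

```python
# Hypothetical sketch of policy-driven remediation; all names are illustrative.
from dataclasses import dataclass

@dataclass
class VolumeStatus:
    latency_ms: float         # observed average latency for the volume
    used_capacity_pct: float  # percentage of provisioned capacity in use
    replica_count: int        # current number of replicas / fault domains

@dataclass
class QosPolicy:
    max_latency_ms: float
    max_capacity_pct: float
    min_replicas: int

def remediate(volume, status: VolumeStatus, policy: QosPolicy, sds) -> None:
    """Compare observed status against the QoS policy and trigger corrective actions."""
    if status.latency_ms > policy.max_latency_ms:
        # Performance drift: move data to faster media or closer to the workload.
        sds.migrate_to_faster_tier(volume)
    if status.used_capacity_pct > policy.max_capacity_pct:
        # Capacity pressure: overflow cold data to cloud storage.
        sds.overflow_to_cloud(volume)
    if status.replica_count < policy.min_replicas:
        # Protection gap: add replicas / fault domains.
        sds.add_replicas(volume, policy.min_replicas - status.replica_count)
```

The key point is that each policy dimension (performance, capacity, protection) has its own automatic corrective action, so no administrator intervention is needed when a volume drifts out of alignment.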

With little-to-no downtime, delays, provisioning, or manual data migration, the SDS solution integrates seamlessly into the underlying infrastructure by placing storage services in front of an existing storage device or system. By building on the features and functions of the existing SAN or NAS device, the software delivers the improved performance the environment needs, making IT planning easier than ever before.

IT Management Agility
An SDS solution allows you to specify how big, how fast, and how secure a volume (or workload) should be. These choices translate to capacity, performance, and protection policies in the software dashboard.

Each volume has an associated Quality of Service (QoS) policy, which describes how it should be managed for storage allocation, data migration, and performance scaling and throttling. The policy is implemented from the point of view of the application (i.e., the access point to the storage service).

Each volume has an associated pool of eligible storage resources for data in that volume. Automation uses policy information to change the members of the pool as necessary to provide the resources that will help maintain the QoS policy for the volume, or alert if it cannot do so. Most details of a volume can be changed dynamically.
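As a rough illustration of the volume, policy, and pool model described above, the sketch below shows one way it could be represented. All structures, field names, and thresholds are assumptions made for the example, not the actual schema of an SDS product.

```python
# Illustrative model of a volume, its QoS targets, and its eligible resource pool.
from dataclasses import dataclass, field
from typing import List

@dataclass
class StorageResource:
    name: str
    media_type: str        # e.g. "flash", "hdd", "cloud"
    latency_ms: float      # typical access latency of this device
    free_gb: int           # unused capacity

@dataclass
class Volume:
    name: str
    required_gb: int       # "how big"
    max_latency_ms: float  # "how fast"
    min_replicas: int      # "how secure"
    pool: List[StorageResource] = field(default_factory=list)

def refresh_pool(volume: Volume, all_resources: List[StorageResource]) -> None:
    """Recompute which resources may back this volume under its QoS targets."""
    volume.pool = [
        r for r in all_resources
        if r.latency_ms <= volume.max_latency_ms and r.free_gb > 0
    ]
    # Alert if the eligible resources cannot meet capacity and protection targets.
    if sum(r.free_gb for r in volume.pool) < volume.required_gb * volume.min_replicas:
        print(f"ALERT: volume {volume.name} cannot meet its QoS policy")
```

The point is simply that pool membership is recomputed from the policy as resources come and go, and an alert is raised when the eligible resources cannot satisfy it.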

A powerful SDS solution allows users to manage diverse storage systems and resources with one dashboard, reducing the IT knowledge required to make effective use of multiple types of storage devices. Through intelligent automation, the software eliminates manual data migration efforts by identifying, profiling, and utilizing new storage resources across the enterprise. The software achieves near-zero downtime through automation, reduced complexity, and data protection features.

Without manual intervention, data moves among storage resources to maintain QoS levels, adapting in real-time as demands and workloads change. The software becomes aware of new resources and automatically moves appropriate application data to them, continuously monitoring requests, analyzing priority based on performance, latency or bandwidth, and physically moving data to the most appropriate media. To speed up access to business-critical information, the software automatically moves data from slower storage hardware within the existing storage solution to faster-access media volumes.
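The tiering behavior described here can be sketched as a simple "promote the hottest data to the fastest tier with room" loop. The extent/tier representation and the hotness threshold below are hypothetical, chosen only to illustrate the idea, not taken from any specific product.

```python
# Hypothetical tiering planner: rank media by latency and promote hot extents.
from typing import Dict, List, Optional, Tuple

def pick_target_tier(tiers: List[dict], needed_gb: int) -> Optional[dict]:
    """Return the fastest tier (lowest latency) that still has room."""
    for tier in sorted(tiers, key=lambda t: t["latency_ms"]):
        if tier["free_gb"] >= needed_gb:
            return tier
    return None  # nothing suitable; leave the data where it is

def promote_hot_data(access_counts: Dict[str, int], tiers: List[dict],
                     extent_size_gb: int = 1, threshold: int = 1000) -> List[Tuple[str, str]]:
    """Plan moves for extents whose access count exceeds a hotness threshold."""
    plan = []
    for extent, hits in sorted(access_counts.items(), key=lambda kv: -kv[1]):
        if hits < threshold:
            break  # remaining extents are cooler than the threshold
        target = pick_target_tier(tiers, extent_size_gb)
        if target is not None:
            plan.append((extent, target["name"]))
            target["free_gb"] -= extent_size_gb  # reserve space in the plan
    return plan
```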

The Next Stage of SDS
Data storage of the future will become something that companies can simply rely upon, not something that is costly, time-consuming, and dependent on specialized staff to maintain and manage. An SDS solution should provide advanced storage automation that unifies existing storage resources, centralizes storage management, simplifies the deployment of flash, and improves storage utilization, all while delivering application-specific quality-of-service levels. If your current SDS solution is not doing all of this, you need to find one that is. The future is waiting.

More Stories By Steven Lamb

Steven Lamb is the CEO & Co-Founder of ioFABRIC. He established himself as a data storage expert building server-side caching at NEVEX Virtual Technologies, and with ioFABRIC he now has a game-changing product in the data storage arena. Steven is a successful serial entrepreneur on his fifth venture, bringing broad experience in strategic positioning, management, and leadership.

Steven's first company, Border Network Technologies, became the second-largest firewall vendor worldwide. Others included INEX, Nevex Software, and most recently NEVEX Virtual Technologies, a cache-acceleration company.
