SDN, SDS and Agility | @CloudExpo #BigData #SDN #DataCenter #Storage

Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project

IT planning is an imprecise science. Its aim is to increase the flexibility and agility of IT environments while reducing costs. In an ideal world, where time and budget are not limiting factors, upgrading an organization's infrastructure would happen on an ongoing, as-needed basis.

In the real world, IT administrators have to make decisions about the hardware they put in place and how to maintain acceptable service levels over the equipment's expected life. Most businesses do not have the luxury of replacing their storage systems whenever IT demands outpace the current infrastructure.

Even with the best IT planning, applications continue to advance and place additional demands on storage network resources, leaving organizations with a dilemma: operate at suboptimal levels or replace their infrastructure at great cost. An advanced SDS solution is needed to handle current and future data storage demands while preserving the security and performance of applications.

Plan with Ease
Imagine extending the life of a storage network without a costly and time-consuming rip-and-replace project. An SDS layer built onto the existing infrastructure would combine any additional devices with the SAN and present them as a single storage solution. When IT needs to add new storage media, such as flash for performance, inexpensive hard drives for capacity, or cloud storage for overflow, the new media integrates seamlessly with existing storage. All systems and applications, both new and old, remain fully available and centrally manageable.

The SDS solution would automatically maintain application Quality of Service levels by monitoring performance, optimizing capacity, and managing data placement and protection. By continuously reviewing storage activity, the software adapts in real time as demands and workloads change.

This means IT no longer has to worry about finding the budget to replace current storage systems wholesale; instead, teams can plan and add what they need, when they need it.

If the desired performance, capacity, or protection policies fall out of alignment, the software automatically resolves the drift. For performance, data is simply and transparently migrated to faster devices or closer to the workload to reduce network latency. If a volume runs low on capacity, it can overflow to the cloud, so admins only need to buy and maintain minimal physical storage. For protection, the software can increase the number of fault domains or replicas for easy backup.
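The three remediation paths above can be pictured as a simple control loop that compares observed volume state against its policy. The class names, fields, thresholds, and action names below are hypothetical illustrations, not any vendor's actual API:

```python
# Hypothetical sketch of an SDS policy-remediation loop.
# All names and thresholds are illustrative, not a real product's API.
from dataclasses import dataclass, field

@dataclass
class Policy:
    max_latency_ms: float   # performance target
    min_free_pct: float     # capacity target
    min_replicas: int       # protection target

@dataclass
class Volume:
    policy: Policy
    observed_latency_ms: float
    free_pct: float
    replicas: int
    actions: list = field(default_factory=list)  # remediation log

def remediate(vol: Volume) -> list:
    """Compare observed state to the QoS policy and resolve any drift."""
    if vol.observed_latency_ms > vol.policy.max_latency_ms:
        # Performance drift: migrate hot data to faster media.
        vol.actions.append("migrate_to_faster_tier")
    if vol.free_pct < vol.policy.min_free_pct:
        # Capacity drift: overflow cold data to cloud storage.
        vol.actions.append("overflow_to_cloud")
    if vol.replicas < vol.policy.min_replicas:
        # Protection drift: add replicas / fault domains.
        vol.actions.append("add_replica")
    return vol.actions

vol = Volume(Policy(max_latency_ms=5, min_free_pct=20, min_replicas=2),
             observed_latency_ms=12, free_pct=15, replicas=1)
print(remediate(vol))  # all three policies have drifted on this volume
```

In a real system each appended action would trigger an actual data movement or replication job; the list here just records which remediation would fire.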

The SDS solution integrates seamlessly into the underlying infrastructure by placing storage services in front of an existing storage device or system, with little to no downtime, delay, provisioning, or manual data migration. By building on the features and functions of the existing SAN or NAS device, the software provides the improved performance the environment needs, making IT planning easier than ever.

IT Management Agility
An SDS solution allows you to specify how big, how fast, and how secure a volume (or workload) should be. These choices translate to capacity, performance, and protection policies in the software dashboard.

Each volume has an associated Quality of Service (QoS) policy, which describes how it should be managed for storage allocation, data migration, and performance scaling and throttling. The policy is implemented from the point of view of the application (i.e., the access point to the storage service).
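The "how big, how fast, how secure" knobs map naturally onto a small per-volume policy record. The field names below are illustrative assumptions, not a real dashboard schema:

```python
# Hypothetical sketch: the three knobs "how big, how fast, how secure"
# expressed as a per-volume QoS policy. Field names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class QoSPolicy:
    capacity_gb: int        # how big the volume should be
    max_latency_ms: float   # how fast: performance floor seen by the app
    min_replicas: int       # how secure: protection / fault domains

# Because the policy lives at the application's access point, the same
# backing devices can serve different volumes under different policies.
db_volume = QoSPolicy(capacity_gb=2000, max_latency_ms=1.0, min_replicas=3)
archive   = QoSPolicy(capacity_gb=50000, max_latency_ms=50.0, min_replicas=2)
```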

Each volume has an associated pool of eligible storage resources for data in that volume. Automation uses policy information to change the members of the pool as necessary to provide the resources that will help maintain the QoS policy for the volume, or alert if it cannot do so. Most details of a volume can be changed dynamically.
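The pool-membership idea can be sketched as a filter over available devices: keep only those that can meet the policy, and alert when the surviving pool cannot satisfy it. Device attributes and policy fields here are illustrative assumptions:

```python
# Hypothetical sketch of policy-driven pool selection.
# Device attributes and policy fields are illustrative only.

def eligible_pool(policy, devices):
    """Return devices that can serve a volume under its QoS policy,
    raising an alert if the pool cannot satisfy the policy."""
    pool = [d for d in devices
            if d["latency_ms"] <= policy["max_latency_ms"]
            and d["free_gb"] > 0]
    if sum(d["free_gb"] for d in pool) < policy["capacity_gb"]:
        raise RuntimeError("alert: pool cannot satisfy QoS policy")
    return pool

devices = [
    {"name": "flash-01", "latency_ms": 0.5,  "free_gb": 200},
    {"name": "hdd-01",   "latency_ms": 8.0,  "free_gb": 4000},
    {"name": "cloud-01", "latency_ms": 40.0, "free_gb": 100000},
]
policy = {"max_latency_ms": 10, "capacity_gb": 1000}
print([d["name"] for d in eligible_pool(policy, devices)])
```

Here the cloud device is excluded because its latency exceeds the policy's ceiling; as devices are added or policies change, rerunning the filter yields the updated pool.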

A powerful SDS solution allows users to manage diverse storage systems and resources with one dashboard, reducing the IT knowledge required to make effective use of multiple types of storage devices. Through intelligent automation, the software eliminates manual data migration efforts by identifying, profiling, and utilizing new storage resources across the enterprise. The software achieves near-zero downtime through automation, reduced complexity, and data protection features.

Without manual intervention, data moves among storage resources to maintain QoS levels, adapting in real time as demands and workloads change. The software becomes aware of new resources and automatically moves appropriate application data to them, continuously monitoring requests, analyzing priority based on performance, latency, or bandwidth, and physically moving data to the most appropriate media. To speed up access to business-critical information, the software automatically moves data from slower storage hardware within the existing storage solution to faster-access media volumes.
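A minimal version of this promote/demote behavior can be sketched as a tiering decision driven by access frequency. The tier names and thresholds are illustrative assumptions, not a description of any particular product:

```python
# Hypothetical sketch of access-driven tiering: frequently read data is
# promoted toward faster media, idle data demoted toward cheaper media.
# Tier names and thresholds are illustrative assumptions.

TIERS = ["cloud", "hdd", "flash"]  # slowest/cheapest -> fastest

def retier(current_tier, reads_per_hour, hot=100, cold=5):
    """Return the tier a data block should live on next."""
    idx = TIERS.index(current_tier)
    if reads_per_hour >= hot and idx < len(TIERS) - 1:
        return TIERS[idx + 1]   # promote busy data toward flash
    if reads_per_hour <= cold and idx > 0:
        return TIERS[idx - 1]   # demote idle data toward cloud
    return current_tier         # within band: leave data in place

print(retier("hdd", 250))  # busy block: promoted to flash
print(retier("hdd", 2))    # idle block: demoted to cloud
print(retier("hdd", 50))   # in-band block: stays on hdd
```

A real SDS engine would run a decision like this continuously over observed I/O statistics and then migrate data transparently, without interrupting the application.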

The Next Stage of SDS
Data storage of the future will become something that companies can simply rely upon, not something that is costly, time-consuming, and dependent on specialized staff to maintain and manage. An SDS solution should provide advanced storage automation that unifies existing storage resources, centralizes storage management, simplifies the deployment of flash, and improves storage utilization, delivering application-specific Quality of Service levels. If your current SDS solution is not doing all of this, you need to find one that is. The future is waiting.

More Stories By Steven Lamb

Steven Lamb is the CEO and Co-Founder of ioFABRIC. He established himself as a data storage expert with server-side caching at Nevex Virtual Technologies, and with ioFABRIC he now has a game-changing product in the data storage arena. Steven is a successful serial entrepreneur on his fifth venture, bringing a broad range of strategic positioning, management skills, and leadership experience.

Steven’s first company, Border Network Technologies, became the second-largest firewall vendor worldwide. Others included INEX, Nevex Software, and most recently NEVEX Virtual Technologies, a cache acceleration company.
