
SDN Journal: Blog Feed Post

Software Defined Shouldn’t Be About Infrastructure

The focus of a Software Defined strategy should be the applications not the underlying infrastructure

The term "software defined" has taken many forms in recent months, from Software Defined Datacenter (SDDC) and Software Defined Infrastructure (SDI) to component vendors adopting the tagline to advance their own agendas with Software Defined Networking (SDN) and Software Defined Storage (SDS). Ironically, the majority of vendors adopting the tagline also sell the very infrastructure product lines that a "software defined" approach aims to make irrelevant.

The emergence of the cloud showed the industry that the procurement, design and deployment of the infrastructure components of network, storage and compute were a hindrance to application delivery. The inability of infrastructure components to be quickly and successfully coordinated, or to respond automatically to application needs, has led many to question why traditional approaches to infrastructure are still being considered. In an attempt to safeguard themselves from this realisation, it's no surprise that infrastructure vendors have adopted the software defined terminology and marketed themselves accordingly, even though at the end of the day they are still selling what is quintessentially hardware.

From the networking and storage perspective, software defined is about abstracting legacy hardware from multiple vendors via virtualization so that management and configuration are done entirely in software. Instead of managing individual components vendor by vendor, these now common pools of network and storage can be quickly and easily managed through APIs with automation and orchestration tools. Ironically, this has already existed for some time, with examples such as HDS' storage virtualization arrays and Nicira's pre-VMware-takeover initiatives around OpenFlow, Open vSwitch and OpenStack. Even the vAppliance concept now taking on a "software defined" spin has been around for several years: having the data and control planes of what was a legacy hardware appliance run through a virtual version is nothing new when looked at in the context of VMware vShield Edge firewalls or NetApp's ONTAP Edge VSA.

Looking behind the marketing smokescreen of ease of management and simplification, in reality most if not all of these technologies were invested in and created to do one thing only: take market share away from competing vendors. Once all your legacy storage arrays or network switches are abstracted, and consequently managed and configured, by software provided by only one of those vendors, control and future procurement decisions lie firmly in that vendor's court. So why should we take the software defined approach seriously at all, and what should our focus be if not the infrastructure products to which "software defined" marketing seems inherently linked?
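The abstraction described above can be pictured as a thin software layer that presents one uniform interface over several vendors' hardware. Below is a minimal Python sketch of the idea; the class names, method signatures and "vendor A/B" drivers are entirely hypothetical and stand in for whatever native management API each real array exposes.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Uniform interface that hides each vendor's native API."""
    @abstractmethod
    def provision(self, name: str, size_gb: int) -> dict: ...

class VendorAArray(StorageBackend):
    def provision(self, name, size_gb):
        # In a real system this would call vendor A's management API.
        return {"backend": "vendor-a", "volume": name, "size_gb": size_gb}

class VendorBArray(StorageBackend):
    def provision(self, name, size_gb):
        # In a real system this would call vendor B's management API.
        return {"backend": "vendor-b", "volume": name, "size_gb": size_gb}

def provision_from_pool(pool: list, name: str, size_gb: int) -> dict:
    """Serve the request from the pooled capacity; the caller never
    needs to know which vendor's hardware actually fulfils it."""
    backend = pool[0]  # trivial placement policy, just for the sketch
    return backend.provision(name, size_gb)

pool = [VendorAArray(), VendorBArray()]
vol = provision_from_pool(pool, "app-data", 100)
print(vol["backend"])  # which vendor served it is an implementation detail
```

This is also where the vendor lock-in risk noted above lives: whoever supplies the abstraction layer controls the placement policy and, by extension, future procurement.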

Software defined is incredibly important and vital to IT and the businesses it supports because it should bring the focus back to what matters most: the applications, not the underlying infrastructure. A true software defined approach that treats the application as its focal point ultimately leads to infrastructure being treated as code, where the underlying hardware becomes irrelevant. Configuring all infrastructure interdependencies as code, with an understanding that they must support the application through its various environmental transitions, leads to a completely different mindset in the subsequent configuration and management of infrastructure. In this case a converged infrastructure approach, whereby infrastructure is pre-integrated, pre-tested and pre-validated from inception as a product-ready platform for applications, is most suited. Understanding what software defined really offers, beyond the hyperbole of infrastructure vendors, enables practices such as Continuous Integration, Continuous Delivery and Continuous Deployment, leading to a radical transformation in the way IT delivers value to the business.
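"Infrastructure treated as code" in practice usually means declaring a desired state keyed to the application and letting software reconcile reality against it. The Python sketch below illustrates the pattern in miniature; the state keys (`web_servers`, `db_replicas`) are invented for illustration, not any real tool's schema.

```python
# Desired state is declared once, with the application as the focal point.
desired = {
    "app": "webstore",
    "web_servers": 3,
    "db_replicas": 2,
}

# What is actually running right now.
current = {"app": "webstore", "web_servers": 1, "db_replicas": 2}

def reconcile(current: dict, desired: dict) -> list:
    """Compute the actions needed to converge the current state on the
    declared state. Re-running against an already-converged state yields
    nothing to do, which is what makes repeated automated runs safe."""
    actions = []
    for key, want in desired.items():
        have = current.get(key)
        if isinstance(want, int) and isinstance(have, int) and have != want:
            verb = "scale up" if want > have else "scale down"
            actions.append(f"{verb} {key}: {have} -> {want}")
    return actions

print(reconcile(current, desired))  # ['scale up web_servers: 1 -> 3']
print(reconcile(desired, desired))  # [] -- already converged
```

Because the declaration lives in version control alongside the application, the same definition can be replayed through development, QA and production, which is precisely what Continuous Delivery pipelines rely on.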


So if and when you face a sales pitch, a new product line or an infrastructure-savvy consultant extolling how great and wonderful "software defined" is, there are several things to note and question. Beyond the workings of the infrastructure components, how much application awareness and intelligence is there? How will this enable a DevOps approach and quicker, more reliable and repeatable code deployment that meets the changing demands of your business? How will it also mitigate risk, ensuring applications not only have their infrastructure resources automatically provisioned but also keep their code consistent from development to QA to the eventual live environment?

It is these questions and challenges that a "software defined" approach addresses, enabling significant benefits to a business. Once application code changes become reliable, automated and consequently frequent, on an infrastructure that meets the changing demands of its applications, a business can quickly gain a competitive edge over its rivals. Responding quickly to market trends, such as ensuring your website can handle a sudden upsurge of transactions from its mobile version, or countering a sudden commodity price change, is key to gaining a competitive advantage, and it requires an application delivery process that keeps pace. A "software defined" approach can help businesses reach that goal by automating the time-consuming, error-prone manual processes associated with IT, as long as they don't lose focus that it's about the applications and not just the infrastructure that supports them.
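Catering for a sudden upsurge of transactions comes down to automation that sizes capacity from observed demand rather than waiting for a human. Below is a minimal, hypothetical sketch of such a sizing rule in Python; the numbers and parameter names are illustrative assumptions, not any real platform's autoscaling policy.

```python
import math

def replicas_needed(requests_per_sec: float, capacity_per_replica: float,
                    min_replicas: int = 2, max_replicas: int = 20) -> int:
    """How many application replicas the current traffic demands,
    clamped between a floor (for availability) and a ceiling (for cost)."""
    needed = math.ceil(requests_per_sec / capacity_per_replica)
    return max(min_replicas, min(max_replicas, needed))

# Normal weekday traffic vs. a sudden mobile-driven spike,
# assuming each replica comfortably serves 250 requests/sec.
print(replicas_needed(400, 250))    # 2  -- held at the availability floor
print(replicas_needed(5200, 250))   # 20 -- 20.8 needed, clamped to the ceiling
```

The point of the sketch is the shape of the loop, not the arithmetic: demand is measured, capacity is recomputed, and the infrastructure follows the application, which is the inversion of priorities this article argues for.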

More Stories By Archie Hendryx

SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
