SDN Journal: Blog Feed Post

Software Defined Shouldn’t Be About Infrastructure

The focus of a Software Defined strategy should be the applications, not the underlying infrastructure

The term "software defined" has taken many forms in recent months, from the Software Defined Datacenter (SDDC) and Software Defined Infrastructure (SDI) to component vendors adopting the tagline to advance their own agendas with Software Defined Networking (SDN) and Software Defined Storage (SDS). Yet ironically, the majority of vendors adopting the tagline are also selling the very infrastructure product lines that a "software defined" approach aims to make irrelevant.

The emergence of the cloud made clear to the industry that the procurement, design and deployment of the infrastructure components of network, storage and compute were a hindrance to application delivery. The inability of infrastructure components to be quickly and successfully coordinated, or to respond automatically to application needs, has led many to question why traditional approaches to infrastructure are still being considered. In an attempt to safeguard themselves from this realisation, it's no surprise that infrastructure vendors have adopted the software defined terminology and marketed themselves accordingly, even though at the end of the day they are still selling what is quintessentially hardware.

From the networking and storage perspective, software defined is about abstracting legacy hardware from multiple vendors via virtualization so that management and configuration are done entirely in software. Instead of managing individual components vendor by vendor, these now common pools of network and storage can be quickly and easily managed via APIs with automation and orchestration tools. Ironically, this has already existed for some time, with examples being HDS' storage virtualization arrays and Nicira's pre-VMware-takeover initiatives with OpenFlow, Open vSwitch and OpenStack. Even the vAppliance concept now taking on a "software defined" spin has been around for several years: having the data and control planes of what was a legacy hardware appliance run through a virtual version is nothing new when viewed in the context of VMware vShield Edge firewalls or NetApp's ONTAP Edge VSA. Looking behind the marketing smokescreen of ease of management and simplification, in reality most if not all of these technologies were invested in and created to do one thing only: take market share away from competing vendors. Once all your legacy storage arrays or network switches are abstracted and consequently managed and configured by software provided by only one of those vendors, control and future procurement decisions lie firmly in that vendor's court. So why should we take the software defined approach seriously at all, and what should be our focus if not the infrastructure products to which "software defined" marketing seems inherently linked?
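The abstraction described above can be sketched in a few lines of code. This is a purely illustrative example, not any real vendor's API: the `StoragePool` class and its method names are hypothetical, and the point is only that once heterogeneous arrays are pooled behind one software layer, the consumer provisions capacity without knowing which vendor's hardware sits underneath.

```python
# Hypothetical sketch of a vendor-abstracted storage pool. The class and
# method names are illustrative only -- no real product's API is implied.

class StoragePool:
    """A capacity pool abstracted from heterogeneous backend arrays."""

    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb
        self.volumes = {}

    def provision_volume(self, vol_name, size_gb):
        # The caller never needs to know which vendor's array backs
        # the pool -- the software layer handles placement.
        if size_gb > self.capacity_gb:
            raise ValueError("insufficient capacity in pool")
        self.capacity_gb -= size_gb
        self.volumes[vol_name] = size_gb
        return vol_name

pool = StoragePool("gold-tier", capacity_gb=1000)
pool.provision_volume("app-db-01", 200)
print(pool.capacity_gb)  # 800
```

The same call works whichever array ends up hosting the volume, which is exactly why whoever supplies that abstraction layer ends up holding the procurement leverage described above.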

Software defined is incredibly important and vital to IT and the businesses it supports because it should bring the focus back to what matters most: the applications, not the underlying infrastructure. A true software defined approach that treats the application as its focal point ultimately leads to infrastructure being treated as code, where the underlying hardware becomes irrelevant. Configuring all the infrastructure interdependencies as code, with an understanding that it must support the application and the various environmental transitions it will go through, leads to a completely different mindset in the subsequent configuration and management of infrastructure. In this case a converged infrastructure approach, whereby infrastructure is pre-integrated, pre-tested and pre-validated from inception as a product-ready platform for applications, is most suited. Understanding what software defined really offers, beyond the hyperbole of infrastructure vendors, enables practices such as Continuous Integration, Continuous Delivery and Continuous Deployment, leading to a radical transformation in the way IT delivers value to the business.
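To make the "infrastructure treated as code" idea concrete, here is a minimal sketch, assuming a hypothetical application spec and environment scaling factors (none of the names correspond to a real tool's syntax). The application declares what it needs once, and the same definition is rendered for each environment it transitions through, which is what keeps development, QA and production consistent:

```python
# Illustrative "infrastructure as code" sketch with the application as
# the focal point. All names and numbers are hypothetical.

APP_SPEC = {
    "name": "orders-service",
    "needs": {
        "cpu_cores": 4,
        "memory_gb": 8,
        "storage_gb": 100,
        "network": "load-balanced",
    },
}

# Each environment scales the same spec rather than defining its own.
ENV_SCALING = {"dev": 0.25, "qa": 0.5, "prod": 1.0}

def render(app_spec, environment):
    """Produce a concrete infrastructure definition for one environment.

    Promoting the app from dev to QA to production reuses the same
    spec -- only the scaling factor changes -- so every environment
    stays consistent by construction.
    """
    factor = ENV_SCALING[environment]
    needs = app_spec["needs"]
    return {
        "app": app_spec["name"],
        "env": environment,
        "cpu_cores": max(1, int(needs["cpu_cores"] * factor)),
        "memory_gb": max(1, int(needs["memory_gb"] * factor)),
        "storage_gb": int(needs["storage_gb"] * factor),
        "network": needs["network"],
    }

for env in ("dev", "qa", "prod"):
    print(render(APP_SPEC, env))
```

Because the definition is versionable code rather than a per-environment manual build, it can sit in the same pipeline as the application itself, which is precisely what enables the continuous integration and delivery practices mentioned above.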

So if and when you do face a sales pitch, a new product line or an infrastructure-savvy consultant extolling how great and wonderful "software defined" is, there are several things to note and question. Beyond the workings of the infrastructure components, how much application awareness and intelligence is there? How will this enable a DevOps approach and quicker, more reliable and repeatable code deployment that meets the changing demands of your business? How will it also mitigate risk and ensure that applications not only have their infrastructure resources automated and met, but also maintain consistency in code from development to QA to the eventual live environment?

It is these questions and challenges that a "software defined" approach addresses and solves, delivering significant benefits to a business. Once application code changes become reliable, automated and consequently frequent, built on an infrastructure that meets the changing demands of its applications, a business can quickly gain a competitive edge over its rivals. Being able to respond quickly to market trends, such as ensuring your website can cater for a sudden upsurge of transactions from its mobile version, or countering a sudden commodity price change, is key to gaining a competitive advantage, and consequently requires an application delivery process that responds to those needs. A "Software Defined" approach can help businesses reach that goal by automating the time-consuming, error-prone processes linked with IT, as long as they don't lose focus that it's about the applications and not just the infrastructure that supports them.

More Stories By Archie Hendryx

SAN, NAS, Backup/Recovery & Virtualisation Specialist.
