
The Disruptor That Needs Disrupting
By Greg Lord

Many people mistakenly believe that Al Gore invented the Internet; in reality, the credit belongs to Tim Berners-Lee, who invented the World Wide Web that runs on top of it. He created URIs, HTTP, HTML, and the first web browser - all critical building blocks that paved the way for the web to operate as the ubiquitous, decentralized network for sharing information that we take for granted today. In recognition of those contributions, it was recently announced that Tim Berners-Lee has been awarded the prestigious Turing Award.

Side note: The A.M. Turing Award is the heavyweight of all computer science awards, intended to recognize "an individual selected for contributions of a technical nature made to the computing community". This is an understated way of saying "people who have done something really awesome" in the realm of computer science. Anyone familiar with Alan Turing's role in breaking the Germans' Enigma code during World War II, as dramatized in the Hollywood hit The Imitation Game, can appreciate the significance of Turing's accomplishments and of the award that bears his name.

Since Berners-Lee dropped on us the awesomeness that is the web in 1989, we've become utterly reliant on this infrastructure to connect us to the services that power our personal and professional lives... social media, streaming video, email, gaming, mobile apps, eCommerce sites, travel sites, enterprise applications, and more. Blasphemy, you say? You're not "utterly reliant" on the Internet? Do you remember the last time AWS had an outage? Every time it happens, everyone freaks out. Have you ever had the Internet go down in your office? Productivity grinds to a halt - it's amazing how truly reliant we have become.

The Internet has been the biggest societal and economic disruptor of the last 25 years. It has spawned countless new companies, transformed and sometimes destroyed existing industries, and completely changed the way we learn, work, socialize, communicate, and entertain ourselves. And even though there appears to be no end in sight for the innovation that can continue to build on the Internet's shoulders, the disruptor that is the Internet needs disrupting in its own right. For all the sophistication and elegance of today's popular Internet-enabled experiences, why is user experience quality, by and large, still such an issue?!

There are myriad examples of low-quality user experiences we can all relate to: broken image cropping and resizing, certificate and cookie warnings, botched mobile redirects, technology incompatibilities (e.g., Flash). Notice that I'm not talking about intentionally suboptimal user experiences - those riddled with pop-ups, clickbait content, auto-playing videos, and single articles split across multiple pages - but rather experiences designed by well-intentioned teams that fall short in delivery due to a lack of sufficient quality assurance throughout development, deployment, and ongoing operations. Testing might not be the sexiest role out there, but it can be a game changer in ensuring a high-quality user experience that delivers on the promise of the given digital experience.
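To make the QA angle concrete, here is a minimal sketch of how two of those failure modes - certificate warnings and broken mobile redirects - can be caught by automated checks before a user ever sees them. The target URL, mobile host, and test names are hypothetical placeholders for illustration, not from any specific product:

```python
# Hypothetical smoke tests (pytest style) for two common UX failures:
# an invalid TLS certificate and a broken mobile redirect.
import requests

BASE_URL = "https://www.example.com"  # placeholder site under test

def test_certificate_is_valid():
    # requests verifies TLS certificates by default, so a misconfigured
    # or expired certificate surfaces here as requests.exceptions.SSLError
    # instead of as a browser warning in front of a user.
    response = requests.get(BASE_URL, timeout=10)
    assert response.status_code == 200

def test_mobile_redirect_resolves():
    # Fetch the page with a mobile User-Agent and confirm the redirect
    # chain terminates on the (placeholder) mobile host with a 200,
    # rather than looping or dead-ending on an error page.
    mobile_ua = ("Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) "
                 "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148")
    response = requests.get(BASE_URL, headers={"User-Agent": mobile_ua},
                            timeout=10, allow_redirects=True)
    assert response.status_code == 200
    assert response.url.startswith("https://m.example.com")  # placeholder host
```

Checks like these are cheap to wire into a CI pipeline, which is exactly the kind of tooling-plus-process combination argued for below.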

So who or what will disrupt the current paradigm of low-quality Internet experiences? I assert (pun intended) that teams aspiring to deliver Internet-enabled experiences that WOW their users will need to embrace next-generation testing, which combines the right tooling with the right processes to reduce cost, speed time-to-market, and ensure the highest levels of quality across every layer of the stack and every device. So here's to Tim Berners-Lee's award-winning creation continuing to evolve and enable even greater innovation - and, while it does, here's hoping for disruption in how Internet-enabled experiences are designed and delivered, so we can improve reliability and usability and usher in the next generation of Internet experiences.
