By Mat Mathews
September 3, 2014 06:00 AM EDT
A good friend and business colleague once regaled me with his definition of a good corporate lawyer: “A good lawyer never says ‘no’; she says ‘here’s how.’” I found this an interesting and telling description – not because it conjured up creative interpretations of the law and loophole-sleuthing corporate counsels, but because it imagined a seasoned practitioner who understood the plasticity of her infrastructure (in this case, the law) and the end goals of her client, and who would therefore often find innovative solutions that yielded business advantage. Plasticity in this context means that a seemingly rigid structure, like the law, can be deformed to meet a new need. Examples range from the mundane structuring of contracts to limit the downside of risky deals to the industry-redefining methods of companies like Uber that challenge conventional practices and laws.
The law and the network – both meant to be broken?
A similar notion can be applied to networking infrastructure. It is often repeated that networking infrastructure is ‘rigid’ and ‘complex.’ Beyond being evocative marketing terms, these words signify a level of resistance to adaptation. Marketing aside, business leaders are in fact expressing that what they want or need to do cannot be done – feasibly, in a timely manner, or with an appropriate risk profile – because of infrastructure obstacles. Every time connectivity needs change (think mainframe networks to multi-protocol client/server networks to IP routers/switches to remote-access VPNs to high-density data center switches, etc.), a new set of technologies, platforms, protocols, and ultimately infrastructure is put in place. For many years this may have been acceptable, and possibly even expected. Yet for probably the past decade, the increasing pace of change in business needs and the continuous uncertainty of competitive environments have forced businesses to push harder on the aspects of their organizations that prevent rapid change – the aspects that don’t exhibit plasticity.
SDN, the movement
Enter networking infrastructure, and more specifically SDN. While the canonical definition of SDN is accepted to be something about a decoupled control plane, there is also the notion of SDN the movement. This notion of SDN carries no architectural definition; rather, it embodies a user-led reaction to this lack of plasticity in infrastructure. How are network engineers expected to say “here’s how” when their infrastructure requires generational shifts or years of standardization just to catch up to yesterday’s demands? In many ways, SDN is nothing more than the desire of users to bring to the infrastructure a level of adaptability that matches the uncertainty and change they experience in their business.
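To make that canonical “decoupled control plane” concrete, here is a toy sketch: a central controller computes forwarding state and pushes it down to otherwise dumb switches. The classes, topology, and rule format are all invented for illustration – no real controller or switch API is implied.

```python
from collections import defaultdict

class Switch:
    """A forwarding element with no routing logic of its own (data plane)."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> next hop

    def install_rule(self, dst, next_hop):
        self.flow_table[dst] = next_hop

class Controller:
    """Centralized control plane: sees the whole topology, computes
    paths, and programs each switch's flow table."""
    def __init__(self):
        self.links = defaultdict(set)  # adjacency between switch names
        self.switches = {}

    def add_switch(self, sw):
        self.switches[sw.name] = sw

    def add_link(self, a, b):
        self.links[a].add(b)
        self.links[b].add(a)

    def program_paths(self, dst):
        """BFS outward from dst; install a next-hop rule on every switch."""
        parent, frontier, seen = {}, [dst], {dst}
        while frontier:
            nxt = []
            for node in frontier:
                for nbr in self.links[node]:
                    if nbr not in seen:
                        seen.add(nbr)
                        parent[nbr] = node
                        nxt.append(nbr)
            frontier = nxt
        for name, hop in parent.items():
            self.switches[name].install_rule(dst, hop)

# A three-switch chain: s1 - s2 - s3
ctrl = Controller()
for name in ("s1", "s2", "s3"):
    ctrl.add_switch(Switch(name))
ctrl.add_link("s1", "s2")
ctrl.add_link("s2", "s3")
ctrl.program_paths("s3")
print(ctrl.switches["s1"].flow_table)  # {'s3': 's2'}
```

The point of the sketch is the separation of concerns: the switches hold only forwarding state, while all path computation lives in one place that can be changed without touching the data plane.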
Haven’t we heard this before?
Network plasticity is most likely not a new idea (either that, or it’s naïve and unachievable). Many a marketer has talked about the coming age of infrastructure that is fluid, dynamic, software-defined, change-ready, yada yada yada. Yet most of what is described by this fluid, dynamic, software-defined infrastructure relates to the shrinking, scaling, or movement of physical resources to match a desired processing need, ultimately to meet a utility cost objective. Network plasticity, however, is a more fundamental notion: connectivity needs will change ahead of generational or architectural product lifetimes, and the answer cannot be to put business needs on hold until the products catch up. What plasticity affords is a fundamental deformation of the primary design use case into one that was a priori unforeseen – a set of carefully planned escape valves that keep operators from having to say, “no, we cannot.”
Will current networks bend and snap?
Many networks conceived for the world of client-server computing are being tested and stretched to meet the needs of highly distributed, edge-processing, no-central-data-store, scale-out applications. While the industry attempts to move the architectural needle forward – new encapsulations to remove the restrictions of L3 boundaries, bigger buffers to accommodate the predominance of server-to-server flows, new chipsets, new interface technologies, better handling of Ethernet storage traffic, etc. – it is still not addressing the fundamental desire to prevent the need for this catch-up game in the first place. At some point, applications will decide for themselves how they would like their various components to be connected, and will be able to express policy, SLAs, risk profiles, and other constraints and objectives that ultimately translate into a set of network behaviors and topologies. Will the underlying infrastructure be capable of handling the resulting permutations of requirements without deriving an exhaustive yet limited set of supported behaviors? Will it be able to grow with the increasingly sophisticated demands of these applications to achieve what may previously have been thought unfeasible? This largely depends on how we as an industry approach plasticity as an inherent infrastructure trait – perhaps the only one that really matters anymore.
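One plausible shape for that application-expressed intent is a declarative spec that the infrastructure compiles into concrete behavior. Everything below – the intent schema, the policy knobs, the action names – is hypothetical, purely to illustrate the translation step the paragraph describes.

```python
# Hypothetical intent: an application declares what it needs from the
# network, without saying how the network should provide it.
intent = {
    "app": "orders-service",
    "components": ["web", "api", "db"],
    "flows": [
        {"src": "web", "dst": "api", "max_latency_ms": 10},
        {"src": "api", "dst": "db", "max_latency_ms": 2, "encrypted": True},
    ],
    "isolation": "tenant",  # invented policy knob
}

def compile_intent(intent):
    """Translate the declarative intent into a list of concrete
    per-flow requirements that a controller could act on."""
    rules = []
    for flow in intent["flows"]:
        rule = {
            "match": (flow["src"], flow["dst"]),
            "sla": {"latency_ms": flow["max_latency_ms"]},
            "actions": ["forward"],
        }
        if flow.get("encrypted"):
            rule["actions"].append("encrypt")  # placeholder action
        if intent.get("isolation") == "tenant":
            # Isolate the app's traffic in its own routing context.
            rule["actions"].append("vrf:" + intent["app"])
        rules.append(rule)
    return rules

for rule in compile_intent(intent):
    print(rule["match"], rule["sla"], rule["actions"])
```

A plastic infrastructure, in these terms, is one whose compile step can absorb intent combinations its designers never enumerated, rather than rejecting anything outside a fixed menu of supported behaviors.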
[Today’s fun fact: GE's Living Environment Concept House, aka the "Plastic House," in Pittsfield, MA, was built using 45,000 lbs of various plastics throughout much of the construction, including the roof, windows, siding, plumbing, foundation, electrical, and mechanical systems. I'm guessing it's not BPA-free.]