|By Michael Bushong||
|April 24, 2014 11:30 AM EDT||
From a cost perspective, the networking dialogue is dominated by CapEx. Acquisition costs for new networking gear have historically been tied to hardware, and despite the relatively recent rise of bare metal switching, networking hardware remains a high-stakes business. But SDN is changing this dynamic in potentially significant ways.
The first point to clarify when talking about CapEx is that CapEx does not necessarily mean hardware (at least not in the way most people use the term). While CapEx has a strict financial definition, in the networking industry it has become shorthand for procurement costs. Because networking solutions have been predominantly monetized through hardware, we associate procurement costs with hardware, but this is changing.
The fact that the 'S' in SDN stands for software is reason enough for people to look beyond the chassis. But the reality is that while vendors have monetized the hardware, the value has been increasingly moving to the software side for more than a decade. So long as everyone was selling hardware, it didn't really matter that much whether the cost was tied to the hardware or the software, so collectively we have been a bit lazy about determining a deliberate pricing mix.
More recently, however, there have been additional solutions that are offered entirely through software. With virtual networking devices, for example, there is no physical hardware (unless you count the servers and the network that connects the servers). A common sales tactic for these types of solutions is to point out how expensive physical solutions are. Why pay for all that sheet metal when you can get the same functionality in a virtual form factor? Of course, you are not really paying for the sheet metal; your check also pays for the software and all the features that go into that sheet metal. But the argument is pretty compelling.
The point here is that the only thing that really matters is how much you pay for the whole solution. Whether the price is affixed to hardware or software is an accounting detail – important for some people, but not really the most important thing for the majority of buyers. Rather than calling it CapEx, we ought to be referring more broadly to procurement or acquisition costs. All in, Solution A costs X dollars to bring in house, and Solution B costs Y dollars.
This would certainly simplify the conversation some. But even then, it isn’t all about procurement costs anymore either.
Depending on the solution, the procurement costs account for roughly one-third of the total cost of ownership. The remaining two-thirds of the cost is ongoing operating expense (power, cooling, space, management, support, and so on). The models here for most solutions start to get pretty squishy. While we can fairly formulaically determine things like power, space, and support, when it comes to estimating the cost of managing a device, the models are so dependent on uncontrollable things that they border on useless. And even when the models are sound, most companies have not sufficiently instrumented their network operations to really know what they are spending.
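The one-third/two-thirds split described above can be sketched as a simple model. The dollar figures below are hypothetical placeholders chosen to illustrate the ratio, not any vendor's actual pricing:

```python
# Illustrative TCO sketch: procurement as roughly one-third of total cost,
# ongoing operating expense (power, cooling, space, management, support)
# as the remaining two-thirds. All figures are invented for illustration.

def total_cost_of_ownership(procurement, annual_opex, years):
    """All-in cost: one-time acquisition plus recurring operating expense."""
    return procurement + annual_opex * years

# Example: a $300k acquisition with $120k/year of operating expense
# over a five-year service life.
procurement = 300_000
annual_opex = 120_000
tco = total_cost_of_ownership(procurement, annual_opex, years=5)

print(tco)                 # 900000
print(procurement / tco)   # ~0.33 -- procurement is one-third of the total
```

The squishy part, of course, is the `annual_opex` input: power and space are formulaic, but the management component is exactly the number most companies have not instrumented their operations well enough to know.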
But just because it is difficult to model OpEx does not mean that network teams should ignore it.
If there is one thing that the gaming industry has taught us, it is that there are all kinds of creative ways to separate someone from their money. In the early days of video games, 100% of the cost was procurement cost. After you bought the install media, you had paid everything you were ever going to pay. Before long, some of the more popular games figured out that they could lower initial costs (make the barrier to entry lower) and then charge for ongoing use through subscriptions.
As the networking world adjusts the pricing mix – associating more of the cost with the software – we should expect that charge models will mirror what we have seen on the consumer side. It is not a big stretch (and in fact already happening) to see massive up-front hardware costs replaced with more palatable hardware pricing combined with either higher software or potentially support costs. This has the dual benefit of making it easier for customers to select a vendor, and creating annuities for said vendor.
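The shift from up-front pricing to annuities is easy to see in a cumulative-cost comparison. This sketch uses invented numbers for two hypothetical charge models for the same solution:

```python
# Hypothetical comparison of two charge models: a large up-front price
# with modest support fees vs. cheaper hardware plus a recurring
# software/support subscription. All numbers are invented.

def cumulative_cost(upfront, annual, years):
    """Running total of what the customer has paid through each year."""
    return [upfront + annual * y for y in range(1, years + 1)]

big_upfront = cumulative_cost(upfront=500_000, annual=50_000, years=5)
subscription = cumulative_cost(upfront=200_000, annual=120_000, years=5)

for year, (a, b) in enumerate(zip(big_upfront, subscription), start=1):
    print(f"year {year}: up-front model {a:>7}, subscription model {b:>7}")

# The subscription model looks far more palatable at purchase time,
# but by year 5 it costs more in total -- the vendor's annuity.
```

The lower barrier to entry is what makes it "easier for customers to select a vendor"; the crossover later in the lifecycle is the annuity.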
But the evolution of game pricing models did not end with subscriptions.
For anyone who has gotten sucked into the hell that is Candy Crush, you are already well aware of in-app purchases. The initial game is free, but if you want to get a special advantage or unlock a level, you can make an in-app purchase. They have cleverly priced the in-app purchases to feel like you are hardly spending anything. "It's less than a dollar. I should just go ahead and get that spotted donut thingy!" Of course, by the time you add up all those "just a dollar" moments, you end up paying far more than you ever would have up front.
The magic of this type of pricing is that most of this is not really known up front. When you first get Candy Crush, you don’t really think you are going to buy the special extras. And Candy Crush doesn’t tell you that the levels get progressively harder to the point that they are nigh impossible without a little extra help.
Before you write this off as not applicable to networking, consider a few points.
First, despite the huge open source push, there are still a lot of companies pursuing commercial-grade versions of otherwise free software. Sure, you might buy into the open source controller, but if you need the networking version of the spotted donut thingy, what do you do? This is essentially the networking equivalent of the in-app purchase. Call it the in-arch purchase. Once you buy into a particular architecture, the switching costs are prohibitively high. If you have to pay more for the commercial software, can you really say no?
Second, some of the tiered pricing models that are taking root make it more difficult to accurately model ongoing license costs. If you are not thinking about how the costs will scale with the number of ports, users, VMs, or whatever, you might find out down the road that your solution is contributing more ongoing costs than anticipated. For example, buying one VM from Amazon might seem easy enough, but what if you need thousands? It doesn’t stay cheap forever.
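A minimal sketch of why tiered per-unit licensing is hard to model up front. The tier boundaries and prices here are hypothetical, not any vendor's actual schedule:

```python
# Sketch of tiered per-unit licensing: cheap at pilot scale, much less
# cheap at production scale. Tiers and prices are hypothetical.

TIERS = [              # (units up to this cap, annual price per unit)
    (100, 50),
    (1_000, 40),
    (float("inf"), 30),
]

def annual_license_cost(units):
    """Sum the cost across tiers for a count of ports, users, or VMs."""
    cost, prev_cap = 0, 0
    for cap, price in TIERS:
        in_tier = max(0, min(units, cap) - prev_cap)
        cost += in_tier * price
        prev_cap = cap
        if units <= cap:
            break
    return cost

print(annual_license_cost(10))     # 500    -- the pilot looks cheap
print(annual_license_cost(5_000))  # 161000 -- thousands of units do not
```

The per-unit price even falls at higher tiers, yet the total still grows faster than a pilot-scale evaluation would suggest, which is exactly the modeling trap.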
Maybe the in-arch costs are just extra features or capabilities. Or ongoing support and services. Whatever the source, these types of costs contribute to the ongoing operating expenses. And because the primary purchasing criterion is CapEx (procurement costs), burying some of these costs a little later in the product lifecycle and making them a bit smaller in magnitude (but larger in volume) will be attractive.
The punch line here is that we are on the cusp of a change in monetization strategies. You might think that pricing and costs will be transparent, but has the networking community given us a real reason to believe that to date? If you think so, consider this: why do buyers celebrate 50% discounts? It’s because pricing is ridiculously obfuscated in this industry. Until we all start expecting more, I just don’t know why this would change.
Along those lines, my colleague Bill Koss posted some facts about Plexxi costs. In the interest of transparency, it’s worth taking a look here.
[Today’s fun fact: The wettest spot in the world is located on the island of Kauai. Mt. Waialeale consistently records rainfall at the rate of nearly 500 inches per year. That’s enough to drown seven 6-foot-tall men standing on each other’s heads.]