VCE: Driving the Velocity of Change in Cloud Computing

VCE's new specialized systems are key to de-risking mission critical application deployments

When you think of Cloud, whether Private or Public, one of the first advantages that comes to mind is speed of deployment. Every business craves the ability to simply go to a service portal, define its infrastructure requirements and immediately have a platform ready for its new application, coupled with instant service level agreements that generally center on uptime and availability. So, for example, instead of spending most of its budget on an in-house IT department and datacenter, a law firm can procure infrastructure as a service and focus instead on delivering its key applications. But while the industry's understanding of Cloud Computing and its benefits has matured, so too has the recognition that what's currently being offered still isn't good enough for mission-critical applications. The reality is that there is still a need for a more focused and refined understanding of what the service level agreements should be, and ultimately a more concerted approach to the applications themselves. So while terms such as speed, agility and flexibility remain synonymous with Cloud Computing, its success and maturity ultimately depend upon a new focal point, namely velocity.
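The self-service flow described above can be sketched as a minimal request payload. This is a hypothetical illustration, not any provider's actual API: the function name, field names and default uptime target are all assumptions. Note how the SLA the requester receives speaks only to uptime, not to how the application will perform.

```python
# Hypothetical self-service portal request: the caller states only
# infrastructure requirements, and the provider attaches an SLA that
# covers uptime/availability alone.
def build_provision_request(app_name, vcpus, memory_gb, storage_gb,
                            uptime_target=0.999):
    """Validate the requirements and return a portal-style request payload."""
    if vcpus < 1 or memory_gb < 1 or storage_gb < 1:
        raise ValueError("all resource requirements must be positive")
    return {
        "application": app_name,
        "infrastructure": {
            "vcpus": vcpus,
            "memory_gb": memory_gb,
            "storage_gb": storage_gb,
        },
        # The guarantee is uptime only -- nothing here describes how the
        # application itself will actually respond to end users.
        "sla": {"uptime_target": uptime_target},
    }

request = build_provision_request("case-mgmt", vcpus=4, memory_gb=16,
                                  storage_gb=200)
print(request["sla"])  # {'uptime_target': 0.999}
```

The point of the sketch is what is missing: nothing in the request or the returned SLA captures workload-specific or application-level requirements, which is precisely the gap the rest of the article addresses.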

Velocity differs from speed in that it measures not just how fast an object travels but also the direction in which it moves. In a Public Cloud, whether Amazon, Azure or Google, no one can dispute the speed: with a few clicks you have a ready-made server that can immediately be used for testing and development purposes. But while it may be quick to deploy, how optimised is it for your particular environment, business or application requirements? With only generic request forms, customization to a particular workload or business requirement is never achieved; optimization is sacrificed for the sake of speed. Service levels based on uptime and availability are not an adequate measure or guarantee of a successful application deployment. It would be considered ludicrous to purchase a laptop from a provider that merely guarantees it will remain powered on, even though it performs atrociously.

In the Private Cloud or traditional IT example, the speed of deployment is not as quick as in a public cloud, and the speed that does exist fails to produce the results a maturing Cloud market requires. Multiple infrastructure silos can constantly be seen hurrying around, busily firefighting and maintaining a "keeping the lights on" culture, all at rapid speed. Yet while the focus should be on the applications that need to be delivered, the quagmire of the underlying infrastructure persistently takes precedence, with IT admins constantly dealing with interoperability issues, firmware upgrades, patches and the multiple management panes of numerous components. Moreover, service offerings such as Gold, Silver, Bronze or Platinum are more often than not centered on infrastructure metrics such as number of vCPUs, storage RAID type and memory, instead of application response times that are predictable and scalable to the end user's stipulated demands.
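The contrast between infrastructure-metric tiers and application-level service levels can be made concrete with a small sketch. The tier definition, the 95th-percentile target and the nearest-rank percentile method are illustrative assumptions, not any vendor's actual offering:

```python
import math

# A tier defined purely by infrastructure metrics (the Gold/Silver/Bronze
# pattern): it says nothing about how the application responds to users.
gold_tier = {"vcpus": 16, "memory_gb": 64, "raid": "RAID-10"}

def meets_app_slo(response_times_ms, p95_target_ms=200):
    """Application-level service check: 95th percentile of response
    times must sit under the target (nearest-rank percentile method)."""
    ordered = sorted(response_times_ms)
    idx = math.ceil(0.95 * len(ordered)) - 1  # index of the p95 sample
    return ordered[idx] <= p95_target_ms

# Mostly fast responses, but a tail latency spike breaches the SLO --
# something a vCPU/RAID tier definition would never surface.
samples = [120, 135, 150, 160, 170, 180, 185, 190, 195, 400]
print(meets_app_slo(samples))  # False
```

A Gold tier could be fully delivered while this check fails: the infrastructure metrics are met, yet the end user's experience is not, which is exactly the mismatch the article describes.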

For Cloud to embrace the concept of velocity, the consequence would be a focused and rigorous approach aimed solely at the successful deployment of applications that in turn enable the business to quickly generate revenue. Attaining that quick and focused approach would require a mentality of velocity to be adopted comprehensively by each silo of the infrastructure team, working in cohesion with the application team to deliver value to the business. It would also entail a focused methodology for application optimization, and consequently a service level that measures and targets success based on application performance rather than just uptime and availability.

While some Cloud and service providers may claim that they already work in unison with a focus on applications, behind the scenes this is rarely the case, as they too are caught in the challenge of traditional build-it-yourself IT. Indeed, it's well known that some Cloud hosting providers dupe their end users with pseudo service portals that give only the impression of an automated provisioning procedure; in reality, the portal merely populates a PDF of the requirements, which is printed out and sent to an offshore admin who then provisions the VM as quickly as possible. Additionally, it's more than likely that your Private Cloud or service provider runs a multi-tenant infrastructure with mixed workloads, sitting behind the scenes as logical pools ready to be carved up for your future requirements. While this works for the majority of workloads and SMB applications, businesses looking to place more critical and demanding applications into their Private Cloud to attain the benefits of chargeback and the like need an assurance of application response time that is almost impossible to guarantee on a mixed-workload infrastructure. As the Cloud market matures, and with it the expectations around application delivery and performance, such procedures and practices will only remain suitable for certain markets and workloads.

So for velocity to take precedence within the Private Cloud, Public Cloud or even Infrastructure as a Service model, and to fill this Cloud maturity void, infrastructure needs to be delivered with applications as its focal point. That means a pre-integrated, pre-validated, pre-installed and application-certified appliance that is standardized as a product and optimised to meet scalable demand and performance requirements. This is why the industry will soon start to see the emergence of specialized systems specifically designed and built from inception for performance optimization of specific application workloads. With applications pre-installed, certified and configured, and with the application and infrastructure vendors working in cohesion, it becomes far more feasible for Private Cloud or service providers to predict, meet and propose application-performance-based service levels. Such an approach would also be ideal for end users who simply need a critical application rolled out immediately in house with minimum fuss and risk.

While a number of such appliances or specialized systems will emerge in the market for applications such as SAP HANA or Cisco Unified Communications, the key is to ensure that they're standardized as well as optimised. This entails a converged infrastructure that rolls out as a single product and consequently has a single upgrade matrix for all of its component patches and firmware upgrades, which in turn corresponds with the application. It also encompasses a single support model that covers not only the infrastructure but also the application. This not only eliminates vendor finger-pointing and prolonged troubleshooting but also acts as an assurance that responsibility for the application's performance is paramount, regardless of the potential cause of a problem.
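The single upgrade matrix described above amounts to a validation step: every component's version must match one validated release of the whole product. The component names, version strings and release labels below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical release matrix: each release is one validated combination
# of firmware, hypervisor and application versions, upgraded together.
RELEASE_MATRIX = {
    "4.0": {"compute_fw": "2.1", "storage_fw": "7.3",
            "hypervisor": "6.0", "application": "1.8"},
    "4.5": {"compute_fw": "2.3", "storage_fw": "7.5",
            "hypervisor": "6.5", "application": "2.0"},
}

def validate_against_matrix(release, installed):
    """Return the components whose installed version deviates from the
    validated matrix, as {component: (installed, expected)}."""
    expected = RELEASE_MATRIX[release]
    return {c: (v, expected[c]) for c, v in installed.items()
            if expected.get(c) != v}

drift = validate_against_matrix("4.5", {
    "compute_fw": "2.3",
    "storage_fw": "7.3",   # storage firmware lags behind the release
    "hypervisor": "6.5",
    "application": "2.0",
})
print(drift)  # {'storage_fw': ('7.3', '7.5')}
```

The design point is that there is exactly one thing to validate and one thing to support: component-by-component interoperability checking, and the finger-pointing that goes with it, disappears because only whole validated releases exist.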

The demand for key applications to be monitored, optimised and rolled out with speed and velocity will be faced not only by service providers and Private Cloud deployments but also by internal IT departments struggling with their day-to-day firefighting. To ensure success, IT admins will need a new breed of infrastructure, or specialized systems, that lets them focus on delivering, optimizing and managing applications without needing to worry about the infrastructure that supports them. This is where the new Vblock specialized systems offered by VCE come into play. Unlike other companies with huge portfolios of products, VCE has a single focal point, namely Vblocks. By adopting the same approach of velocity that was instilled in the production of standardized Vblock models, end users can now reap the same rewards with new specialized systems that are application specific. Herein lies the key to Cloud maturity and ultimately the successful deployment of mission-critical applications.

More Stories By Archie Hendryx

SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
