The private cloud: What it is (and isn't)
By Shashi Mysore
June 17, 2014 12:00 PM EDT
The private cloud is misunderstood. At this stage, buyers can be forgiven for not having a firm idea of what the private cloud is and how it can uniquely improve their processes. Too many service providers have branded legacy technologies as "cloud-based," warping buyers' expectations about what a private cloud actually is and should do. In some cases, what developers and testers get with these cloud-washed solutions is nothing more than a heavily virtualized environment that offers little or none of the scalability, automation, self-service, and on-demand provisioning associated with cloud computing at large.
As a result, organizations may think of the private cloud as simply a slight update to their internal IT systems, a fresh coat of paint on the same aging infrastructure that they have been looking to leave behind as new dynamic applications become more central to their businesses. They want a cloud platform that can enable more agile software lifecycles, with streamlined testing and abundant resources for developers, but these misconceptions may lead them to conclude that the public cloud is the only way to achieve these goals and facilitate quicker time-to-market, as well as higher programmer productivity.
The truth is that using a private cloud not only provides a productive dev/test environment, but also adds tangible advantages over running all workloads on public infrastructure. For starters, the unpredictable opex of using public cloud services can be replaced with fixed capex, mostly for high-end servers and appliances that provide the dedicated power for running tasks consistently. In addition, dev/test teams benefit from working in a single-tenant environment in which resources are rapidly and reliably provisioned and often come from appliances with specifications superior to those of public Infrastructure-as-a-Service machines.
Moreover, the private cloud accelerates specific IT workflows while shielding these operations behind the company firewall. It provides high levels of security and control that make it ideal for safely developing and testing applications. Naturally, dev/test has been instrumental in shaping the private cloud's amenities for automation and self-service, along with the evolution of commercial solutions that leverage open source software to give teams maximum flexibility in configuring and customizing processes.
Some industry executives and observers have disputed the entire notion of the private cloud, arguing that it either has no agreed-upon definition or is already obsolete. In reality, its structures are readily discernible in the data centers and on-premises systems that depend on it for central IT, dev/test and quality assurance operations. These workflows all offer low-risk, high-reward opportunities to combine the productivity gains of cloud computing with dedicated hardware and obtain the best of both worlds.
While the private cloud has sometimes been criticized for lacking the fundamental traits of cloud computing in general - on-demand service, scalability, self-provisioning and measurement - this criticism really applies only to virtualized environments that have been wrongly labeled as clouds. These depictions should not obscure the fact that private clouds are real, vital parts of IT infrastructure.
Why Do Testing and Development in a Private Cloud?
If developing and deploying software is a strategic, everyday activity, then in most cases a private cloud makes sense. There is a common perception that controlling your IT destiny is complicated and expensive. But today, with highly evolved hardware and highly automated cloud platforms, operating your own infrastructure is becoming easier and less expensive by the day.
Development and testing is a natural fit for cloud environments because superior access to resources leads to shorter wait times during critical processes (no more never-ending provisioning) and lower costs per unit tested. While these gains are theoretically attainable through any cloud platform, the private cloud provides distinctive advantages.
Start with infrastructure. Traditional testing environments are complicated to use and costly to maintain, taking up excessive space and consuming considerable power despite sitting idle for long periods of time. Firing up testing equipment every few months often leads to difficulty and frustration as teams try to ensure the consistency of all environments. Debugging times lengthen and application delivery is delayed.
Using a private cloud simplifies matters on both the hardware and software fronts. Testing resources can be set up and decommissioned as needed, obviating the need to keep instances running indefinitely on the public cloud. Private cloud virtualization enables compute, storage and networking to be scaled in accordance with application demands, and a self-service Web portal can make this process easier. Some private cloud implementations connect to public infrastructure as needed, usually to provide additional automation, scalability or spillover capacity, a practice known as cloud bursting.
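To make the setup-and-teardown cycle concrete, here is a minimal sketch of on-demand test provisioning, assuming an OpenStack-based private cloud and the openstacksdk Python library; the cloud profile name and the image, flavor and network IDs are hypothetical placeholders rather than details from any particular deployment.

import openstack

# Connect using credentials defined under a "devtest" profile in clouds.yaml.
conn = openstack.connect(cloud="devtest")

# Spin up a short-lived test server only when a test run actually needs it.
server = conn.compute.create_server(
    name="integration-test-01",
    image_id="IMAGE_UUID",                # placeholder image ID
    flavor_id="FLAVOR_UUID",              # placeholder flavor ID
    networks=[{"uuid": "NETWORK_UUID"}],  # placeholder network ID
)
server = conn.compute.wait_for_server(server)

try:
    pass  # ... run the test suite against the fresh environment ...
finally:
    # Decommission immediately so no idle instance keeps consuming resources.
    conn.compute.delete_server(server)

Because each environment is rebuilt from the same image and flavor, consistency across test runs comes for free, which addresses the drift and debugging problems described above.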
Still, would-be adopters may recoil at the perceived high cost of the private cloud, which is sometimes misleadingly set in stark contrast to the pay-as-you-go efficiency of the public cloud. In truth, improvements in hardware and public APIs have made the private cloud more affordable and reliable than ever before. It is no longer a cousin to the expensive, difficult-to-maintain systems that organizations once abandoned for the public cloud; rather, it is an increasingly vital component of modern IT, providing the security, control and performance needed for particular tasks, while also interacting with public platforms to open up a fuller range of services for developers and testers.
Using a Private Cloud Doesn't Mean Never Using the Public Cloud
The very term "private cloud" can be confusing, since it groups together a diverse range of infrastructure and software under one umbrella and sets them in opposition to remotely hosted services. This is a false dichotomy. Not all private clouds are equally capable, nor do they all share compatibility with public clouds as part of hybrid deployments, although the latter trait is becoming more common.
Accordingly, it makes sense to assess private clouds on a case-by-case basis as tools and platforms for dev/test. Many of them can easily coexist with both legacy technologies and public cloud platforms to create an ideal environment for software development and deployment.
Public cloud providers have certainly taken note, with several of them modifying their services so that enterprise customers can, for example, feed information into big data tools from on-premises databases or other platforms. These moves speak to the ongoing demand for private clouds that not only have the most apparent amenities - security, dedicated hardware - but also offer highly efficient ways to handle workloads differently depending on their requirements and on how they evolve.
More specifically, the private cloud is one way to reduce costs while still maintaining advanced continuous integration. With the right combination of software and hardware, a private deployment enables efficient server management and automation, as well as the use of plugins that can launch worker nodes on a public cloud for extra capacity. While organizations vary in the number of connections they make between in-house systems and the public cloud, the power of the private cloud is that it is flexible enough to address a wide range of use cases by applying cloud computing principles to key processes.
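As a rough illustration of that burst pattern, the sketch below assumes AWS as the public cloud and uses the boto3 library; the queue-depth threshold, AMI ID and instance type are illustrative placeholders, and in practice a CI plugin would manage worker lifecycles automatically.

import boto3

QUEUE_DEPTH_THRESHOLD = 10  # builds waiting before bursting (illustrative)

def pending_builds() -> int:
    # Stand-in for a query against the CI server's build queue.
    return 0  # replace with a real queue-depth lookup

def burst_workers(count: int):
    # Launch transient CI workers on the public cloud for extra capacity.
    ec2 = boto3.resource("ec2")
    return ec2.create_instances(
        ImageId="ami-0123456789abcdef0",  # pre-baked CI worker image (placeholder)
        InstanceType="m5.large",
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "ci-burst-worker"}],
        }],
    )

if pending_builds() > QUEUE_DEPTH_THRESHOLD:
    burst_workers(count=5)

Tagging the burst instances makes it straightforward to find and terminate them once the queue drains, keeping public cloud spend limited to the spike itself.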
As such, the private cloud can be an important part of a hybrid deployment, with capabilities that support certain workflows in ways that the public cloud cannot. Resources can be provisioned quickly, leading to faster time to market. Dev/test can be run at low cost on the private cloud, with the workload shifted unchanged to a public cloud for production. Far from being an expensive relic or the antithesis of convenient self-service cloud computing, the private cloud, as well as the hybrid setups it may be part of, is a powerful platform for producing, testing and deploying software, and one that is often equipped with public APIs for additional versatility.
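One way to picture that unchanged hand-off, as a sketch rather than a prescribed method, is a container image built once in the private dev/test environment and pushed to a registry that production pulls from; the Docker SDK for Python is assumed here, and the image name and registry host are hypothetical.

import docker

client = docker.from_env()

# Build the application image once, on the private cloud's dev/test hosts.
image, build_logs = client.images.build(
    path=".",
    tag="registry.example.com/myapp:1.0",  # placeholder registry and tag
)

# Push the identical image to the shared registry, so the exact bits
# validated in dev/test are the bits that run in public cloud production.
client.images.push("registry.example.com/myapp", tag="1.0")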
Startups and Service Providers Are Among Many Organizations with Private Clouds
How does the private cloud look in action? Organizations across many verticals have set up private clouds to make software deployment easy and economical. The private cloud is most readily associated with handling sensitive data in heavily regulated industries such as finance and healthcare, but its use cases are actually varied and particularly relevant for programmers.
It is common for startups to go with the public cloud to get what seems like the point-and-click convenience of unlimited resources. After a while, however, many of these companies realize that they can obtain better performance on their own dedicated hardware, with the added benefits of better privacy and security compared to the public cloud. Despite its rapid evolution, the public cloud is still very opaque, and users may find themselves locked into commodity machines whose specifications cannot be adjusted. A private or hybrid cloud is a great way to get the infrastructure an organization needs to run workloads reliably and get the most out of its investments.
While cost and security typically dominate any conversation about the private cloud, organizations should realize that it actually confers many more nuanced benefits, particularly for developers. Savings and data protection are critical, but a productive, transparent and highly configurable dev/test environment sets companies that use the private cloud up for long-term success in an increasingly complex cloud landscape.