Virtualization – The Easy Way

Virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility

We all know that our computer systems are underutilized. Most new desktops come with far more storage than will ever be used (or, to be more exact, than should be used, given file servers). Processors in desktops and servers are busy only 20% of the time, and we always buy more memory than we need, just in case. We are left spending large amounts of capital on hardware that sits idle most of the time.

The solution is Virtualization - one of the latest buzzwords floating around Information Technology. It seems that we have been talking about virtualization for a while now, but what does it actually mean, and how do we get there? The key thing to know is that virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility.

As in all things, we need to be clear on our terms - what is virtualization? In practice, "virtualization" covers many different but related concepts:

  • Server Virtualization. Taking several single-purpose servers and running them all as separate "virtual" servers on the same physical server or set of physical servers
  • Network Virtualization. Virtualizing your network infrastructure so that single pipes (single pieces of copper or fiber) carry many different Virtual Local Area Networks (VLANs) and/or different types of traffic (where data networks and storage networks are combined on the same physical pipe)
  • Storage Virtualization. Virtualizing your storage through the use of storage servers. Storage is then divided up and presented to the physical servers - each of which then thinks it has its own set of disk drives
  • Desktop Virtualization. Moving desktop processing to central servers (which may or may not themselves be virtualized), with each user given a virtual desktop

For the purposes of this article, we will focus on Server Virtualization. The other types of virtualization are sufficiently complex as to require their own articles.

At a high level, server virtualization can be accomplished using some very basic steps:

  • Inventory your existing servers - how many are there? For the sake of our examples, let's assume 20 servers today.
  • Compare the peak processor usage of all 20 servers. If you have good management software, it should be able to stack all 20 servers' processing times together so that you can see where your peak usage is. For our example, let's say we determine that we need 8 CPUs at peak processing time and that the existing servers are all dual-core systems - for a total of 40 CPUs.
  • Compare the peak memory usage of all 20 servers. Again, good management software should be able to stack all 20 servers' memory usage together so that you can see where your peak usage is. For our example, let's say we determine that we need 32 GBytes at peak and that the current systems have over 80 GBytes in them today.
  • Compare the peak network usage of all 20 servers. Good management software should be able to stack all 20 servers' network usage together as well. For our example, let's say we determine that we need 1 ½ Gigabits/sec at peak network transfer (two network connections). Today we are using 20 network connections for the main traffic, plus another 20 for the management interfaces.
  • Determine how much actual disk space is being used. For our example, let's say we determine that 7 Terabytes are needed and that the systems have a total of 15 Terabytes worth of storage.
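
The "stacking" step in the list above can be sketched in a few lines of code. This is a minimal Python sketch, assuming per-server utilization samples exported from your monitoring software; the server names and figures below are made up for illustration:

```python
# Sketch: stack per-server utilization samples to find the combined peak.
# Server names and numbers are hypothetical; real data would come from
# your management/monitoring software.

from collections import defaultdict

# Hourly CPU-busy samples (in cores) keyed by (server, hour).
samples = {
    ("web01", 9): 1.2, ("web01", 14): 1.8,
    ("db01", 9): 3.5,  ("db01", 14): 2.0,
    ("mail01", 9): 0.4, ("mail01", 14): 0.9,
}

def combined_peak(samples):
    """Sum usage across all servers per hour, then return the busiest hour."""
    totals = defaultdict(float)
    for (server, hour), cores in samples.items():
        totals[hour] += cores
    peak_hour = max(totals, key=totals.get)
    return peak_hour, totals[peak_hour]

hour, cores = combined_peak(samples)
print(f"Peak at hour {hour}: {cores:.1f} cores")
```

The same stacking logic applies unchanged to memory, network, and storage samples - only the metric being summed differs.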

So we know that at peak we are using 7 Terabytes of disk space, 32 GBytes of memory, and eight processors - how does that help us? Well, since most sites achieve between 10 and 20 virtual servers per physical server, we know that we should be able to reduce our datacenter from 20 servers to two. If each server has six CPU cores and 32 GBytes of memory, and connects to a shared 10 Terabyte storage system, we meet all of our current needs and allow for reasonable growth.
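
The sizing arithmetic can be double-checked mechanically. All figures in this sketch come from the running example; nothing here is measured from a real environment:

```python
# Sanity-check of the consolidation sizing. All figures come from the
# article's 20-server example.

peak_cores = 8           # combined peak CPU demand across all workloads
peak_memory_gb = 32      # combined peak memory demand
used_storage_tb = 7      # actual disk space in use

hosts = 2                # physical servers after consolidation
cores_per_host = 6
memory_per_host_gb = 32
shared_storage_tb = 10

# The new environment must cover every peak simultaneously.
assert hosts * cores_per_host >= peak_cores            # 12 cores >= 8
assert hosts * memory_per_host_gb >= peak_memory_gb    # 64 GBytes >= 32
assert shared_storage_tb >= used_storage_tb            # 10 TBytes >= 7

cpu_headroom = hosts * cores_per_host / peak_cores - 1
print(f"CPU headroom for growth: {cpu_headroom:.0%}")  # 50%, matching the example
```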

I said before that "virtualization is all about making the most efficient usage of your infrastructure while allowing for maximum flexibility." Let's see how well we did.

  • We reduced our server count from 20 servers to two servers. This means that we are no longer trying to power 20 servers, trying to cool 20 servers, trying to cable 20 servers, etc.
  • We reduced the number of CPU cores from 40 to 12. CPUs are expensive - and we just saved a lot of money.
  • We reduced the amount of memory purchased from 80 GBytes to 64 GBytes.
  • We reduced the amount of disk space purchased from 15 Terabytes to 7 Terabytes.
  • We reduced the number of network connections from 40 (almost its own switch!) to four. I should note that most servers actually use two data connections to allow for redundancy in case of a network failure - counting those, we went from 60 connections to six.
  • We set up an environment that can accommodate a 50% increase in CPU requirements above today's usage without requiring new servers and migrating applications to new servers.
  • We set up an environment that can accommodate additional memory demand - 64 GBytes installed against a 32 GByte peak.
  • We set up an environment that can accommodate a 40% increase in storage requirements - without requiring disks to be swapped out when more space is needed.
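
The reductions above can be tallied in one short script. Figures are taken from the article's example; the network counts use the redundant-connection totals (60 down to 6):

```python
# Before/after tally of the consolidation example.
before = {"servers": 20, "cpu_cores": 40, "memory_gb": 80,
          "storage_tb": 15, "network_ports": 60}
after  = {"servers": 2, "cpu_cores": 12, "memory_gb": 64,
          "storage_tb": 7, "network_ports": 6}

for resource in before:
    pct = 1 - after[resource] / before[resource]
    print(f"{resource}: {before[resource]} -> {after[resource]} ({pct:.0%} reduction)")
```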

We were able to utilize server virtualization to reduce our operating costs, maintenance costs, and cooling costs. We were able to utilize server virtualization to create a more nimble architecture that can respond to changes in the business. We successfully created an infrastructure that was effectively utilized and gave us maximum flexibility.

Before we finish, let me leave a final message about Desktop Virtualization. Desktop Virtualization can result in significant cost savings in both operating and capital budgets. Imagine a company with 1,000 users. Best practice involves replacing desktops every three years - or 333 desktops a year. At $1,000 per desktop (adjust up or down based on your company's standard desktop), that is $333,000 in new PCs every year. At one desktop engineer per 150 desktops, that is seven desktop engineers - plus one more just to handle the upgrades. Desktop Virtualization would reduce the number of engineers to about two (an operating savings of close to $600,000/year) and reduce the capital expenditures to about $50,000. Do you know of any companies that would like to save close to $1,000,000 a year?
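
Here is a rough Python sketch of that arithmetic. The headcounts and desktop prices come from the paragraph above; the $100,000 fully loaded annual engineer cost is an assumption (implied by, but not stated in, the article's $600,000 operating-savings figure):

```python
# Back-of-the-envelope desktop virtualization savings.
# The engineer_cost figure is an assumption, not from the article.

users = 1000
refresh_years = 3
desktop_cost = 1000
desktops_per_engineer = 150
engineer_cost = 100_000          # assumed fully loaded annual cost

# Traditional model
annual_refresh = users // refresh_years * desktop_cost   # $333,000/year in PCs
engineers = -(-users // desktops_per_engineer) + 1       # ceil(1000/150)=7, +1 for upgrades

# Virtualized model (the article's estimates)
vdi_engineers = 2
vdi_capex = 50_000

savings = (annual_refresh - vdi_capex) + (engineers - vdi_engineers) * engineer_cost
print(f"Estimated annual savings: ${savings:,}")         # close to $1,000,000/year
```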

More Stories By Dean Nedelman

Dean Nedelman is Director of Professional Services at ASi Networks, a network and voice services firm in City of Industry, Calif. In this role, he supports the firm's major accounts and oversees the implementation of advanced Cisco solutions and services. He has over 25 years of experience in technical and executive-level technology roles, primarily in secure, high-performance computing environments across multiple industries and sectors. Dean's background includes security, high availability, telephony, advanced networking, wide area application services, and storage area networks.
