Cloud computing and virtualization are both technologies that were developed to maximize the use of computing resources while reducing the cost of those resources. They are also mentioned frequently in discussions of high availability and redundancy. While it is not uncommon to hear people use the terms interchangeably, they are very different approaches to the problem of maximizing the use of available resources, and those differences lead to some important considerations when choosing between the two.
Virtualization: More Servers on the Same Hardware
It used to be that if you needed more computing power for an application, you had to purchase additional hardware. Redundancy systems were based on having duplicate hardware sitting in standby mode in case something should fail. The problem was that as CPUs grew more powerful and had more than one core, a lot of computing resources were going unused. This obviously cost companies a great deal of money.
Enter virtualization. Simply stated, virtualization is a technique that allows you to run more than one server on the same hardware. Typically, one server is the host and controls access to the physical machine's resources. One or more virtual servers then run within containers provided by the host. The container is transparent to the virtual server, so its operating system does not need to be aware of the virtual environment. This allows servers to be consolidated, which reduces hardware costs. Fewer physical servers also means less power consumption, which further reduces costs.
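To make the consolidation savings concrete, here is a rough sketch with invented utilization numbers: ten lightly loaded servers are packed onto as few physical hosts as possible, with each host capped at a chosen utilization ceiling. The loads, the 80% cap, and the first-fit packing are all assumptions for illustration, not a real capacity-planning method.

```python
# Rough consolidation estimate: how many physical hosts are needed
# to run a set of lightly loaded servers? (Illustrative numbers only.)

def hosts_needed(server_loads, host_capacity):
    """First-fit packing of per-server CPU loads onto physical hosts."""
    hosts = []  # remaining capacity of each physical host
    for load in sorted(server_loads, reverse=True):
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] = free - load
                break
        else:
            hosts.append(host_capacity - load)  # open a new host
    return len(hosts)

# Ten servers, each using only 15% of a CPU, packed onto hosts
# that we allow to run at up to 80% utilization:
loads = [0.15] * 10
print(hosts_needed(loads, 0.80))  # 2 hosts instead of 10
```

Even this toy model shows why idle cores were so costly: without virtualization, those ten workloads would each have occupied a mostly idle physical box.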
Most virtualization systems allow the virtual servers to be easily moved from one physical host to another. This makes it very simple for system administrators to reconfigure the servers based on resource demand or to move a virtual server from a failing physical node.
Virtualization helps reduce complexity by reducing the number of physical hosts, but it still involves purchasing servers and software and maintaining your own infrastructure. Its greatest benefit is reducing the cost of that infrastructure by maximizing the use of the physical resources.
Cloud Computing: Measured Resources, Pay for What You Use
While virtualization may be used to provide cloud computing, cloud computing is quite different from virtualization. Cloud computing may look like virtualization because your application appears to run on a virtual server detached from any single physical host, and in that respect they are similar. The key difference is that cloud computing is a service, while virtualization is a technique applied to a physical infrastructure.
Cloud computing grew out of the concept of utility computing. Essentially, utility computing was the belief that computing resources and hardware would become a commodity to the point that companies would purchase computing resources from a central pool and pay only for the amount of CPU cycles, RAM, storage and bandwidth that they used. These resources would be metered to allow a pay-for-what-you-use model, much like you buy electricity from the electric company. This is how it became known as utility computing.
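The metered model described above can be sketched as a simple bill calculation. All the rates and usage figures below are made up for illustration; real providers each have their own pricing dimensions and rates.

```python
# Hypothetical metered bill: pay only for what you use.
# Rates and usage figures are invented for illustration.

RATES = {
    "cpu_hours":    0.10,  # $ per CPU-hour
    "ram_gb_hours": 0.01,  # $ per GB-hour of RAM
    "storage_gb":   0.05,  # $ per GB-month of storage
    "bandwidth_gb": 0.12,  # $ per GB transferred
}

def monthly_bill(usage):
    """Sum each metered resource times its rate, like a utility meter."""
    return sum(usage[resource] * RATES[resource] for resource in usage)

# One small server running all month (720 hours), 2 GB of RAM,
# 50 GB of storage, 100 GB of traffic:
usage = {"cpu_hours": 720, "ram_gb_hours": 1440,
         "storage_gb": 50, "bandwidth_gb": 100}
print(round(monthly_bill(usage), 2))  # 100.9
```

The point is the shape of the model, not the numbers: the bill scales with consumption, and an idle application costs almost nothing.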
It is common for cloud computing to be distributed across many dedicated servers. This provides redundancy, high availability and even geographic redundancy. It also makes cloud computing very flexible: adding resources to your application is as easy as drawing more electricity when you need it. Cloud computing is designed with scalability in mind.
The biggest drawback of cloud computing is that, of course, you do not control the servers. Your data is out there in the cloud, and you have to trust the provider to keep it safe. Many cloud computing services offer SLAs that promise to deliver a level of service and safety, but it is critical to read the fine print. A failure of the cloud service could result in a loss of your data.
Which One is Right for My Application?
How do you decide whether you need virtualization or cloud computing? Both can save money, but they do it in different ways, and one key consideration is when you need to save it. With virtualization, there is a great deal of upfront cost: a new application still needs servers, and you have to purchase the infrastructure for it. Virtualization means you'll spend less than you would on dedicated hardware and save money over time, but a large amount of capital is still spent early on. Cloud computing works in just the opposite fashion. Your new application may not need many resources initially, so cloud computing will likely cost very little in the beginning. However, as your application becomes popular and uses more resources, paying by the resource may become more expensive than running virtual servers on your own infrastructure.
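That upfront-versus-ongoing tradeoff can be framed as a simple break-even question: at what month does the cumulative cost of owned, virtualized infrastructure drop below the cumulative metered cloud cost? The dollar figures below are invented assumptions; plug in your own estimates.

```python
# When does owning virtualized infrastructure beat pay-as-you-go cloud?
# A toy break-even model with invented numbers.

UPFRONT = 20000.0       # servers, licenses, setup for your own hardware
OWN_MONTHLY = 500.0     # power, space, admin for the owned gear
CLOUD_MONTHLY = 1500.0  # metered cloud cost at your expected usage

def breakeven_month(upfront, own_monthly, cloud_monthly):
    """First month at which cumulative owned cost <= cumulative cloud cost."""
    month = 0
    while True:
        month += 1
        owned = upfront + own_monthly * month
        cloud = cloud_monthly * month
        if owned <= cloud:
            return month

print(breakeven_month(UPFRONT, OWN_MONTHLY, CLOUD_MONTHLY))  # 20
```

With these particular numbers the owned infrastructure pays for itself after 20 months; an application that may never grow that large, or may not survive that long, tips the decision toward the cloud.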
Another important consideration is how safe your data will be. Are you comfortable with the cloud computing vendor? In a virtualized environment, your data is on your own hardware. You know who has access to it, where it is and how it's being backed up. You also know exactly how you'll handle a disaster recovery scenario. Cloud computing, on the other hand, places that control in the hands of the vendor. While you'll likely have an SLA to fall back on, it may not be enough. Last year, Microsoft had a failure in a data center that provided cloud computing services for T-Mobile's Sidekick service. This failure resulted in the loss of customer data and a huge blow to T-Mobile's reputation. While the SLA will likely provide some monetary compensation to T-Mobile, it cannot repair T-Mobile's reputation with the customers who lost data. You'll want to consider carefully whether the SLA will cover all your bases as well.
Virtualization and cloud computing are both ways to reduce infrastructure cost by maximizing the utilization of computing resources, but they are not the same thing. Virtualization allows server consolidation by hosting many virtual servers on a single piece of hardware, whereas cloud computing is a service that delivers computing resources on a metered, pay-as-you-go model. Both have advantages, so you'll want to weigh factors like startup cost versus long-term cost, and the possible loss of control over your infrastructure, when deciding which model to use.
If you liked this article, consider enrolling in one of these related courses:
Mar 11: Cloud Computing Introduction
Mar 25: Virtualization Introduction
May 13: Cloud Computing Introduction
(Offered in classroom and online formats)