Cloud Computing vs. Virtualization

April 23rd, 2010

Cloud computing and virtualization are both technologies developed to maximize the use of computing resources while reducing the cost of those resources. Both are also mentioned frequently in discussions of high availability and redundancy. While it is not uncommon to hear people use the terms interchangeably, they are very different approaches to the problem of maximizing the use of available resources. They differ in many ways, and those differences lead to some important considerations when selecting between the two.

Virtualization: More Servers on the Same Hardware

It used to be that if you needed more computing power for an application, you had to purchase additional hardware. Redundancy schemes relied on duplicate hardware sitting in standby mode in case something failed. The problem was that as CPUs grew more powerful and gained multiple cores, a lot of computing resources went unused. This obviously cost companies a great deal of money.

Enter virtualization. Simply stated, virtualization is a technique that allows you to run more than one server on the same hardware. Typically one server is the host and controls access to the physical machine’s resources. One or more virtual servers then run within containers provided by the host. The container is transparent to the virtual server, so the guest operating system does not need to be aware of the virtual environment. This allows servers to be consolidated, which reduces hardware costs. Fewer physical servers also means less power, which further reduces cost.
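The consolidation idea can be sketched as a small packing problem: many lightly loaded servers fit onto a few physical hosts. The code below is a toy illustration only; the first-fit strategy and all the numbers are invented for the example, not taken from any real virtualization product.

```python
# Toy illustration of server consolidation: pack virtual servers onto as few
# physical hosts as possible (first-fit packing by CPU demand).
# All numbers are made up for illustration.

def consolidate(vm_cpu_demands, host_capacity):
    """Return a list of hosts, each a list of VM demands placed on it."""
    hosts = []
    for demand in vm_cpu_demands:
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)  # reuse an existing host with spare capacity
                break
        else:
            hosts.append([demand])  # nothing fits; power on a new host
    return hosts

# Ten lightly loaded workloads that once each ran on a dedicated server:
demands = [10, 15, 5, 20, 10, 25, 5, 10, 15, 10]
hosts = consolidate(demands, host_capacity=100)
print(f"{len(demands)} servers consolidated onto {len(hosts)} hosts")
# → 10 servers consolidated onto 2 hosts
```

The point of the sketch is the ratio: ten dedicated machines averaging 12.5% utilization collapse onto two well-utilized hosts, which is where the hardware and power savings come from.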

Most virtualization systems allow the virtual servers to be easily moved from one physical host to another. This makes it very simple for system administrators to reconfigure the servers based on resource demand or to move a virtual server from a failing physical node.
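The administrator's placement decision above can be sketched as a simple policy: move the virtual server to the host with the most free capacity that can still hold it. The host names and capacity figures below are hypothetical, and real virtualization managers use far more sophisticated schedulers.

```python
# Minimal sketch of choosing a migration target for a virtual server:
# pick the physical host with the most headroom that fits the VM's demand.
# Host names and numbers are hypothetical.

def pick_host(hosts, vm_demand):
    """hosts: {name: (used, capacity)} -> name of best target, or None."""
    headroom = {
        name: cap - used
        for name, (used, cap) in hosts.items()
        if cap - used >= vm_demand
    }
    if not headroom:
        return None  # no host can take the VM
    return max(headroom, key=headroom.get)

hosts = {"node-a": (80, 100), "node-b": (30, 100), "node-c": (60, 100)}
print(pick_host(hosts, vm_demand=25))  # → node-b (the least-loaded host)
```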

Virtualization helps reduce complexity by reducing the number of physical hosts, but it still involves purchasing servers and software and maintaining your own infrastructure. Its greatest benefit is reducing the cost of that infrastructure by maximizing the usage of the physical resources.

Cloud Computing: Measured Resources, Pay for What You Use

While virtualization may be used to provide cloud computing, cloud computing is quite different from virtualization. Cloud computing may look like virtualization because it appears that your application is running on a virtual server detached from any reliance on, or connection to, a single physical host. They are similar in that fashion. However, cloud computing is better described as a service, whereas virtualization is part of a physical infrastructure.

Cloud computing grew out of the concept of utility computing. Essentially, utility computing was the belief that computing resources and hardware would become a commodity to the point that companies would purchase computing resources from a central pool and pay only for the amount of CPU cycles, RAM, storage and bandwidth they used. These resources would be metered to allow a pay-for-what-you-use model, much like you buy electricity from the electric company. This is how it became known as utility computing.
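The metering model described above amounts to multiplying each measured resource by a unit price. Here is a minimal sketch; the resource names and per-unit prices are invented for illustration and do not reflect any real provider's price list.

```python
# Sketch of the "pay for what you use" model: meter each resource and
# bill against a per-unit price list. Prices are invented for illustration.

PRICES = {  # dollars per unit, hypothetical
    "cpu_hours": 0.05,
    "gb_ram_hours": 0.01,
    "gb_storage": 0.10,
    "gb_bandwidth": 0.12,
}

def monthly_bill(usage):
    """usage: {resource: units consumed} -> total charge in dollars."""
    return sum(PRICES[res] * units for res, units in usage.items())

# One small server running for a 720-hour month:
usage = {"cpu_hours": 720, "gb_ram_hours": 2880, "gb_storage": 50, "gb_bandwidth": 100}
print(f"${monthly_bill(usage):.2f}")  # → $81.80
```

Like an electricity bill, the charge is zero when the meters read zero, which is exactly what makes the model attractive for new, lightly used applications.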

It is common for cloud computing to be distributed across many dedicated servers. This provides redundancy, high availability and even geographic redundancy, and it also makes cloud computing very flexible. It is easy to add resources to your application: you simply use them, just as you use electricity when you need it. Cloud computing has been designed with scalability in mind.

The biggest drawback of cloud computing is, of course, that you do not control the servers. Your data is out there in the cloud, and you have to trust the provider to keep it safe. Many cloud computing services offer SLAs that promise a certain level of service and safety, but it is critical to read the fine print. A failure of the cloud service could result in the loss of your data.

Which One is Right for My Application?

How do you decide whether you need virtualization or cloud computing? Both can save money, but they do it in different ways. One key consideration is when you need to save the money. A new application running on your own virtualized infrastructure still needs servers, so you will have a great deal of upfront cost. Virtualization means you’ll spend less than with dedicated hardware and will save money over time, but a large amount of capital is still spent early on. Cloud computing works in just the opposite fashion. Your new application may not need many resources initially, so cloud computing will likely cost very little in the beginning. However, as your application becomes popular and uses more resources, paying by the resource may become more expensive than running virtual servers on your own infrastructure.
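The trade-off above is a pair of cost curves: a large fixed cost plus a small monthly cost for your own virtualized hardware, versus a purely monthly cost for the cloud. A back-of-the-envelope sketch can find where they cross; every dollar figure below is hypothetical.

```python
# Back-of-the-envelope comparison of the two cost curves described above:
# virtualization has a high upfront (capital) cost and low monthly cost,
# while cloud cost is all monthly. All figures are hypothetical.

def cumulative_cost(upfront, monthly, months):
    return upfront + monthly * months

def crossover_month(virt_upfront, virt_monthly, cloud_monthly, horizon=120):
    """First month at which owning hardware becomes cheaper than renting,
    or None if it never does within the horizon."""
    for m in range(1, horizon + 1):
        own = cumulative_cost(virt_upfront, virt_monthly, m)
        rent = cumulative_cost(0, cloud_monthly, m)
        if own < rent:
            return m
    return None

# e.g. $20,000 of servers plus $300/month vs $1,500/month of cloud resources:
print(crossover_month(20_000, 300, 1_500))  # → 17
```

Under these made-up numbers, the cloud is cheaper for the first sixteen months and owning the hardware wins from month seventeen onward; with real prices the crossover could land anywhere, which is why the "when do you need to save" question matters.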

Another important consideration is how safe your data will be. Are you comfortable with the cloud computing vendor? In a virtualized environment, your data is on your own hardware. You know who has access, where it is and how it’s being backed up. You also know exactly how you’ll handle a disaster recovery scenario. Cloud computing, on the other hand, places more of that control in the hands of the vendor. While you’ll likely have an SLA to fall back on, it may not be enough. Last year, Microsoft had a failure in a data center that provided cloud computing services for T-Mobile’s Sidekick service. The failure resulted in the loss of customer data and a huge blow to T-Mobile’s reputation. While the SLA will likely provide some monetary compensation to T-Mobile, it cannot repair the company’s reputation with the customers who lost data. You’ll want to consider carefully whether an SLA will cover all your bases as well.

Virtualization and cloud computing are both ways to reduce infrastructure cost by maximizing the utilization of computing resources. They are not the same thing, however. Virtualization allows server consolidation by hosting many servers on a single piece of hardware, whereas cloud computing is a service that delivers computing resources on a metered, pay-as-you-go model. Both have advantages; you’ll want to weigh factors like startup cost versus long-term cost and the possible loss of control over your infrastructure when deciding which model to use.

For more in-depth coverage of cloud computing and virtualization, check out our Intro to Cloud Computing and Intro to Virtualization training courses.


6 comments

  1. Cynthia Ho says:

    Hi, I am trying to understand server virtualization and cloud computing.

    I think both of them are standalone technologies/concepts. End users can either choose to virtualize their servers or adopt cloud computing. Virtualizing servers can be costly at the beginning because of the costly virtualization licenses. However, due to security concerns over cloud computing, end users tend to virtualize servers internally instead of going to the cloud. At the end of the day, the money spent on virtualization is still considered a capital investment, since end users need to buy licenses of their own, whereas cloud computing is considered an operating expense, since end users pay only for what they use.

    Comparing cloud computing and virtualization is like comparing a car and its engine: virtualization drives cloud computing. Hosters are the ones who provide cloud computing, so virtualization is very helpful for them. But to end users like manufacturing, BFI, etc., virtualization only helps them utilize their server resources and is not related to the adoption of cloud; in other words, end users who want to go for cloud computing need not virtualize their own servers.

    As mentioned earlier, security is still a concern for large enterprises, so the cloud option will be more applicable to small businesses that place less emphasis on security. Therefore, moving forward, server virtualization will become more popular with large enterprises and medium businesses, and cloud computing will be a good fit for small businesses that don’t want to invest capital and would rather pay for what they use, as with a utility.

    Please correct me if I am wrong. Thanks for helping me understand better.

  2. Abdulkarim says:

    Thank you so much. It is such a nice article. Actually, I was not aware of the difference between the two, cloud computing and virtualization. However, we could consider that virtualization is one of the core technologies used in cloud computing. Currently, I am somewhat confused, as there seems to be no point in big companies such as Microsoft using virtualization. In other words, can someone tell me the difference between hosting the “Hotmail” mail service on virtual servers versus real servers?

    Thank you

  3. Abdulkarim says:

    Thank you Cynthia Ho for your explanation.

  4. Venkatesh M says:

    Hey, this is a very nice article which helped me understand the difference between virtualization and cloud computing. But I still need some clarification on these environments.

    Can anyone tell me the prerequisites and validations for virtual and cloud environments?

  5. Jan Becker says:

    Virtualization can help with resource allocation. The lead virtualization server knows how many resources are used on each server and is better able to place new requests on the appropriate server.

    Question: with cloud computing, how does software licensing work? Does the cloud owner buy licenses and charge by use, or will users still have to pay for the licenses themselves?
