This will become popular

Story: Is virtualization the installation miracle cure?
Total Replies: 1
Skapare

Mar 30, 2006
12:19 PM EDT
This will become popular. Pretty soon, we'll have dozens, maybe even hundreds, of virtualization packages to choose from. Now we'll just need to make sure the software we want to run supports the virtualization package we are using. D'oh!

Seriously, as long as the list of virtualization environments is reasonably small, this can be useful. I like Xen. And User Mode Linux is also good. But one major problem is that this can also use up a lot of space. That 500 GB disk is going to be used up before you know it.

Two major classes of compatibility issues exist. One is at the hardware level. When OSes are done right, this should be fairly invisible, aside from CPU instruction set architecture differences (e.g. are we on x86 or PPC or whatever). Let's talk x86, since it's the largest base of commodity machines. There are hardware differences, but the OS should have them all abstracted behind its API (hopefully a standard one). The other class is the OS itself, counting different Linux distributions as separate OSes.

I went through a nightmare installing a major customer support application package once. I would describe it as dependency hell. It had dozens of picky version requirements for Perl modules and such. Eventually it was solved, but not without a huge set of changes to the system. If that machine had been running other major applications, those might have been broken by the OS changes. That's where virtualization comes in. But it's not so much about making a uniform hardware environment as about making multiple ... different ... OS environments.

I've always recommended that each critical service at a business be run on a different machine. Sometimes you may be forced into an emergency software upgrade, such as to cure a security issue. That might involve OS changes (the new version of the software, which hadn't been upgraded in months, now requires new library versions, etc). By having each application on a different machine, at least you won't impact the other applications when these changes have to be made in a hurry. You don't want your web server to die because it doesn't play well with the new resolver library your mail server now requires in its new, secure version.

But separate machines for everything is costly. So in the end you either spend too much, or you end up combining applications anyway. With a virtualizer like Xen, you have as many machines as you have disk space to host. There's still more cost (more disk space, etc) than not separating at all, but far less than full-blown physical separation (you might need 4 to 8 machines just to properly separate all the basic internet services and a database, even in a 1-2 person business).
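To make that concrete, here's a minimal sketch of what one of those Xen "machines" looks like as a Xen 3.x guest config file. The guest name, paths, and sizes below are made up; adjust them to your own setup:

    # /etc/xen/mailserver.cfg -- hypothetical guest definition
    kernel = "/boot/vmlinuz-2.6-xen"    # domU kernel installed on the host
    memory = 128                        # MB of RAM given to this guest
    name   = "mailserver"
    vif    = [ '' ]                     # one network interface, default settings
    disk   = [ 'file:/srv/xen/mailserver.img,sda1,w' ]   # file-backed root disk
    root   = "/dev/sda1 ro"

You'd boot it with something like "xm create mailserver.cfg". Each guest carries its own root filesystem image, which is exactly where all that disk space goes.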

Now I just wish I had the space to put Xen on. My 4 home computers have 120GB, 120GB, 30GB, and 30GB of disk space, respectively. And each is already jam-packed full. The BIOS on each won't support larger disks, and the power supplies won't support more drives. So it looks like a new machine will eventually have to arrive.
rob

Mar 30, 2006
9:24 PM EDT
Agreed!

Today I was talking with someone about setting up a new service, but I have to make sure the service won't eat all the resources of the machine (the software is known for eating resources). Putting the service on a "separate" machine solves the problem, and that's exactly what we want to do!! Thanks to (para)virtualization, I just have to get some more RAM and set up snapshots for backup purposes!! Great stuff!!
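For the snapshot part: if the guest's disk sits on an LVM logical volume, one common approach is a short-lived copy-on-write snapshot. The volume group and names below are just placeholders:

    # snapshot the guest's logical volume
    lvcreate --size 2G --snapshot --name newsvc-snap /dev/vg0/newsvc-disk
    # mount it read-only and copy the data off at leisure
    mount -o ro /dev/vg0/newsvc-snap /mnt/snap
    tar czf /backup/newsvc.tar.gz -C /mnt/snap .
    umount /mnt/snap
    lvremove -f /dev/vg0/newsvc-snap

That gives the backup a crash-consistent point-in-time copy while the guest keeps running.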
