It just depends on what is actually happening inside the VM. If you run a virtual render farm with 3D rendering cranking away inside every guest, the host CPU will get pegged. If you just fire up a VM and don’t actually have any activity running, it won’t consume significant CPU time. Pretty much the same as any software, really.
Open up something more mundane than VM software, like Microsoft Excel, and it’ll just sit there using RAM. Recalculate every cell of a giant spreadsheet with complex formulas, and it will peg the CPU.
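The idle-vs-busy difference is easy to see from any process, VM or not. Here’s a minimal Python sketch (the function names are my own, for illustration) that compares CPU time against wall-clock time for a spinning loop versus a sleeping process:

```python
import time

def busy(seconds):
    # Spin the CPU: the process is actively computing the whole time,
    # so CPU time accrued is roughly equal to wall-clock time.
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

def idle(seconds):
    # Sleep: the OS parks the process, so it accrues almost no CPU time
    # even though it stays resident in RAM the whole while.
    time.sleep(seconds)

def cpu_fraction(work, seconds=0.5):
    # Fraction of wall-clock time the process actually spent on the CPU.
    start_cpu = time.process_time()
    start_wall = time.perf_counter()
    work(seconds)
    return (time.process_time() - start_cpu) / (time.perf_counter() - start_wall)
```

A busy loop reports a fraction near 1.0 (one core pegged), while the sleeping version sits near 0 — same as a VM with nothing running inside it.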
Well, I run medical servers, and since they need high availability, every program is loaded and always running. And there are a LOT of programs. Patient charts, insurance, x-ray machines, online forms, billing, accounting, antivirus, etc etc. When a program is running, it’s loaded in RAM. 98% of the time the machine is just sitting there, waiting for a request. They don’t do much heavy lifting like video transcoding or weather modeling. Just shoving data out to the terminals. The hard drives are what really get beat up.
Well, and each VM is running a full operating system, which has to be loaded into memory. You really start burning RAM when you get into virtualization. It’s not really a server-specific trait, but server CPUs can handle orders of magnitude more memory for this reason.
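To see why the RAM goes fast, here’s a back-of-envelope sketch — every figure in it is a made-up assumption for illustration, not a real measurement:

```python
# All figures are hypothetical, chosen only to show the arithmetic.
host_ram_gb = 256        # assumed physical RAM on the host
os_overhead_gb = 2       # assumed footprint of each guest's idle OS
app_ram_gb = 6           # assumed working set of the apps in each guest

per_vm_gb = os_overhead_gb + app_ram_gb
max_vms = host_ram_gb // per_vm_gb

print(f"{max_vms} guests fit on the host")
print(f"{max_vms * os_overhead_gb} GB of that is just duplicate OS copies")
```

The point is that the per-guest OS overhead is paid again and again — on bare metal you’d pay it once, under virtualization it scales with the number of guests.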