Remember those big chunky CRT screens with an image burned into them? While that still brings a smirk to my face, it was a real hassle. Screensavers were originally introduced to avoid that phenomenon. You could ask people to turn off their screens, but we all know that never happened anyway.
Or how about the Windows XP screensaver? The Microsoft logo floating around as if on clouds. I personally liked the 3D Pipes screensaver: Snake gone menace, in three dimensions. Every time I see that screensaver, something feels nostalgic about it and I simply can't suppress a smile.
As all things change, screensavers changed for the better. Every PC nowadays ships with a decent power management plan. It saves power, improves security and extends the life of your hardware.
No questions asked, end-user machines should have this implemented. But what about your servers? And especially your virtual machines?
So what's the deal with screensavers on virtual machines? They're annoying, they serve no purpose, and they're a performance hog. Yes, that's right: they're eating up your performance.
It may seem like a small deal, but it's bigger than you think.
Screensavers on virtual machine servers eat up CPU and memory on the host. How much? As with everything: it depends. How many virtual machines do you have per host? Suppose a screensaver costs 1% per machine, and you have 15 machines running per host. That's 15% of CPU and memory lost for no reason.
The fix is easy: disable them. Unlike on end-user machines, they serve no purpose here.
That raises another question: how do you disable them, and what should you take into consideration?
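As a starting point, here is a minimal sketch of one way to do it on a single Windows guest, via the registry. This assumes no domain Group Policy is already managing these values; for a whole fleet, the cleaner option is the "Enable screen saver" policy under User Configuration > Administrative Templates > Control Panel > Personalization, set to Disabled.

```
Windows Registry Editor Version 5.00

; Disable the screensaver for the current user.
; "0" = screensaver off; note this is a per-user setting,
; so it must be applied for each account that logs on.
[HKEY_CURRENT_USER\Control Panel\Desktop]
"ScreenSaveActive"="0"
```

A per-machine policy is generally preferable to per-user registry edits on servers, since service accounts and admins logging on interactively would otherwise each need the tweak.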