“Any problem in computer science can be solved with another layer of indirection. But that usually will create another problem.” —David Wheeler or Butler Lampson, depending on whom you ask
I’ve seen a fair amount of press about virtualization over the past year or so. For servers, I think it makes some sense: it would certainly be convenient to have root access to a virtual server on my shared web host instead of a limited user account. But when people talk about virtualization on the desktop, I wonder: is this going too far? I already believe that most of the exponential increases in hardware capacity have been eaten up by abstractions created for the benefit of programmers. This is not necessarily a bad thing, as it gives us more sophisticated software. But it also means that each new computer I buy feels slower than the last one.
How many levels of abstraction are too many? When you have an interpreted programming language built with dynamic libraries running on a virtual machine on a virtual server, a simple “+” operation might go through six different transformations before it hits the processor. Is this a good thing?
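To make that concrete, here’s a small sketch of just the first of those layers, using Python purely as a stand-in for “an interpreted language” (the post doesn’t name one, so that choice is mine): even before any dynamic libraries, virtual machines, or hypervisors enter the picture, a “+” has already been translated once into interpreter bytecode.

```python
# A minimal sketch, assuming CPython as the interpreted language.
# "a + b" is first compiled to bytecode, which the interpreter then
# dispatches to its own C implementation of addition, which the C
# compiler long ago turned into actual machine instructions.
import dis

def add(a, b):
    return a + b

# Show the bytecode the interpreter actually executes for the "+".
dis.dis(add)
# Typical output (details vary by Python version):
#   LOAD_FAST   a
#   LOAD_FAST   b
#   BINARY_ADD       <- still one more hop from a real ADD instruction
#   RETURN_VALUE
```

And that’s just the top of the stack; stack a virtual machine and a virtual server underneath it, and the count only grows.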