In the last few weeks I have been notebook shopping for my dad. My budget was a meager $700, and my task was to get something near the top of the line. I finally arrived at a Dell Vostro notebook. As I finalized the order on a Vostro 1500 for $694.45, I thought about my original notebook, purchased in 2002. It cost me over $1,500 and has probably a tenth of the power of this new machine.
Back in 2002, the story was essentially the same. My old machine ran Windows XP, and its primary purpose was surfing, chatting, checking email, and doing some work-centric tasks. From a fully booted desktop (no HDD activity) I could launch my browser, and about 5 seconds later, it would be ready for me to type.
Hell, back in 1996, the story was the same. I ran Windows 95 on my home desktop, and I would launch my browser (Netscape, I believe) and it would take about 5 seconds before it was ready for use.
Here we are, a decade later (eons in the computing world), and I find myself still waiting on the machine to finish its tasks. Why?
I recently read an interview with Con Kolivas, a Linux kernel developer who had just quit. He cited as his reason the continuing decline in desktop performance. His argument was very compelling:
“Computers of today may be 1,000 times faster than they were a decade ago, yet the things that matter are slower.
The standard argument people give me in response is ‘but they do so much more these days it isn’t a fair comparison’. Well, they’re 10 times slower despite being 1,000 times faster, so they must be doing 10,000 times as many things. Clearly the 10,000 times more things they’re doing are all in the wrong place.”
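To make the arithmetic behind that quote explicit: a 1,000× gain in raw hardware speed combined with a 10× loss in perceived speed means the total work being done must have grown by the product of the two:

1,000 × 10 = 10,000

That is where his 10,000 figure comes from.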
It’s true: we have reached stagnation. Where is the innovation, and where are the increases in speed that we see in Hollywood movies? We are fighting an ugly mathematical curve. As computers become more powerful, the software running on them grows more complex, and its processes take longer to finish. We aren’t pulling ahead anymore. In fact, we are losing ground.
With more than 20 years of Windows development under its belt, you would think Microsoft would have produced a lean, mean Vista, one that boots instantly and runs anything almost instantly. Instead, the world was introduced to a monolithic nightmare that makes even the most powerful machines seem antiquated. Why is anyone choosing this direction? Is the *nix mindset of simple, highly specialized utilities dead? Ten years from now, will my browser still take 5 seconds to load?
Stop using stupid software. You continue to support it, so presumably it does what it is supposed to do. It is a complete trade-off. Yes, I could run the GIMP and have it load almost instantly, but at what cost? The loss of Adobe font support, Lab color, full CMYK color space support, and integration with other apps such as Final Cut and InDesign. All of these trade-offs push me to use a slower (in load time only) application.
I spent years working with the lower-complexity, quicker apps, and I have to say that sometimes a loss in speed is worth the complexity under the hood. Other times, as in the case of scp/sftp, the lighter, quicker app is the clear choice for me. The command line wins any day over things like Fugu.
Some areas have improved greatly, though. On the same machine, Photoshop CS2 takes 45 seconds to start, while CS3 takes around 20. Twenty seconds is not fast by any means, but it is a huge improvement.