Developers tend to like their machines as big and as fast as they can get them. And who would blame you? If you spend all day with your tools, why wouldn't you want the best tools possible?
Nobody would, but there is a problem inherent in owning the latest and greatest hardware and software when you develop mainstream applications.
The problem is that the majority of your users will be one or even two generations behind the machine that you develop on and most likely test on. For some jobs (long compiles, for instance) a fast machine (or even several of them for a distributed build) comes in really handy. But when it comes to evaluating how your software performs in real-world settings, it helps to have something on or near your desk that is several generations behind the state of the art. Use the software you produce exclusively on anemic hardware (less memory, slower disk access and, most importantly, a lower clock frequency), and if you're building web-based stuff, use a browser that is one or even two generations behind what's new.
Your end users and your support department will be most grateful to you. End users because they'll find that the software performs well on their machines (and power users will be happy too, because on their computers things will truly fly).
The support department because they'll get fewer calls from users who would otherwise have to upgrade to either modern hardware or newer software in order to run your application.
My personal ‘benchmark’ machine for the stuff we make is a small and slow netbook driving a screen of reasonable resolution. The whole setup cost a few hundred bucks. It has saved me many times from shipping some glitzy feature that would have brought the site to a screeching halt due to hidden expenses in terms of CPU or memory consumption on the client side.
On my development box I would never have noticed the impact, but on the netbook the effect is immediate.
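An actual old machine is what's being recommended here, but when one isn't within reach you can crudely approximate the effect in software. The sketch below is hypothetical (the `simulate_slow_cpu` decorator and the `render_page` function are made up for illustration): it pads a function's runtime so it behaves roughly as it would on a CPU a few times slower.

```python
import functools
import time

def simulate_slow_cpu(factor=4):
    """Crude stand-in for slower hardware: pad a function's runtime
    to roughly `factor` times what it took on this machine.
    (Hypothetical helper, not from the original post.)"""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            # Sleep out the remaining time a machine `factor`x slower
            # would have spent on the same work.
            time.sleep(elapsed * (factor - 1))
            return result
        return inner
    return wrap

@simulate_slow_cpu(factor=4)
def render_page():
    time.sleep(0.05)  # stand-in for real rendering work
    return "ok"
```

Note that this only stretches wall-clock time; it won't surface memory pressure or slow disk access, which is exactly why the actual old hardware remains the better benchmark.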
A nice example of a website that could do a lot better in this respect is twitter.com.
They’ve now forced all their users onto the ‘new’ interface, but frankly, in my opinion, it sucks. It is slow as molasses on anything but the fastest machine here running the latest release of Chrome. On all the other machines it is so slow as to be pretty much unusable. And since they no longer give you the option of using the old interface (ok, that wasn’t a speed demon either, but at least it was a lot faster than the ‘new and improved’ version), you’re now stuck with this fancy and very slow implementation.
Slower machines have good uses; keeping you sharp as a developer is one of them, and you could do a lot worse than setting up a machine that’s two generations older than ‘current’ and making your software work well on it.
This is solidly in the ‘eat your own dogfood’ category, but it goes one step further: not only should you eat your own dogfood, you should eat it with the same cutlery the other dogs use, to get your own experience to match theirs as closely as possible. If you’ve never experienced your own software on non-‘state-of-the-art’ hardware, it can be quite an eye-opener.
Hacker News discussion sparked by this post, with some excellent suggestions on the same topic: http://news.ycombinator.com/item?id=2911935