Give me a gigabyte

Michael Herf
October, 2001

Way back when I was little, my dad taught me the important little things that a ten-year-old should know all about (like, say, combinatorics and differential calculus). One of the things he taught me was the idea, "A chain is as strong as its weakest link."

Then later, in computer architecture class, we learned Amdahl's law (a more mathematical way of saying the same thing), which, for practical purposes, says you should speed up the slowest thing in a computer before you speed up the fastest.
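To put numbers on it: Amdahl's law says that if a fraction p of the total time goes to one part of the system, and you speed that part up by a factor of s, the whole thing only gets 1 / ((1 - p) + p / s) faster. A tiny sketch in Python, with made-up (but plausible) numbers:

    # Amdahl's law: overall speedup when a fraction p of the work
    # gets sped up by a factor of s.
    def amdahl_speedup(p, s):
        return 1.0 / ((1.0 - p) + p / s)

    # Rough split: 90% of the time waiting on the disk, 10% computing.
    print(amdahl_speedup(p=0.10, s=2.0))  # ~1.05x from doubling CPU speed
    print(amdahl_speedup(p=0.90, s=2.0))  # ~1.82x from doubling disk speed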

I'm writing this down after having the third conversation this week where I said, "You know, disks are really the bottleneck in a modern computer." And I do think the world would be a faster place for computing if disks were a lot faster.

But they aren't, and they don't seem to be getting that much faster. Someday we'll have a completely new technology to replace hard disks, but not that soon. I don't think we can speed up the "weakest link" right now.

So, instead, I'm going to do a little bit of judo here, contradict all those things I just said, and just say:

New computers should have at least a gigabyte of RAM.

In a computer system, some components can cover for others - it's not a totally dependent "chain." In fact, RAM can cover for a slow hard drive: it's called caching.
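Here's a caricature of the idea in Python - the 100MB budget is just a stand-in number, and a real cache would evict old entries rather than simply filling up and stopping:

    # Keep whole files in RAM after the first (slow) disk read,
    # so repeat reads never touch the disk.
    class FileCache:
        def __init__(self, budget_bytes=100 * 1024 * 1024):  # ~100MB of RAM
            self.budget = budget_bytes
            self.used = 0
            self.cache = {}  # path -> file contents

        def read(self, path):
            if path in self.cache:         # RAM hit: microseconds
                return self.cache[path]
            with open(path, "rb") as f:    # disk miss: seek + transfer, milliseconds
                data = f.read()
            if self.used + len(data) <= self.budget:
                self.cache[path] = data
                self.used += len(data)
            return data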

But, the resources must be there to do it.

So, failing faster hard drives or some new type of non-volatile storage, adding more RAM can help a lot - but only if the software can take advantage of it.

In my estimation, lots of "almost-solved" applications on a computer today should use about 100-200 MB.

For instance, say you want to instantly search your hard drive for any keyword? That takes about 100MB. Want to cache the file structure so that dealing with the filesystem is faster? About 50MB. Want to edit a few minutes of high-quality, compressed video? A few hundred MB as well.
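The keyword-search case is the easiest one to sketch. Walk the disk once, build an inverted index in RAM (word -> the files that contain it), and every query after that is a pure memory lookup. Roughly, in Python (this treats every file as plain text, which is a simplification):

    import os
    from collections import defaultdict

    def build_index(root):
        index = defaultdict(set)            # word -> set of file paths
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "r", errors="ignore") as f:
                        for word in f.read().lower().split():
                            index[word].add(path)
                except OSError:
                    continue                 # unreadable file, skip it
        return index

    def search(index, keyword):
        return index.get(keyword.lower(), set())  # no disk seeks at query time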

I really want to explain this: of the things that take a long time on a modern computer, and the things that simply aren't possible, a decent fraction are that way because the processor spends about 90% of its time waiting on other system components. There are very few tasks today that shouldn't be instantaneous.

The truth is that application writers are still making apps that use about 10MB, even though RAM is more than 10x cheaper than when those applications were written. The modern computer is a shared resource, so you simply can't write an application (unless it's Photoshop) that requires 100MB of RAM to run.

But is there any reason for this? No.

Two years ago, when I bought 128MB chips for my computer, they cost about $200 apiece. That's $1600/gigabyte. Now, two 512MB chips can be had for $24 each, or $48/gigabyte. It's simply criminal that OEMs and systems integrators are spending hundreds of dollars on the latest processor when the state of software would be radically advanced by $48 of RAM.

Today, Dell is selling a 1.6GHz P4 with 128MB of RAM. This, clearly, is enough to run Windows XP, Outlook, and Internet Explorer together, but not much else.

Isn't this dumb? A machine with more RAM would provide a better experience (not for benchmarking, but for loading applications, switching applications, etc.), even with today's software.

And the longer-term truth is that many applications would simply be better if they were engineered to use 100MB all the time.

I really believe that application writers should start asking, "What if I were running on a machine with a few GB of RAM free? How could I improve what I'm doing if that were true?"

Then write an application that, yes, uses 10MB, but given suitable conditions, can efficiently use 100MB. (Photoshop is to be commended on this point.) This is not "bloat" - it's trading memory for speed, getting around the hard-drive bottleneck, and it's the way the next generation of applications should be.
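One way to do that - and I'm assuming a helper library here, since psutil is third-party and just a convenient way to ask how much RAM is free - is to size your caches from what the machine actually has, along the lines of:

    import psutil  # third-party; any way of reading free RAM works

    def pick_cache_budget(min_bytes=10 * 1024 * 1024,    # the 10MB "polite" default
                          max_bytes=500 * 1024 * 1024,   # don't hog the whole machine
                          fraction=0.25):
        # Use a quarter of whatever is free, clamped to sane bounds.
        free = psutil.virtual_memory().available
        return max(min_bytes, min(max_bytes, int(free * fraction)))

    # e.g. cache = FileCache(budget_bytes=pick_cache_budget())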