Part of my webhosting woes of late is that, like Wordpress, my blog software (Serendipity) is horribly dependent upon massive amounts of database interaction. A few hits and suddenly your server starts leaking smoke and your webhosting provider shuts you down. This is obviously not optimal, and it was a big part of the reason I made the switch to Dreamhost's Virtual Private Server platform in the first place. Dreamhost's VPS came with its own host of problems, but needless to say, database server overloading was still somewhat of a problem there. (Actually, I maintain that it is MORE of a problem, but that's something you'll have to take up with your helpful Dreamhost Support staff -- you know, if they ever get back to you.)
Because this is such a problem on Wordpress, there are quite a number of plugins to alleviate it, each with its own pros and cons, so it takes some research to determine how best to go about caching your content. Being a lesser-used platform than Wordpress, Serendipity doesn't have nearly as many solutions for caching content and relieving the database server of some of its tireless work, so I set about trying to come up with one of my own.
While reading up on some of these plugin-based solutions on Wordpress's platform, trying to see if there was some technique I could move over to Serendipity, I stumbled across an ingenious idea:
Squid is designed to sit between your browser and the rest of the internet, efficiently caching content so that your browser doesn't have to re-download it all the time, effectively "speeding up" your internet connection. This is especially helpful when a number of people share the same internet connection: when Bob loads up the day's LOLcats, Squid caches them locally so that Steve's computer doesn't have to download them all over again. It's a pretty common setup, and very effective at what it does.
Anyway, someone far cleverer than I realized that you can use Squid the other way 'round: run it on your web host, where it intercepts every incoming request for content from your site. Squid will happily pass requests through to the web server, where content gets fetched from your database -- but if Squid knows it has a recent copy cached, it sends that back to the browser instead, never letting the request touch your webserver (and thus any databases), saving you a pretty significant amount of RAM and CPU.
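In Squid's terminology this is a reverse proxy, or "accelerator," setup: Squid listens on the public port and treats your own web server as its origin. A minimal squid.conf sketch might look something like this -- note that example.com, the 127.0.0.1 backend address, and port 8080 are placeholders for your own site and web server, not values from the guide I followed:

```
# Listen on the public port in accelerator (reverse proxy) mode
http_port 80 accel defaultsite=example.com

# The real web server (Apache, etc.), moved to a backend port
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=backend

# Only accept requests for our own site, and route them to the backend
acl our_site dstdomain example.com
http_access allow our_site
http_access deny all
cache_peer_access backend allow our_site
cache_peer_access backend deny all
```

The key directives are `http_port ... accel`, which puts Squid in accelerator mode, and `cache_peer ... originserver`, which tells it the backend is the authoritative source rather than another proxy.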
I'm here to report that it works like a charm. Granted, there are quirks -- like comments not immediately displaying -- but that's a trade-off I'm more than willing to make.
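That comment quirk comes down to cache invalidation: Squid keeps serving its stored copy until it expires, even after a new comment lands. Squid does support an HTTP PURGE method for evicting a single URL from the cache, though it's disabled by default; enabling it for requests from the local machine looks roughly like this (a sketch of my own, not something from the guide I followed -- the `purge_method` ACL name is mine):

```
# Allow cache purges, but only from the local machine
acl purge_method method PURGE
http_access allow purge_method localhost
http_access deny purge_method
```

With that in place, a request along the lines of `curl -X PURGE http://example.com/some-post` issued from the host itself should evict that entry, so the next visitor gets a fresh copy -- a hook a blog plugin could in principle call whenever a comment is posted.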
If you'd like more information, here are the instructions I followed to get this setup running on my new host: Reverse Proxying with Squid.