I've been somewhat reluctant to start using PostgreSQL for anything interesting because it's a (relative) pain to back up and upgrade. However, since I'd like to start keeping a database of my books at some point, I need to do something about this. A backup approach I'd be happy with would be to take a full dump of the database every month or so, store diffs between dumps (which should be reasonably small) every day, and automatically compress the results. A quick look around didn't turn up any ready-made tool for this, but it should be reasonably easy to knock up in Python or a shell script.
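The diff-and-compress step could be sketched in Python like this (a rough sketch, not a finished tool: the file names are placeholders, and the actual full dump would come from `pg_dump`, which isn't shown running here):

```python
import difflib
import gzip

def dump_diff(old_dump: str, new_dump: str) -> str:
    """Return a unified diff between two plain-text SQL dumps."""
    return "".join(difflib.unified_diff(
        old_dump.splitlines(keepends=True),
        new_dump.splitlines(keepends=True),
        fromfile="monthly.sql",   # placeholder name for the full dump
        tofile="daily.sql",       # placeholder name for today's dump
    ))

def store_compressed(diff_text: str, path: str) -> None:
    """Write the diff gzip-compressed to disk."""
    with gzip.open(path, "wt") as f:
        f.write(diff_text)

# The dumps themselves would come from something like
# (hypothetical invocation, plain-text format so diffs work):
#   pg_dump --format=plain mydb > daily.sql
```

A daily cron job would dump the database to a temporary file, diff it against the last monthly full dump, and keep only the compressed diff; restoring a given day would mean applying the diff (e.g. with `patch`) to the full dump and feeding the result to `psql`.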
A more complex version of this tool would support multiple databases using a single dump format (perhaps in XML), so it would be easy to migrate and replicate data between different systems. The common dump format could also be used to distribute public databases.