i've got a database with several million records.
i've found that from time to time i need to nuke the primary index in order
to get a vacuum done on it.
i don't like this, but i can live with it.
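(for reference, the nuke-and-rebuild dance i'm talking about is roughly the
following -- table and index names here are made up, substitute your own:)

```sql
-- drop the primary key's index so vacuum doesn't have to churn through it
DROP INDEX records_pkey;

-- reclaim the dead tuples
VACUUM ANALYZE records;

-- rebuild the index afterwards
CREATE UNIQUE INDEX records_pkey ON records (id);
```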
now, i'm vacuuming it and the backend process is growing to enormous size:
(i was gonna cut and paste part of a top session here, but the vacuum just
blew out the 750M swap partition on the server, and now i need to reboot.)
why does this vacuum require so much core memory?
--
[ Jim Mercer Reptilian Research jim@reptiles.org +1 416 410-5633 ]
[ The telephone, for those of you who have forgotten, was a commonly used ]
[ communications technology in the days before electronic mail. ]
[ They're still easy to find in most large cities. -- Nathaniel Borenstein ]