Discussion: Dump format for long term archiving.


Dump format for long term archiving.

From
Ron Mayer
Date:
If one wanted to dump some postgres databases for long term
archival storage (maybe decades), what's the recommended
dump format?   Is tar or plain text preferred, or is
there some other approach (xml? csv?) I should be looking
at instead?

Or should we just leave these in some postgres
database and keep upgrading it every few years?

Re: Dump format for long term archiving.

From
Tom Lane
Date:
Ron Mayer <rm_pg@cheapcomplexdevices.com> writes:
> If one wanted to dump some postgres databases for long term
> archival storage (maybe decades), what's the recommended
> dump format?

Plain text pg_dump output, without question.  Not only is it the most
likely to load without problems, but if necessary you could fix it with
a text editor.

            regards, tom lane
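[Editor's note: a sketch of Tom's advice in command form. The database name "mydb" is hypothetical, and plain text is in fact pg_dump's default output format. The point of plain text is that the dump is ordinary SQL, so generic text tools work on it; the snippet simulates a tiny dump to show that.]

```shell
# Hypothetical sketch: dump "mydb" as a plain-text SQL script,
# then restore by feeding the script back to psql:
#   pg_dump --format=plain mydb > mydb.sql
#   psql -d newdb -f mydb.sql
# (Commands above are not executed here; no server is assumed.)
# A plain-text dump is just SQL, so ordinary tools can inspect or
# repair it. Fake a tiny dump to demonstrate:
cat > /tmp/mydb.sql <<'EOF'
-- PostgreSQL database dump
CREATE TABLE t (id integer);
COPY t (id) FROM stdin;
1
\.
EOF
grep -c 'CREATE TABLE' /tmp/mydb.sql
```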

Re: Dump format for long term archiving.

From
brian
Date:
Ron Mayer wrote:
> If one wanted to dump some postgres databases for long term
> archival storage (maybe decades), what's the recommended
> dump format?   Is tar or plain text preferred, or is
> there some other approach (xml? csv?) I should be looking
> at instead?
>
> Or should we just leave these in some postgres
> database and keep upgrading it every few years?
>

The version you dump it from is unlikely to be difficult to find ten
years from now. I'd just make sure to append the pg version to the
archive so it's obvious to any future data archaeologists what's needed
to breathe life back into it.

b
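[Editor's note: brian's suggestion of recording the version could look something like the sketch below. The database name and version string are hypothetical; in practice the version would come from `pg_dump --version` or `SELECT version()`.]

```shell
# Hypothetical sketch: bake the PostgreSQL version and dump date into
# the archive's file name, so future readers know what produced it.
PG_VERSION="8.3.1"   # in practice: pg_dump --version | awk '{print $3}'
DB="mydb"            # hypothetical database name
OUTFILE="${DB}-pg${PG_VERSION}-$(date +%Y%m%d).sql"
echo "$OUTFILE"
# pg_dump --format=plain "$DB" > "$OUTFILE"   # the actual dump step
```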

Re: Dump format for long term archiving.

From
"Andrej Ricnik-Bay"
Date:
On 14/03/2008, brian <brian@zijn-digital.com> wrote:
> The version you dump it from is unlikely to be difficult to find ten
>  years from now. I'd just make sure to append the pg version to the
>  archive so it's obvious to any future data archaeologists what's needed
>  to breathe life back into it.
Let me play devil's advocate here ...

While the source for PG 8.x will be around, there's no guarantee
that future enhancements to gcc (or whatever commercial compiler
you'll be using) will still allow you to compile it without potentially
long-winded modifications to the original source.

My gut feeling is that keeping the data as a "moving target", with
some redundancy in terms of storage and hardware, and migrating
it to current software every few years (financial life-cycle?) is a
sensible method  :}


Cheers,
Andrej



--
Please don't top post, and don't use HTML e-Mail :}  Make your quotes concise.

http://www.american.edu/econ/notes/htmlmail.htm