Re: Dump/restore with bad data and large objects

From: John T. Dow
Subject: Re: Dump/restore with bad data and large objects
Date:
Msg-id: 200808251658.m7PGwX45088951@web2.nidhog.com
In reply to: Re: Dump/restore with bad data and large objects (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-general
Tom,

My mistake in not realizing that 8.1 and later can dump large objects in the plain text format. I guess when searching
for answers to a problem, the posted information doesn't always specify the version. So, sorry about that.

But the plain text format still has serious problems: the generated file is large for byte arrays and large
objects, there is no way to selectively restore a single table, and bad data still isn't detected until you try to restore.

Or did I miss something else?
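For comparison, a minimal sketch of how the custom format addresses the selective-restore point (the database and table names below are illustrative, not from the original thread):

```shell
# Custom-format dump; compressed by default, so byte arrays and large
# objects take far less space than in plain text.
pg_dump -Fc -f mydb.dump mydb

# List the archive's contents without restoring anything.
pg_restore -l mydb.dump

# Restore just one table from the archive into an existing database.
pg_restore -t mytable -d mydb mydb.dump
```

Of course, as discussed earlier in the thread, the custom format has its own drawback: encoding errors in the data only surface at restore time, and the binary archive can't be inspected or fixed by hand the way a plain text dump can.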

John

PS: Yes, I know you can pipe the output from pg_dumpall into an archiver, but it's my understanding that the binary
data is output in an inefficient format, so even if zipped the resulting file would be significantly larger than the
custom format.
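The piping approach mentioned in the PS would look something like this (file names are illustrative):

```shell
# Dump the whole cluster (roles, tablespaces, all databases) as plain SQL,
# compressing on the fly. Binary data is emitted as escaped text, which is
# why the compressed result still tends to be larger than a custom-format dump.
pg_dumpall | gzip > cluster.sql.gz

# Restore later by feeding the decompressed SQL back to psql.
gunzip -c cluster.sql.gz | psql postgres
```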



On Mon, 25 Aug 2008 12:14:41 -0400, Tom Lane wrote:

>"John T. Dow" <john@johntdow.com> writes:
>> If you dump in plain text format, you can at least inspect the dumped
>> data and fix it manually or with iconv. But the plain text
>> format doesn't support large objects (again, not nice).
>
>It does in 8.1 and later ...
>
>> Also, neither of these methods gets information such as the roles,
>
>Use pg_dumpall.
>
>            regards, tom lane


