importing large files

From olivier.scalbert@algosyn.com
Subject importing large files
Date
Msg-id 1190967769.434956.11760@o80g2000hse.googlegroups.com
Responses Re: importing large files  (Dimitri Fontaine <dfontaine@hi-media.com>)
List pgsql-general
Hello,

I need to import between 100 million and one billion records into a
table. Each record consists of two char(16) fields. The input format
is a huge CSV file. I am running on a Linux box with 4 GB of RAM.
First I create the table. Second, I 'copy from' the CSV file. Third, I
create the index on the first field.
The overall process takes several hours. The CPU seems to be the
limitation, not the memory or the I/O.
Are there any tips to improve the speed?
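For reference, the sequence described above might look like the following sketch (the table and column names here are assumptions, not from the original post; the path is a placeholder). Creating the index only after the load, and raising maintenance_work_mem for the session doing the index build, are standard bulk-load practices:

```sql
-- Hypothetical schema: two char(16) fields per record.
CREATE TABLE pairs (
    a char(16),
    b char(16)
);

-- More memory for the index build in this session (value is illustrative).
SET maintenance_work_mem = '1GB';

-- Bulk-load the CSV file; COPY is much faster than row-by-row INSERTs.
COPY pairs FROM '/path/to/data.csv' WITH CSV;

-- Build the index only after the data is loaded, never before.
CREATE INDEX pairs_a_idx ON pairs (a);
```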

Thanks very much,

Olivier


In the pgsql-general list, by sending date:

Previous
From: "detrox@gmail.com"
Date:
Message: how to ignore invalid byte sequence for encoding without using sql_ascii?
Next
From: Goboxe
Date:
Message: Partitioned table limitation