Re: optimizing import of large CSV file into partitioned table?

From	Dimitri Fontaine
Subject	Re: optimizing import of large CSV file into partitioned table?
Date
Msg-id	877hovp7mw.fsf@hi-media-techno.com
In reply to	optimizing import of large CSV file into partitioned table?  (Rick Casey <caseyrick@gmail.com>)
Responses	Re: optimizing import of large CSV file into partitioned table?  (Rick Casey <caseyrick@gmail.com>)
List	pgsql-general
Rick Casey <caseyrick@gmail.com> writes:

> So, I am wondering if there is any way to optimize this process? I have
> been using Postgres for several years, but have never had to partition
> or optimize it for files of this size until now.
> Any comments or suggestions would be most welcomed from this excellent forum.

The pgloader tool will import your data in batches of N lines; you get
to say how many lines to consider in each transaction. Plus, you can
have more than one Python thread importing your big file, either
sharing one writer and having the other threads do the parsing and
COPY, or having N independent threads each doing the
reading/parsing/COPY.

  http://pgloader.projects.postgresql.org/
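
If you'd rather roll the batching idea by hand, here's a minimal
sketch of the per-batch COPY loop (assuming psycopg2, and made-up
table/file names -- this is not pgloader's actual code):

    import csv
    import io
    import psycopg2

    BATCH = 10000  # lines per transaction; tune to taste

    conn = psycopg2.connect("dbname=mydb")  # assumed DSN
    cur = conn.cursor()

    def flush(rows):
        # COPY one batch in its own transaction, so a failure only
        # costs you that batch instead of the whole file.
        buf = io.StringIO()
        csv.writer(buf).writerows(rows)
        buf.seek(0)
        cur.copy_expert("COPY bigtable FROM STDIN WITH CSV", buf)
        conn.commit()

    rows = []
    with open("big.csv", newline="") as f:
        for row in csv.reader(f):
            rows.append(row)
            if len(rows) >= BATCH:
                flush(rows)
                rows = []
    if rows:
        flush(rows)  # last partial batch

pgloader layers the threading described above on top of a loop like
this one, so you don't have to write it yourself.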

Hope this helps,
--
dim
