INSERTing lots of data

From: Joachim Worringen
Subject: INSERTing lots of data
Date:
Msg-id: 4BFF8FBF.2050809@iathh.de
Responses: Re: INSERTing lots of data  (Szymon Guz <mabewlun@gmail.com>)
           Re: INSERTing lots of data  (Craig Ringer <craig@postnewspapers.com.au>)
           Re: INSERTing lots of data  (Greg Smith <greg@2ndquadrant.com>)
List: pgsql-general
Greetings,

my Python application (http://perfbase.tigris.org) repeatedly needs to
insert lots of data into an existing, non-empty, potentially large table.
Currently, the bottleneck is in the Python application itself, so I intend
to multi-thread it. Each thread would work on a part of the input file,
roughly as sketched below.
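
Just to illustrate the plan (a minimal sketch; insert_worker is the
per-thread function sketched further below, and the line-based chunking
is a placeholder for the real input handling):

import threading

def run_workers(path, n_threads=4):
    # Read the input once and split it into one chunk of lines per thread.
    with open(path) as f:
        lines = f.readlines()
    chunk = max(1, (len(lines) + n_threads - 1) // n_threads)
    threads = [threading.Thread(target=insert_worker,
                                args=(lines[i:i + chunk],))
               for i in range(0, len(lines), chunk)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()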

I have already multi-threaded the query part of the application, which
requires one connection per thread: cursors sharing a single connection
are serialized.

Provided that
- each thread uses its own connection,
- each thread performs all of its INSERTs within a single transaction,
- the machine has enough resources,

  will I get a speedup? Or will table-locking serialize things on the
server side?
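
For concreteness, each insert_worker would do roughly this (a sketch
using psycopg2; the DSN, table and column names are placeholders for the
real schema):

import psycopg2

DSN = "dbname=perfbase"  # placeholder connection string

def insert_worker(lines):
    conn = psycopg2.connect(DSN)  # private connection for this thread
    try:
        cur = conn.cursor()
        for line in lines:
            run, value = line.rstrip("\n").split("\t")  # placeholder parsing
            cur.execute("INSERT INTO results (run, value) VALUES (%s, %s)",
                        (run, value))
        conn.commit()  # all of this thread's INSERTs in one transaction
    finally:
        conn.close()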

Suggestions for alternatives are welcome, but the data must go through
the Python application via INSERTs (no bulk insert, COPY etc. is
possible).
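
Batching rows through the driver would still count as plain INSERTs for
my purposes; with psycopg2, for example:

rows = [("r1", "1"), ("r1", "2"), ("r2", "3")]  # placeholder data
# executemany issues one INSERT per row under the hood, so this stays
# within the INSERT-only constraint.
cur.executemany("INSERT INTO results (run, value) VALUES (%s, %s)", rows)
conn.commit()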

  thanks, Joachim

