best way to write large data-streams quickly?

From Mark Moellering
Subject best way to write large data-streams quickly?
Date
Msg-id CAA0uU3XCiReRsK9-4Zsk0Mdhan1dG2Q4dj6iPQiEc-kOJumLyw@mail.gmail.com
Whole thread Raw
Responses Re: best way to write large data-streams quickly?  (Steve Atkins <steve@blighty.com>)
List pgsql-general
Everyone,

We are trying to architect a new system which will have to take several large data streams (a total of ~200,000 parsed files per second) and place them in a database. I am trying to figure out the best way to import that sort of data into Postgres.

I keep thinking I can't be the first to have this problem, and that there are common solutions, but I can't find any. Does anyone know of some method, third-party program, etc., that can accept data from a number of different sources and push it into Postgres as fast as possible?
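
To make the question concrete, here is a minimal sketch of the batched-COPY approach I have been considering, assuming Python with psycopg2. The table events(source, payload), the batch size, and the stream_of_parsed_files() generator are placeholders for illustration, not code from our actual system:

import io
import psycopg2

def stream_of_parsed_files():
    # Stand-in for the real parsed-file sources; yields (source, payload) pairs.
    yield from [("sensor-1", "example payload")]

def flush_batch(conn, rows):
    """Write a batch of (source, payload) tuples via COPY FROM STDIN."""
    buf = io.StringIO()
    for source, payload in rows:
        # Tab-separated text format; real data would need proper escaping
        # of tabs, newlines, and backslashes.
        buf.write(f"{source}\t{payload}\n")
    buf.seek(0)
    with conn.cursor() as cur:
        cur.copy_expert("COPY events (source, payload) FROM STDIN", buf)
    conn.commit()

conn = psycopg2.connect("dbname=streams")
batch = []
for record in stream_of_parsed_files():
    batch.append(record)
    if len(batch) >= 10000:  # amortize per-transaction overhead over many rows
        flush_batch(conn, batch)
        batch.clear()
if batch:
    flush_batch(conn, batch)

The idea is that COPY in large batches avoids the per-row round trip and per-transaction overhead of individual INSERTs, but I don't know if this scales to our rates or if there is existing tooling that does this better.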

Thanks in advance,

Mark Moellering
