Re: Using Postgres to store high volume streams of sensor readings

From David Wilson
Subject Re: Using Postgres to store high volume streams of sensor readings
Date
Msg-id e7f9235d0811221416i60b42635x820369ffcbf64aa7@mail.gmail.com
In response to Re: Using Postgres to store high volume streams of sensor readings  ("Ciprian Dorin Craciun" <ciprian.craciun@gmail.com>)
List pgsql-general
On Sat, Nov 22, 2008 at 4:54 PM, Ciprian Dorin Craciun
<ciprian.craciun@gmail.com> wrote:
> On Sat, Nov 22, 2008 at 11:51 PM, Scott Marlowe <scott.marlowe@gmail.com> wrote:
>> On Sat, Nov 22, 2008 at 2:37 PM, Ciprian Dorin Craciun
>> <ciprian.craciun@gmail.com> wrote:
>>>
>>>    Hello all!
>> SNIP
>>>    So I would conclude that relational stores will not make it for
>>> this use case...
>>
>> I was wondering if you guys are having to do all individual inserts or
>> if you can batch some number together into a transaction.  Being able to
>> put > 1 into a single transaction is a huge win for pgsql.
>
>    I'm aware of the performance issues between 1 insert vs x batched
> inserts in one operation / transaction. That is why in the case of
> Postgres I am using COPY <table> FROM STDIN, and using 5k batches...
> (I've tried even 10k, 15k, 25k, 50k, 500k, 1m inserts / batch and no
> improvement...)
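(For readers following along: the batching described above can be sketched in Python. This is a minimal illustration, not code from the thread; it assumes psycopg2 and a hypothetical `readings(sensor_id, ts, value)` table, and the DB call itself is left commented out.)

```python
import io

BATCH_SIZE = 5000  # the 5k batch size mentioned above

def copy_buffer(rows):
    """Format (sensor_id, ts, value) rows as tab-separated text
    suitable for COPY ... FROM STDIN."""
    buf = io.StringIO()
    for sensor_id, ts, value in rows:
        buf.write(f"{sensor_id}\t{ts}\t{value}\n")
    buf.seek(0)
    return buf

def batches(stream, size=BATCH_SIZE):
    """Group an incoming row stream into fixed-size batches."""
    batch = []
    for row in stream:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# With psycopg2 (assumed table name and connection):
# with conn.cursor() as cur:
#     for b in batches(incoming_rows):
#         cur.copy_expert("COPY readings FROM STDIN", copy_buffer(b))
#     conn.commit()
```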

I've had exactly the same experience with Postgres during an attempt
to use it as a store for large-scale incoming streams of data at a
rate very comparable to what you're looking at (~100k/sec). We
eventually just ended up rolling our own solution.

--
- David T. Wilson
david.t.wilson@gmail.com

In the pgsql-general list, by date sent:

Previous
From: "Ciprian Dorin Craciun"
Date:
Message: Re: Using Postgres to store high volume streams of sensor readings
Next
From: Alvaro Herrera
Date:
Message: Re: Using Postgres to store high volume streams of sensor readings