Re: is postgres a good solution for billion record data

From	Scott Marlowe
Subject	Re: is postgres a good solution for billion record data
Date
Msg-id	dcc563d10910241343t5a570afbv78996e2a03939fdc@mail.gmail.com
In reply to	is postgres a good solution for billion record data  (shahrzad khorrami <shahrzad.khorrami@gmail.com>)
Responses	Re: is postgres a good solution for billion record data  (Scott Marlowe <scott.marlowe@gmail.com>)
List	pgsql-general
On Sat, Oct 24, 2009 at 7:32 AM, shahrzad khorrami
<shahrzad.khorrami@gmail.com> wrote:
> is postgres a good solution for billion record data, think of 300kb data
> insert into db at each minutes, I'm coding with php
> what do you recommend to manage these data?

You'll want a server with LOTS of hard drives spinning under it and a fast
RAID controller with battery-backed RAM.  Inserting the data is no
problem; 300 kB a minute is nothing.  My stats machine, which handles
about 2.5M rows a day during the week, is inserting in the megabytes
per second (it's also the search database, so the indexer is hitting it
with 16 threads).  The stats part of the load is minuscule until
you start retrieving large chunks of data; then it's mostly sequential
reads at 100+ MB a second.
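To put those rates in perspective, here is a rough back-of-the-envelope calculation (assuming the stated volumes are sustained around the clock, which is a simplification):

```python
# Rough arithmetic for the ingest rates discussed above.
# Assumes sustained 24/7 rates, which is a simplification.

kb_per_minute = 300                       # stated insert volume
bytes_per_sec = kb_per_minute * 1024 / 60
mb_per_day = kb_per_minute * 60 * 24 / 1024

print(f"{bytes_per_sec:.0f} bytes/s")     # 5120 bytes/s -- trivially low
print(f"{mb_per_day:.0f} MB/day")         # ~422 MB/day

# The stats box: ~2.5M rows/day averages out to roughly 29 rows/s.
rows_per_sec = 2_500_000 / 86_400
print(f"{rows_per_sec:.0f} rows/s")
```

At roughly 5 kB/s, the insert stream itself is far below what even a single commodity disk can absorb; the hardware advice below is really about the read side of the workload.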

The more drives and the better the RAID controller you throw at the
problem, the better performance you'll get.  For the price of one
Oracle license for one core, you can build a damned fine pgsql server
or pair of servers.

In pgsql-general by date:

Previous
From: Scott Marlowe
Date:
Message: Re: is postgres a good solution for billion record data.. what about mySQL?
Next
From: Bruno Baguette
Date:
Message: Re: How can I get one OLD.* field in a dynamic query inside a trigger function ?