Re: Replicating hundreds of thousands of rows

From: Simon Riggs
Subject: Re: Replicating hundreds of thousands of rows
Date:
Msg-id: CANP8+jJo121812Yq3H4DVAnVEbE9+zpZJBM6rwPnn8dx9p8EHw@mail.gmail.com
In reply to: Replicating hundreds of thousands of rows  (Job <Job@colliniconsulting.it>)
List: pgsql-general
On 25 November 2016 at 06:23, Job <Job@colliniconsulting.it> wrote:
> Hello,
>
> we need to replicate hundreds of thousands of rows (for reporting) between PostgreSQL database nodes that are in different locations.
>
> Currently, we use Rubyrep with PostgreSQL 8.4.22.

8.4 is now end-of-life. You should move to the latest version.

> It works fine, but it is very slow with a massive number of rows.
>
> With PostgreSQL 9.x, are there ways to replicate these quantities of data in the background, not in real time?
> We need periodic synchronization.

You have a choice of:

* Physical streaming replication, built in from 9.0+
* Logical streaming replication, partially built in from 9.4+ using pglogical
* Logical streaming replication, fully built in from 10.0+ (not yet released)

Performance is much better than rubyrep.
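
As a rough illustration of the third option, the built-in logical replication in 10.0 is set up with a publication on the source node and a subscription on the reporting node. This is a minimal sketch; the table names, database name, and connection string below are hypothetical, and the subscriber must already have matching table definitions:

```sql
-- On the source (publishing) node: publish the tables to replicate.
-- Table names here are placeholders for your reporting tables.
CREATE PUBLICATION reporting_pub FOR TABLE orders, order_lines;

-- On the reporting (subscribing) node: create the same table
-- definitions first, then subscribe. Connection details are examples.
CREATE SUBSCRIPTION reporting_sub
    CONNECTION 'host=source.example.com dbname=app user=repuser'
    PUBLICATION reporting_pub;
```

Changes then flow continuously in the background, which fits the "not in real time, but periodically synchronized" requirement without a full resync of all rows each time.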

--
Simon Riggs                http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

