Re: Selecting RAM and CPU based on max_connections

From: Laurenz Albe
Subject: Re: Selecting RAM and CPU based on max_connections
Date:
Msg-id: 6e12b9019eb9ccde5e90af0c36abee38b2ca846b.camel@cybertec.at
In reply to: Re: Selecting RAM and CPU based on max_connections (Andreas Kretschmer <andreas@a-kretschmer.de>)
List: pgsql-performance
On Fri, 2022-05-20 at 12:15 +0200, Andreas Kretschmer wrote:
> On 20 May 2022 10:27:50 CEST, aditya desai <admad123@gmail.com> wrote:
> > One of our applications needs 3000 max_connections to the database.
> > A connection pooler like pgbouncer or pgpool is not certified within the
> > organization yet, so they are looking at setting up high-end hardware
> > with enough CPU and memory. Can someone advise how much memory and CPU
> > they will need if they want max_connections = 3000?
> 
> Pgbouncer would be the best solution. CPU: roughly one core per concurrently active connection.
> RAM: shared_buffers + max_connections * work_mem + maintenance_work_mem + operating system + ...

Right.  And then hope and pray that a) the database doesn't get overloaded
and b) you don't hit any of the database-internal bottlenecks caused by many
connections.

I also got the feeling that the Linux kernel's memory accounting somehow lags.
I have seen cases where every snapshot of "pg_stat_activity" I took showed
only a few active connections (different ones each time), yet the amount of
allocated memory exceeded what the currently active sessions could plausibly
consume.  I may have made a mistake, and I have no reproducer, but I would
be curious to know if there is an explanation for that.
(I am aware that "top" shows shared buffers multiple times.)
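
Not a reproducer, but for anyone who wants to check this, the kind of snapshot
loop I mean looks roughly like the following (a hypothetical sketch: the DSN is
a placeholder, and it assumes Linux and psycopg2):

    # Snapshot active vs. total connections once per second and print
    # MemAvailable from /proc/meminfo next to them (Linux only).
    import time
    import psycopg2  # any client driver would do; psycopg2 is just an example

    def mem_available_kb():
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemAvailable:"):
                    return int(line.split()[1])

    conn = psycopg2.connect("dbname=postgres")  # placeholder DSN
    conn.autocommit = True

    for _ in range(60):
        with conn.cursor() as cur:
            cur.execute("""SELECT count(*) FILTER (WHERE state = 'active'),
                                  count(*)
                           FROM pg_stat_activity""")
            active, total = cur.fetchone()
        print(f"active={active} total={total} MemAvailable={mem_available_kb()} kB")
        time.sleep(1)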

Yours,
Laurenz Albe


