Query with large number of joins

From: Marco Di Cesare
Subject: Query with large number of joins
Date:
Msg-id: e187bede699941a4b3fe6781a69ecd14@CO2PR0701MB759.namprd07.prod.outlook.com
Replies: Re: Query with large number of joins  (Tom Lane <tgl@sss.pgh.pa.us>)
         Re: Query with large number of joins  (Merlin Moncure <mmoncure@gmail.com>)
List: pgsql-performance

We are using Postgres for the first time after a long time as SQL Server users, so forgive us for being noobs.


We are using a BI tool that generates a query with an unusually large number of joins. My understanding is that with this many joins the Postgres query planner can't do an exhaustive search of join orders, so it falls back to a heuristic (genetic) algorithm. Unfortunately, the query runs quite slowly (~35 seconds) and seems to ignore the primary keys and indexes that are available.
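
For reference, these appear to be the planner settings that govern this behaviour; they can be inspected from psql, and the comments show what I believe are the stock defaults (treat them as my assumptions, not gospel):

    SHOW join_collapse_limit;  -- default 8: max relations the planner will reorder in one join search
    SHOW from_collapse_limit;  -- default 8: same idea for subqueries pulled up into the FROM list
    SHOW geqo;                 -- default on: genetic query optimizer enabled
    SHOW geqo_threshold;       -- default 12: number of FROM items at which GEQO kicks in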


Query plan here (sorry, we had to anonymize it):

http://explain.depesz.com/s/Uml


Line 30 is one of the pain points: a full table scan is run over 4.2 million rows even though there are indexes on oscar_bravo.foxtrot_four and oscar_charlie.foxtrot_four.
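
In case it matters, something along these lines (using the anonymized names as placeholders for the real tables) should confirm that the expected indexes are really there and refresh the planner statistics on both sides of that join:

    -- list the indexes defined on the two tables in question (placeholder names)
    SELECT tablename, indexname, indexdef
      FROM pg_indexes
     WHERE tablename IN ('oscar_bravo', 'oscar_charlie');

    -- refresh planner statistics so row estimates are up to date
    ANALYZE oscar_bravo;
    ANALYZE oscar_charlie;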


We've tried playing with join_collapse_limit, upping it from the default of 8 to 10 or 12, but that doesn't seem to help much. Cranking it up to an unreasonably high 20 does shave a few seconds off the query time, but nothing substantial (explain plan with the value set to 20: http://explain.depesz.com/s/sW6).
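
For what it's worth, those experiments were roughly of this shape (session-local setting, then re-running the explain; the actual BI-generated query is omitted here):

    SET join_collapse_limit = 20;  -- session-local; the default is 8
    SHOW join_collapse_limit;      -- confirm the new value took effect
    -- then re-run: EXPLAIN (ANALYZE, BUFFERS) <the BI-generated query>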


We haven't tried playing with geqo_threshold yet.
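
If we do try it, the sketch we have in mind (untested on our side) is simply:

    SET geqo_threshold = 25;  -- keep the exhaustive planner for up to 24 FROM items
    -- or take the genetic optimizer out of the picture entirely for a test:
    SET geqo = off;

My understanding is that with geqo off and a high join_collapse_limit the standard planner does the full exhaustive search, so planning time itself could grow quickly on a query with this many joins.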


Any thoughts on ways to speed up this query, or on other Postgres settings we should be aware of when dealing with such an unusually large number of joins?


Thanks in advance

Marco Di Cesare