Issue with Running VACUUM on Database with Large Tables

From: Nagaraj Raj
Subject: Issue with Running VACUUM on Database with Large Tables
Msg-id: 1237927313.5086260.1703506240368@mail.yahoo.com
Responses: Re: Issue with Running VACUUM on Database with Large Tables (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-bugs

Hello,

While running VACUUM ANALYZE on our database, which contains a large number of tables (approximately 200k), I encountered an issue: if a table is dropped while the vacuum is in progress, the vacuum job fails at that point with an error stating that the relation with that OID is not found, and exits. The entire run is interrupted without any warning or graceful error handling.

Because our database design creates and drops objects dynamically, this abrupt termination makes it difficult to complete a vacuum pass successfully. The issue has persisted across multiple versions, including the one we are currently running (14.8).

Is this behavior expected, or could it be a bug? It would be helpful to have a mechanism for handling such cases, for example emitting a notice or warning when a dropped table is encountered, so that the process can skip that table and continue with the rest of the VACUUM ANALYZE.
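Until something like that skip-and-continue behavior exists server-side, a client-side wrapper can approximate it by vacuuming tables one at a time and tolerating per-table failures. The sketch below is only illustrative: `vacuum_each` and the `run` callback are hypothetical names, and a real script would pass a callback that executes each statement over a libpq connection (e.g. via psycopg2) and catch the driver's undefined-table error instead of the `LookupError` used here as a stand-in.

```python
def vacuum_each(tables, run):
    """Issue VACUUM (ANALYZE) per table via run(sql), collecting tables that
    disappeared mid-run instead of aborting the whole pass."""
    skipped = []
    for schema, name in tables:
        sql = f'VACUUM (ANALYZE) "{schema}"."{name}";'
        try:
            run(sql)
        except LookupError:  # stand-in for "could not open relation with OID ..."
            skipped.append((schema, name))
    return skipped
```

Driving this from a table list (e.g. `pg_stat_user_tables`) one statement at a time also keeps each VACUUM as its own top-level command, since VACUUM cannot run inside a transaction block, so a dropped table only fails its own statement.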

Your support in resolving this matter is greatly appreciated.


Thanks,

Rj
