Re: Storing large large JSON objects in JSONB

From: Laurenz Albe
Subject: Re: Storing large large JSON objects in JSONB
Date:
Msg-id: 2b5df78128c02a4389f4fb9a05c0fd8084583c6a.camel@cybertec.at
In reply to: Storing large large JSON objects in JSONB  (Wells Oliver <wells.oliver@gmail.com>)
List: pgsql-admin
On Sat, 2022-07-09 at 12:37 -0700, Wells Oliver wrote:
> We have a couple of tables where we like to store very large JSON objects
> in a JSONB column. We don't use these in, err, real time kind of situations,
> intensive queries, etc, so I'm not concerned about their performance
> necessarily, it's more just to store vendor data that's parsed as we go into
> smaller more usable tables, and so they serve as the historical record,
> and occasionally we re-process.
> 
> I'm wondering if there's some smarter solution with Postgres, these usually
> end up in TOAST tables as I understand it, so the actual table is quite small
> and then there's some associated TOAST thing that grows and grows (the given
> one right now is 500GB).
> 
> I don't.. really... care, I guess, that the TOAST tables are that large,
> but I thought I might do the diligence here and ask if there's some different
> solution, or if this is fine, or some kind of ticking time bomb I am
> understanding poorly.
> 
> We like these in the database as we have SQL scripts to parse them, so saving
> them as files in S3 and rewriting ETL in python or whatever kind of sounds awful.

I know of no problem this could cause, other than that if your database grows
very large, it may be a pain to back up.
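For reference, the split the question describes (small main table, large associated TOAST relation) can be inspected directly from the catalogs. A sketch, assuming a hypothetical table name `big_json`; substitute your own:

```sql
-- Hypothetical table name; replace 'big_json' with yours.
-- Compare the main relation's size with its TOAST relation's size.
SELECT pg_size_pretty(pg_relation_size(oid))           AS main_size,
       pg_size_pretty(pg_relation_size(reltoastrelid)) AS toast_size
FROM pg_class
WHERE relname = 'big_json';
```

`pg_total_relation_size('big_json')` gives the combined figure (main, TOAST, and indexes) if you only want one number.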

Yours,
Laurenz Albe
-- 
Cybertec | https://www.cybertec-postgresql.com



In the pgsql-admin list, by date sent:

Previous
From: Wells Oliver
Date:
Message: Storing large large JSON objects in JSONB
Next
From: Raj kumar
Date:
Message: Improve "select count(*)" query - takes more than 30 mins for some large tables