Storing large JSON objects in JSONB

From: Wells Oliver
Subject: Storing large JSON objects in JSONB
Date:
Msg-id: CAOC+FBWRavHQWtQRJ_suSRjtDR9PrnDeRm9CWojr2ogYgDA7Ww@mail.gmail.com
Responses: Re: Storing large JSON objects in JSONB (Laurenz Albe <laurenz.albe@cybertec.at>)
           Re: Storing large JSON objects in JSONB (Thomas Kellerer <shammat@gmx.net>)
List: pgsql-admin
We have a couple of tables where we like to store very large JSON objects in a JSONB column. We don't use these in real-time situations or intensive queries, so I'm not especially concerned about their performance; it's more just to store vendor data that gets parsed into smaller, more usable tables as we go. They serve as the historical record, and occasionally we re-process them.
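
For concreteness, the tables look roughly like this (all names here are just examples, not our actual schema):

    -- One row per vendor payload; the JSONB column holds the large object.
    CREATE TABLE raw_vendor_data (
        id         bigserial PRIMARY KEY,
        fetched_at timestamptz NOT NULL DEFAULT now(),
        payload    jsonb NOT NULL
    );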

I'm wondering if there's some smarter solution with Postgres. As I understand it, these values usually end up in TOAST tables, so the actual table is quite small and there's an associated TOAST table that grows and grows (the one in question is currently 500GB).
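
For reference, one way to see which TOAST table backs a given table and how big it is ('raw_vendor_data' is a placeholder name):

    -- Look up the TOAST relation for a table and report its size.
    SELECT c.relname AS table_name,
           t.relname AS toast_table,
           pg_size_pretty(pg_total_relation_size(t.oid)) AS toast_size
    FROM pg_class c
    JOIN pg_class t ON t.oid = c.reltoastrelid
    WHERE c.relname = 'raw_vendor_data';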

I don't really care, I guess, that the TOAST tables are that large, but I thought I'd do my due diligence here and ask whether there's some different solution, whether this is fine, or whether it's some kind of ticking time bomb I'm understanding poorly.

We like having these in the database because we have SQL scripts to parse them, so saving them as files in S3 and rewriting the ETL in Python or whatever kind of sounds awful.
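
To give a flavor of that parsing, the scripts do things roughly like this (table and field names are invented for illustration):

    -- Flatten an array inside the JSONB payload into a smaller relational table.
    INSERT INTO games (game_id, played_on)
    SELECT (g->>'id')::int,
           (g->>'date')::date
    FROM raw_vendor_data r,
         jsonb_array_elements(r.payload->'games') AS g;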

Anyway, thanks for any insight I'm missing on this front.
