============================================================================
POSTGRESQL BUG REPORT TEMPLATE
============================================================================
Your name : Palle Girgensohn
Your email address : girgen@partitur.se
System Configuration
---------------------
Architecture (example: Intel Pentium) : Intel Pentium
Operating System (example: Linux 2.0.26 ELF) : FreeBSD 3.3 Release
PostgreSQL version (example: PostgreSQL-6.5.2): PostgreSQL-6.5.2
Compiler used (example: gcc 2.8.0) : gcc-2.7.2.3
Please enter a FULL description of your problem:
------------------------------------------------
To get around the fact that Postgres limits the tuple size to 8192
bytes, I use a "bigtext" table that splits large text chunks into
smaller parts and stores them in separate tuples:
Table = bigtext
+------------------------------+-------------------------------+-------+
| Field | Type | Length|
+------------------------------+-------------------------------+-------+
| id | int4 | 4 |
| seqid | int4 | 4 |
| txt | text | var |
+------------------------------+-------------------------------+-------+
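For context, retrieving one of the split values might look like this
(a hypothetical sketch, not part of the original report; the client is
assumed to concatenate the pieces itself):

```sql
-- Hypothetical sketch: fetch the pieces of one logical text value in
-- seqid order; the application glues them back together.  The id
-- value 9590 is taken from the sample data shown in this report.
SELECT txt
  FROM bigtext
 WHERE id = 9590
 ORDER BY seqid;
```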
When checking, none of the tuples is too big:
foobar=> select id, seqid, length(txt) as len from bigtext order by len desc;
id|seqid| len
----+-----+----
9590| 0|6929
9593| 0|6899
9567| 0|6894
9567| 1|6892
9595| 0|6754
9592| 1|3003
9590| 1|2939
9593| 1|2693
9595| 1|1667
9567| 2| 293
(10 rows)
but VACUUM ANALYZE still fails:
foobar=> vacuum analyze;
ERROR: Tuple is too big: size 13700
A plain VACUUM is OK:
foobar=> vacuum;
VACUUM
After deleting all tuples whose txt is longer than 6500 bytes:
foobar=> delete from bigtext where length(txt) > 6500;
DELETE 5
foobar=> vacuum analyze;
VACUUM
So it works now... I thought the limit was 8192 bytes, but it seems to
be closer to 6500. How come? (Notably, the two largest values above,
6929 and 6754, sum to 13683 -- close to the 13700 in the error
message.)
I know the 8192-byte tuple size limit can be raised when compiling,
but that is considered very risky, right?
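For reference, the limit in question is the disk block size BLCKSZ. In
the 6.5-era sources it is, to my understanding, a compile-time define;
the exact file and the 16k value below are assumptions for
illustration, not a recommendation:

```c
/* src/include/config.h (exact location is an assumption for this
 * era): raise the disk block size from 8k to 16k before compiling.
 * BLCKSZ must be a power of 2, and at most 32768 on these versions. */
#define BLCKSZ 16384
```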
Please describe a way to repeat the problem. Please try to provide a
concise reproducible example, if at all possible:
----------------------------------------------------------------------
CREATE TABLE "bigtext" (
"id" int4,
"seqid" int4,
"txt" text);
Populate it with text chunks of roughly 6800 bytes each in txt, then run:
vacuum analyze;
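A hypothetical way to populate the table, assuming an lpad() function
is available to generate long strings (whether 6.5.2 provides lpad()
is an assumption; any literals of about 6800 bytes will do):

```sql
-- Hypothetical population script (not from the report): lpad() pads
-- 'x' out to 6800 characters, giving two large chunks for one id.
INSERT INTO bigtext VALUES (1, 0, lpad('x', 6800, 'x'));
INSERT INTO bigtext VALUES (1, 1, lpad('x', 6800, 'x'));
VACUUM ANALYZE;
```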