Here is the function source code (inspired by code I found in src/backend/utils/adt/int.c):
PG_FUNCTION_INFO_V1(pg_calculate_hash);
Datum
pg_calculate_hash(PG_FUNCTION_ARGS)
{
	int2vector *int2Array = (int2vector *) PG_GETARG_POINTER(0);
	const int	qtd = int2Array->dim1;

	elog(DEBUG1, "pg_calculate_hash(qtd=%d)", qtd);
	elog(DEBUG2, "  [ndim=%d, dataoffset=%d, elemtype=%d, dim1=%d, lbound1=%d]",
		 int2Array->ndim, int2Array->dataoffset, int2Array->elemtype,
		 int2Array->dim1, int2Array->lbound1);
[...]
}
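For comparison, backend functions that accept a plain int2[] argument usually fetch it as an ArrayType with PG_GETARG_ARRAYTYPE_P, which detoasts the datum before it is dereferenced; a raw PG_GETARG_POINTER does not. A rough sketch in that style (the function name is illustrative, and this assumes a one-dimensional array with no NULL elements):

```c
PG_FUNCTION_INFO_V1(pg_calculate_hash_arr);
Datum
pg_calculate_hash_arr(PG_FUNCTION_ARGS)
{
	/* Detoasts the argument, unlike a bare PG_GETARG_POINTER. */
	ArrayType  *arr = PG_GETARG_ARRAYTYPE_P(0);
	int			qtd = (ARR_NDIM(arr) == 1) ? ARR_DIMS(arr)[0] : 0;
	int16	   *values = (int16 *) ARR_DATA_PTR(arr);

	/* A real implementation should also check ARR_HASNULL(arr)
	 * before walking the data area directly. */
	elog(DEBUG1, "pg_calculate_hash_arr(qtd=%d)", qtd);
	[...]
}
```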
To test it against a table, I executed these statements in psql:
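(For context, the table ss has a single column s of type int2[], created along these lines:)

```
db=# create table ss (s int2[]);
CREATE TABLE
```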
db=# insert into ss values ('[0:5]={58,17,15,36,59,54}');
INSERT 0 1
db=# select * from ss;
s
---------------------------
[0:5]={58,17,15,36,59,54}
(1 row)
However, when the function is called with the int2[] column directly, garbage values appear in the "int2vector" structure:
db=# set client_min_messages to debug2;
SET
db=# select s, calculate_hash(s) from ss;
DEBUG: pg_calculate_hash(qtd=0)
DEBUG: [ndim=0, dataoffset=5376, elemtype=1536, dim1=0, lbound1=285227520]
s | calculate_hash
---------------------------+---------------
[0:5]={58,17,15,36,59,54} | 0
(1 row)
On the other hand, when I double-cast the int2[] column value, it works as expected (the "int2vector" structure is read correctly):
db=# select s, calculate_hash(s::varchar::int2[]) from ss;
DEBUG: pg_calculate_hash(qtd=6)
DEBUG: [ndim=1, dataoffset=0, elemtype=21, dim1=6, lbound1=0]
s | calculate_hash
---------------------------+--------------------
[0:5]={58,17,15,36,59,54} | 441352797842128896
(1 row)
What is wrong with the function code?
Thanks in advance.
The whole project is on GitHub: