>>>> May I ask the proper (fast) way of inserting bulk data? If possible, an example would be much appreciated.
>>>>
>>>> Please note that the data I would like to insert contains arrays (float8[]).
>>>>
>>>> By the way, my environment is Windows 10 with PostgreSQL 11, compiling with Visual Studio 2017.
>>>
>>> That would be COPY:
>>> https://www.postgresql.org/docs/current/sql-copy.html
>>
>> In particular,
>>
>> copy table (col1, col2, ...) from stdin with binary
>
> Hmm, binary requires that you know PostgreSQL's binary representation.
> PostgreSQL might convert text to its internal representation faster
> than your application program can produce the binary format...
>
> The big exception here is of course bytea, where the binary representation
> is definitely smaller and easy to create.
Hi,
yes.
I understood that the data is already binary and the OP wants to store it
in binary form, but of course this might not be the case.
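
For the text route, a minimal libpq sketch of such a load could look like
the following. The table "samples" (id int, vals float8[]) and the generated
values are made up for illustration; the arrays travel over the wire as
ordinary text array literals:

#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=test");   /* adjust conninfo */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    PGresult *res = PQexec(conn, "COPY samples (id, vals) FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN) {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        PQfinish(conn);
        return 1;
    }
    PQclear(res);

    char row[256];
    for (int i = 0; i < 100000; i++) {
        /* text format: tab-separated columns, newline-terminated rows,
         * arrays written as {a,b,c} */
        int len = snprintf(row, sizeof(row), "%d\t{%g,%g,%g}\n",
                           i, i * 0.5, i * 0.25, i * 0.125);
        if (PQputCopyData(conn, row, len) != 1) {
            fprintf(stderr, "PQputCopyData failed: %s", PQerrorMessage(conn));
            break;
        }
    }

    if (PQputCopyEnd(conn, NULL) != 1)
        fprintf(stderr, "PQputCopyEnd failed: %s", PQerrorMessage(conn));

    res = PQgetResult(conn);    /* fetch the final status of the COPY */
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "COPY did not complete: %s", PQerrorMessage(conn));
    PQclear(res);

    PQfinish(conn);
    return 0;
}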
>> Also, I suggest transaction bundling (one commit every 1000 records or so),
>> depending on context.
>
> Why that?
> It might be useful if you expect the load to fail in the middle and want
> to be able to restart from there. But normally, doing it all in one
> transaction is simpler.
Yes. That would be the special case with one commit every N records, where
N is the total count :)
I just wanted to make sure the OP does not commit after each COPY.
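
Something like this, where load_one_file() is a hypothetical helper wrapping
the COPY loop sketched above, and error handling is kept minimal:

#include <libpq-fe.h>

int load_one_file(PGconn *conn, const char *path);  /* hypothetical helper */

int load_all(PGconn *conn, const char **paths, int nfiles, int batch)
{
    for (int i = 0; i < nfiles; i++) {
        if (i % batch == 0)
            PQclear(PQexec(conn, "BEGIN"));
        if (load_one_file(conn, paths[i]) != 0) {
            PQclear(PQexec(conn, "ROLLBACK"));
            return -1;
        }
        if (i % batch == batch - 1 || i == nfiles - 1)
            PQclear(PQexec(conn, "COMMIT"));
    }
    return 0;
}

With batch set to nfiles this degenerates into the single-transaction case
discussed above.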
Bye.
Chris.