Hi Sean.
Thanks for the reply.
I have one more question. While extracting data using the COPY TO command (in TEXT mode), we have an API, PQunescapeBytea,
to convert the string representation of binary data (bytea) into binary. Similarly, while loading data into PostgreSQL
using the COPY FROM command in TEXT mode with PQputCopyData, we need to convert binary data into its string
representation. But there is no API to convert binary data into its string representation. Could you tell me the API,
or am I misinterpreting something?
Thanks in advance,
Sandeep
----- Original Message ----
From: Sean Davis <sdavis2@mail.nih.gov>
To: pgsql-interfaces@postgresql.org
Cc: Sandeep Khandelwal <sandeep_khandelwal27@yahoo.com>
Sent: Monday, October 16, 2006 4:04:45 PM
Subject: Re: [INTERFACES] Bulk Load and Extract from PostgreSQL
On Monday 16 October 2006 03:07, Sandeep Khandelwal wrote:
> Hi All.
>
> I want to extract and load data from PostgreSQL using the libpq C API. Please
> let me know which approach will be good for loading a large number of rows into
> PostgreSQL (INSERT or COPY FROM), and which approach will be good for extracting
> a large number of rows from PostgreSQL (COPY TO or SELECT). I want to handle
> all the data types supported in PostgreSQL.
copy is the faster way to go for a single table.
Sean
---------------------------(end of broadcast)---------------------------
TIP 9: In versions below 8.0, the planner will ignore your desire to choose an index scan if your joining column's
datatypes do not match