Need help doing a CSV import

From: tony@exquisiteimages.com
Date: 14/07/2010 7:04 PM
I am in the process of moving a FoxPro-based system to PostgreSQL.

We have several tables with memo fields that contain carriage returns
and line feeds which I need to preserve. I thought that if I converted
these into the corresponding \r and \n escape codes they would be
imported as carriage returns and line feeds, but instead they are
stored in the database as the literal two-character sequences \r and
\n.

The command I am using is:

COPY main.company
  FROM E'f:\\company.csv'
  DELIMITER '|'
  CSV HEADER QUOTE '"' ESCAPE E'\\';

The command imports all of the data, but I want the escape codes to be
expanded into their actual characters.

Can anyone assist me with this?


Re: Need help doing a CSV import

From: Craig Ringer
Date:
On 14/07/2010 7:04 PM, tony@exquisiteimages.com wrote:
> I am in the process of moving a FoxPro-based system to PostgreSQL.
>
> We have several tables with memo fields that contain carriage returns
> and line feeds which I need to preserve. I thought that if I converted
> these into the corresponding \r and \n escape codes they would be
> imported as carriage returns and line feeds, but instead they are
> stored in the database as the literal two-character sequences \r and
> \n.

PostgreSQL doesn't process escapes in CSV import mode.

You can reformat the data into the non-csv COPY format, which WILL
process escapes. Or you can post-process it after import to expand them.
Unfortunately PostgreSQL doesn't offer an option to process escapes when
"CSV" mode COPY is requested.

I posted a little Python script that reads CSV data and spits out
COPY-friendly output a few days ago. It should be trivially adaptable to
your needs, you'd just need to change the input dialect options. See the
archives for the script.
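
Roughly, the conversion looks like this (a fresh sketch, not the
script from the archives; the dialect options match the COPY command
above, and backslashes are deliberately left untouched so COPY expands
the existing \r and \n sequences):

#!/usr/bin/env python
# Sketch: pipe-delimited CSV on stdin -> COPY text format on stdout.
import csv
import sys

reader = csv.reader(sys.stdin, delimiter='|', quotechar='"')
next(reader)  # drop the header row; COPY text format has none
for row in reader:
    fields = [f.replace('\t', '\\t')   # escape what the text format
               .replace('\n', '\\n')   # treats specially, but do NOT
               .replace('\r', '\\r')   # double backslashes
              for f in row]
    sys.stdout.write('\t'.join(fields) + '\n')

Something like "python csv2copy.py < company.csv | psql mydb -c 'COPY
main.company FROM STDIN'" would then load it (script and database
names are placeholders).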

--
Craig Ringer

Re: Need help doing a CSV import

From: Tim Landscheidt
Date: Wed, Jul 14, 2010 13:20:25 +0000
Craig Ringer <craig@postnewspapers.com.au> wrote:

>> I am in the process of moving a FoxPro-based system to PostgreSQL.

>> We have several tables with memo fields that contain carriage returns
>> and line feeds which I need to preserve. I thought that if I converted
>> these into the corresponding \r and \n escape codes they would be
>> imported as carriage returns and line feeds, but instead they are
>> stored in the database as the literal two-character sequences \r and
>> \n.

> PostgreSQL doesn't process escapes in CSV import mode.

> You can reformat the data into the non-csv COPY format,
> which WILL process escapes. Or you can post-process it after
> import to expand them. Unfortunately PostgreSQL doesn't
> offer an option to process escapes when "CSV" mode COPY is
> requested.

> I posted a little Python script that reads CSV data and
> spits out COPY-friendly output a few days ago. It should be
> trivially adaptable to your needs, you'd just need to change
> the input dialect options. See the archives for the script.

Another option is a small Perl script or something similar that
connects to both the FoxPro and the PostgreSQL databases and transfers
the data with parameterized "INSERT" statements. The advantage of this
is that you have tight control over charsets, date formats, EOL
conventions and so on, and do not have to wonder whether this or that
file is at this or that stage of the conversion process; the
disadvantage is obviously that you lose any speed benefit of bulk
"COPY".
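
An untested sketch of that approach, in Python rather than Perl (the
ODBC driver name, paths and column list are all assumptions):

import pyodbc    # assumed route to the FoxPro tables
import psycopg2  # assumed PostgreSQL client library

src = pyodbc.connect(r'DRIVER={Microsoft Visual FoxPro Driver};'
                     r'SourceType=DBF;SourceDB=f:\data;')
dst = psycopg2.connect('dbname=mydb')

sel = src.cursor()
sel.execute('SELECT id, name, memo FROM company')
with dst, dst.cursor() as ins:
    for row in sel:
        # Parameter binding carries the memo text through verbatim,
        # embedded CR/LF included -- no escape processing anywhere.
        ins.execute('INSERT INTO main.company (id, name, memo) '
                    'VALUES (%s, %s, %s)', tuple(row))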

Tim

Re: Need help doing a CSV import

From: David Fetter
Date:
On Wed, Jul 14, 2010 at 01:20:25PM +0000, Tim Landscheidt wrote:
> Craig Ringer <craig@postnewspapers.com.au> wrote:
>
> >> I am in the process of moving a FoxPro-based system to PostgreSQL.
>
> >> We have several tables with memo fields that contain carriage returns
> >> and line feeds which I need to preserve. I thought that if I converted
> >> these into the corresponding \r and \n escape codes they would be
> >> imported as carriage returns and line feeds, but instead they are
> >> stored in the database as the literal two-character sequences \r and
> >> \n.
>
> > PostgreSQL doesn't process escapes in CSV import mode.
>
> > You can reformat the data into the non-csv COPY format,
> > which WILL process escapes. Or you can post-process it after
> > import to expand them. Unfortunately PostgreSQL doesn't
> > offer an option to process escapes when "CSV" mode COPY is
> > requested.
>
> > I posted a little Python script that reads CSV data and
> > spits out COPY-friendly output a few days ago. It should be
> > trivially adaptable to your needs, you'd just need to change
> > the input dialect options. See the archives for the script.
>
> Another option is a small Perl script or something similar that
> connects to both the FoxPro and the PostgreSQL databases and transfers
> the data with parameterized "INSERT" statements. The advantage of this
> is that you have tight control over charsets, date formats, EOL
> conventions and so on, and do not have to wonder whether this or that
> file is at this or that stage of the conversion process; the
> disadvantage is obviously that you lose any speed benefit of bulk
> "COPY".

You can do your transformations and hand the stream off to the COPY
interface.  See the pg_putcopydata() section of the DBD::Pg manual for
examples. :)
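
In Python the same pattern looks roughly like this, with psycopg2's
copy_expert() standing in for pg_putcopydata() (sketch only; the
transformation and the sample row are placeholders):

import io
import psycopg2

def esc(field):
    # Placeholder transform: escape COPY text-format specials;
    # real per-field cleanup would go here.
    return (field.replace('\t', '\\t')
                 .replace('\n', '\\n')
                 .replace('\r', '\\r'))

def to_copy_line(row):
    return '\t'.join(esc(f) for f in row) + '\n'

rows = [('1', 'Acme', 'line one\nline two')]   # stand-in data
conn = psycopg2.connect('dbname=mydb')
buf = io.StringIO(''.join(to_copy_line(r) for r in rows))
with conn, conn.cursor() as cur:
    # Stream the transformed data straight into COPY, keeping the
    # bulk-load speed Tim was worried about losing.
    cur.copy_expert('COPY main.company FROM STDIN', buf)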

Cheers,
David.
--
David Fetter <david@fetter.org> http://fetter.org/
Phone: +1 415 235 3778  AIM: dfetter666  Yahoo!: dfetter
Skype: davidfetter      XMPP: david.fetter@gmail.com
iCal: webcal://www.tripit.com/feed/ical/people/david74/tripit.ics

Remember to vote!
Consider donating to Postgres: http://www.postgresql.org/about/donate

Re: Need help doing a CSV import

From: Tim Landscheidt
Date:
David Fetter <david@fetter.org> wrote:

> [...]
>> Another option is a small Perl script or something similar that
>> connects to both the FoxPro and the PostgreSQL databases and transfers
>> the data with parameterized "INSERT" statements. The advantage of this
>> is that you have tight control over charsets, date formats, EOL
>> conventions and so on, and do not have to wonder whether this or that
>> file is at this or that stage of the conversion process; the
>> disadvantage is obviously that you lose any speed benefit of bulk
>> "COPY".

> You can do your transformations and hand the stream off to the COPY
> interface.  See the pg_putcopydata() section of the DBD::Pg manual for
> examples. :)

Eh, yes, but then you have to do all the escaping yourself, and the
simplicity of "get values A, B, C from this connection and pass them
on to that one" goes away :-). Now if there were a
"pg_putcopydata(array of arrayrefs)" ... :-).

Tim