Discussion: prevent duplicate entries
Hello,
I receive a CSV file that is inserted into a PostgreSQL database; the data is inserted with a PHP script.
The problem is that the CSV file does not contain only new records.
The file is processed in a loop with data covering 3 months.
Does PostgreSQL have a separate function to prevent duplicate records?
At the moment I filter the records in PHP.
Regards,
Thomas
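One way the database itself can prevent duplicates is a unique constraint, so that repeated rows are rejected at INSERT time instead of being filtered in PHP. A minimal sketch, with a hypothetical table and key columns (the real names depend on your schema):

```sql
-- Hypothetical example table; the (recorded_at, sensor_id) key is an
-- assumption about what makes a row "duplicate" in this data.
CREATE TABLE readings (
    recorded_at timestamp NOT NULL,
    sensor_id   integer   NOT NULL,
    value       numeric,
    UNIQUE (recorded_at, sensor_id)
);

-- Any INSERT that repeats (recorded_at, sensor_id) now fails with a
-- unique-violation error, which the PHP script can catch and skip.
```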
On Thursday, 29 May 2014 3:20 PM, Thomas Drebert <drebert@web.de> wrote:

> Does PostgreSQL have a separate function to prevent duplicate records?
> At the moment I filter the records in PHP.

You can directly load csv file date into the PostgreSQL database using pg_bulkload, which has functionality to avoid duplication.

pg_bulkload: http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html

Does this answer your question?

Regards,
Amul Sul
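For reference, pg_bulkload is driven by a control file. The fragment below is only a sketch from memory of its documentation; the option names (TABLE, INFILE, TYPE, ON_DUPLICATE_KEEP) and the file/table names are assumptions and should be verified against the manual linked above before use.

```
# Hypothetical pg_bulkload control file (e.g. sample_csv.ctl);
# check every option name against the pg_bulkload manual.
TABLE = live_table          # target table in the database
INFILE = /path/to/data.csv  # input CSV file
TYPE = CSV                  # input format
ON_DUPLICATE_KEEP = OLD     # keep the existing row when a duplicate key arrives
```

It would then be invoked with something like `pg_bulkload -d yourdb sample_csv.ctl`.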
> you can directly load *csv file date* on postgres database using pg_bulkload, which has functionality to avoid duplication

Sorry, it's csv file DATA, not csv file DATE.
amulsul wrote:
> On Thursday, 29 May 2014 3:20 PM, Thomas Drebert <drebert@> wrote:
>
>> Does PostgreSQL have a separate function to prevent duplicate records?
>> At the moment I filter the records in PHP.
>
> You can directly load csv file date into the PostgreSQL database using
> pg_bulkload, which has functionality to avoid duplication.
>
> pg_bulkload: http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html
>
> Does this answer your question?
>
> Regards,
> Amul Sul

You might find it better to just load the CSV data into a staging table, then perform the necessary "INSERT INTO live ... SELECT ... FROM staging" query to migrate only the new data. It likely will not make much sense to have (say) 90% of your data eating resources by generating duplicate-key errors.

David J.
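The staging-table approach above can be sketched in SQL; the table names (live, staging) and the key column (id) are placeholders for the real schema, and the CSV path must be readable by the server (from psql, \copy loads from the client side instead):

```sql
-- 1. Staging table with the same shape as the live table, no constraints,
--    so the bulk load itself never fails on duplicates.
CREATE TEMP TABLE staging (LIKE live INCLUDING DEFAULTS);

-- 2. Bulk-load the whole CSV file into the staging table.
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv, HEADER true);

-- 3. Migrate only rows whose key is not already present in live.
INSERT INTO live
SELECT s.*
FROM staging s
WHERE NOT EXISTS (
    SELECT 1 FROM live l WHERE l.id = s.id
);
```

This keeps the monthly reload cheap: duplicates are skipped in one set-based query instead of producing per-row errors.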