I'm not entirely sure I'm understanding you, but if I am, then transactions
would do what you're asking, batching the work at the backend rather than at
the client:
BEGIN;
INSERT...
INSERT...
INSERT...
COMMIT;
will wait until the COMMIT line to actually change the backend database.
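To illustrate the idea, here is a minimal sketch of batching inserts inside a single transaction. It uses SQLite so it runs self-contained; the table and column names are made up for the example, but the same BEGIN ... COMMIT pattern applies to Postgres through any client library:

```python
import sqlite3

# In-memory database and a hypothetical table for the demo.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE session_data (key TEXT, value TEXT)")

# Data accumulated over a session (illustrative values).
rows = [("page", "/home"), ("clicks", "3"), ("duration", "42s")]

# All inserts happen inside one transaction; nothing becomes
# durable until the commit, mirroring BEGIN ... COMMIT above.
with conn:  # opens a transaction, commits on clean exit
    conn.executemany("INSERT INTO session_data VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM session_data").fetchone()[0]
print(count)  # 3
```

The point is that the per-statement round trips still happen, but the backend defers the actual (durable) change until COMMIT, which is usually the expensive part.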
----------------------------------------------------------------------
Andrew J Perrin - andrew_perrin@unc.edu - http://www.unc.edu/~aperrin
Assistant Professor of Sociology, U of North Carolina, Chapel Hill
269 Hamilton Hall, CB#3210, Chapel Hill, NC 27599-3210 USA
On Mon, 11 Feb 2002, charlie wrote:
> I have looked around quite a bit and could not find any information on
> whether or not Postgres supports Oracle-style array usage for bulk
> reading/writing of the database, e.g. bulk insert statements?
>
> I have a web application --- I'm acquiring/calculating various data
> throughout a user's session that needs to be written to the database. I
> could either just insert/update the data in Postgres as it is computed
> throughout the session, or, I could keep the data in memory (using servlets,
> storing the data at session scope) and then write it all to Postgres when
> the session ends, if there were any advantage to doing so, i.e., if there
> were a way that I could insert the data in bulk to save a bunch of trips
> back and forth to the database. Otherwise, there would seem to be no
> performance difference in whether the web app is executing a bunch of SQL
> statements over the course of the session vs. executing all the same
> statements at the end of the session.
>
> The web application could experience high loads, in terms of the
> number of simultaneous user sessions, which is why I'm concerned about
> performance.
>
> thanks,
> charlie
>
>