Hi All-
I am just returning to a psycopg2 project after a break.
We are using psycopg2 2.4.2 on Linux, with Python 2.7 and PostgreSQL 9.1.
If I make a query that returns a truly large result set --
that is, both a lot of data per record and a lot of records --
what is the best way to handle it?
Right now I just declare a cursor, call execute(), and then:

    for rec in cursor.fetchall():
        ... stuff ...
On the hardware it is running on, this apparently works fine,
but it got me wondering:
Will fetchone(), fetchmany(..), and fetchall()
all basically pull the entire result set to the client at once?
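(For what it's worth, my understanding is that with a regular client-side cursor, execute() already transfers the whole result set into client memory; fetchmany() just avoids additionally materializing one giant Python list the way fetchall() does. A rough sketch of the batched-fetch pattern -- the function name and batch size are my own, not from any API:)

```python
def iterate_in_chunks(cur, size=1000):
    # With a regular (client-side) psycopg2 cursor, the full result set
    # is already in client memory after execute(); fetchmany() only
    # avoids building a second huge list as fetchall() would.
    while True:
        rows = cur.fetchmany(size)
        if not rows:
            break
        for rec in rows:
            yield rec
```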
And is a server-side (named) cursor the only way to bring the data over piece by piece?
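(Something like the following is what I have in mind for the server-side case -- a minimal sketch, assuming psycopg2 is installed; the DSN, query, and batch size are placeholders:)

```python
def stream_rows(dsn, query, batch=2000):
    # Lazy import so the sketch stays self-contained; psycopg2 is assumed
    # to be installed.
    import psycopg2
    conn = psycopg2.connect(dsn)
    try:
        # Passing a name creates a server-side cursor: PostgreSQL keeps
        # the result set, and iterating the cursor pulls rows over in
        # batches of `itersize` per round trip instead of all at once.
        cur = conn.cursor(name='big_result')
        cur.itersize = batch
        cur.execute(query)
        for rec in cur:
            yield rec
    finally:
        conn.close()
```

(Named cursors have to live inside a transaction, so the connection is kept open until iteration finishes.)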
thanks
-Brian