Re: best way to write large data-streams quickly?

From	Jerry Sievers
Subject	Re: best way to write large data-streams quickly?
Date
Msg-id	8737039bm7.fsf@jsievers.enova.com
In reply to	Re: best way to write large data-streams quickly?  (Mark Moellering <markmoellering@psyberation.com>)
List	pgsql-general
Mark Moellering <markmoellering@psyberation.com> writes:

<snip>

>
> How long can you run COPY?  I have been looking at it more closely. 
> In some ways, it would be simple just to take data from stdin and
> send it to postgres but can I do that literally 24/7?  I am
> monitoring data feeds that will never stop and I don't know if that
> is how Copy is meant to be used or if I have to let it finish and
> start another one at some point? 

Launching a single COPY and piping data into it for an extended period
and/or in bulk is fine, but none of the rows will be visible until the
statement finishes and, if it was run in a transaction block, the block
is committed.
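If you need loaded rows to become visible periodically rather than only
at the very end, one option (my sketch, not something from the thread)
is to split the endless feed into bounded batches and run one COPY per
batch. The helper below is plain Python; the driver call, connection
`conn`, table `t`, and helper `as_file` in the comment are hypothetical
placeholders, not part of any specific API discussed here.

```python
import itertools

def batches(stream, batch_size):
    """Split a potentially endless record stream into finite batches,
    so each batch can be loaded by its own COPY and committed."""
    it = iter(stream)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield batch

# Hypothetical usage with a PostgreSQL driver (names are placeholders):
# for batch in batches(feed, 10_000):
#     with conn.cursor() as cur:
#         cur.copy_expert("COPY t (col) FROM STDIN", as_file(batch))
#     conn.commit()  # rows of this batch become visible here
```

Each committed batch is then queryable while the stream keeps running,
at the cost of one COPY setup per batch.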

HTH

>
> Thanks for everyone's help and input!
>
> Mark Moellering
>
>
>
>

--
Jerry Sievers
Postgres DBA/Development Consulting
e: postgres.consulting@comcast.net
p: 312.241.7800

