Re: [HELP]: Large cursor updates

From: Sean Stasica <Sean_Stasica_at_stortek.com>
Date: Mon, 3 Jan 1994 16:05:15 GMT
Message-ID: <CJ2A0s.KpD_at_stortek.com>


In article 16381_at_KYnug.org, neil_at_kynug.org (Neil Greene) writes:
> I have a cursor that scans a table of approximately 10,000 records and does
> various inserts to a table based on what it finds. When the same cursor
> is run on a select statement that finds a smaller selection of records,
> say 100-1,000, everything seems to complete quickly and properly.
> However, cursors that have a larger found set appear to hang.
>
> What can be causing this? What should I be looking for? What is causing
the cursor to hang or appear not to complete? Are there some dba init
> parameters I should be setting to optimize the cursor?

I assume you are using Oracle Forms, since you mentioned cursors.

One thing you may want to look at is increasing the buffer size in your block definition. That setting controls how many of the records fetched by the query are held in memory, instead of buffering some of them to disk. If your table holds 10,000 records, try that as a starting buffer size. There is another parameter called Array Size, but that applies only to base table blocks.

I have also had problems with cursor speed. I am loading data into a block via a cursor and copy() statements. It is VERY slow, even though the SQL statement itself is fairly quick. I tried increasing the buffer size and everything else, but never achieved very good results. Does anyone else have ideas on what else to look at?
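For reference, the kind of cursor the original poster describes (scan a large table, insert into other tables based on what each row contains) might look like the following PL/SQL sketch. The table and column names here are purely illustrative, not taken from the original post:

```sql
DECLARE
  -- Hypothetical source table; the scenario above involves ~10,000 rows.
  CURSOR c_src IS
    SELECT id, status, amount
      FROM source_table;
BEGIN
  FOR rec IN c_src LOOP
    -- Do various inserts depending on what the row contains.
    IF rec.status = 'A' THEN
      INSERT INTO active_log (id, amount) VALUES (rec.id, rec.amount);
    ELSE
      INSERT INTO other_log (id, amount) VALUES (rec.id, rec.amount);
    END IF;
  END LOOP;
  COMMIT;
END;
```

With a loop like this, the run time grows with the number of rows the SELECT finds, which is consistent with small result sets finishing quickly while large ones appear to hang.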

--
Sean_Stasica_at_stortek.com
Received on Mon Jan 03 1994 - 17:05:15 CET
