Re: Cursors, rollback and Long Updates
Date: 1995/07/31
Message-ID: <3vj3a4$urs_at_dartvax.dartmouth.edu>#1/1
Simon,
I recently had a similar problem. This is how I solved it:
- Use SQL*Loader, or equivalent, to load the data into a "temporary" table. I used a SQL*Loader option to assign a record number to each row. You might have to change the specifics if you're not using SQL*Loader, but the idea is the same.
- Create a unique index on the record-number column.
- The following is within PL/SQL on the server: Open a cursor on the temp table (ordered by rownum).
- After each row is processed, record the rownum just processed in a local variable.
- Every x transactions commit the data, and, in a separate activity log table, record the last rownum processed by the program. I also use the activity log table to record timings, and monitor progress of the stored procedure, for performance tuning.
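The load step above can be sketched with a minimal SQL*Loader control file. The RECNUM keyword generates the input record number; everything else here (table name temp_load, column names, delimiter) is a placeholder, not from the original post. Note that ROWNUM itself is a reserved pseudocolumn in Oracle, so the column needs another name, e.g. rec_num:

```sql
-- Hypothetical control file; names are illustrative only.
LOAD DATA
INFILE 'input.dat'
INTO TABLE temp_load
FIELDS TERMINATED BY ','
( rec_num  RECNUM,      -- generated: input record number, consumes no field
  field1   CHAR,
  field2   CHAR )
```

The unique index on that column then looks like:

```sql
CREATE UNIQUE INDEX temp_load_recnum ON temp_load (rec_num);
```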
If the stored proc fails in the middle, look at the activity log to figure out at what row to restart.
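The commit-and-checkpoint loop might look something like the sketch below. This is my reading of the approach, not the original code; the table and column names (temp_load, activity_log, last_recnum) and the batch size are assumptions:

```sql
-- Sketch only: restartable batch processing with a checkpoint table.
DECLARE
  CURSOR c_rows (p_start NUMBER) IS
    SELECT rec_num, field1, field2
      FROM temp_load
     WHERE rec_num > p_start      -- skip rows already processed
     ORDER BY rec_num;
  restart_from  NUMBER;
  last_done     NUMBER := 0;
  batch_size    CONSTANT PLS_INTEGER := 500;  -- "every x transactions"
  processed     PLS_INTEGER := 0;
BEGIN
  -- Pick up where the last run left off (0 on a fresh run).
  SELECT NVL(MAX(last_recnum), 0) INTO restart_from FROM activity_log;

  FOR r IN c_rows(restart_from) LOOP
    -- ... apply the real per-row processing to r here ...
    last_done := r.rec_num;
    processed := processed + 1;
    IF MOD(processed, batch_size) = 0 THEN
      -- Checkpoint and commit together, so the log matches the data.
      INSERT INTO activity_log (last_recnum, logged_at)
      VALUES (last_done, SYSDATE);
      COMMIT;
    END IF;
  END LOOP;

  INSERT INTO activity_log (last_recnum, logged_at)
  VALUES (last_done, SYSDATE);
  COMMIT;
END;
/
```

The key point is that the checkpoint row and the batch of updates commit in the same transaction, so after a failure the activity log always tells you exactly which rows are safely done.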
This approach is about 33% faster than the approach you outlined. On an Alpha system running OSF/1 and Oracle 7.1.4, I saw an increase from 5,100 to 6,500+ input rows processed per minute.
I saw far more dramatic improvements by restructuring my input files with Unix utilities, so that I could use a single-statement INSERT ... SELECT instead of having to open a cursor. Even the cursor equivalent of an insert with a subquery took more than twice as long. I don't know if you can similarly restructure your processing.
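For what it's worth, the set-based alternative I mean is of this shape (table and column names are placeholders): if the per-row logic can be expressed in SQL, one statement replaces the whole cursor loop:

```sql
-- Illustrative only: one set operation instead of a row-by-row cursor.
INSERT INTO target_table (key_col, total_col)
SELECT field1, SUM(field2)
  FROM temp_load
 GROUP BY field1;
```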
Hope this helps,
Marty Himmelstein
marty.himmelstein_at_valley.net
Received on Mon Jul 31 1995 - 00:00:00 CEST