Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
c.d.o.misc: Processing batches, requesting comments with sqlldr and prepared UPDATEs
Oracle 9.2
We process a few thousand batches nightly, each a few hundred lines. The data is stored in a TABLE, and an UPDATE is then executed using data fetched from another (non-DB) server.
1) In the current system the data is INSERTed INTO a TABLE and processed from there by a subsequent process. My understanding is that if we load directly into that TABLE, the load will LOCK it for its duration. Because we want the load and the processing to run concurrently (on different batches), we would like to avoid TABLE LOCKs.
The proposed solution is to load the data into a new TABLE first, then INSERT the rows INTO the current TABLE and DELETE them from that first TABLE.
Originally, I thought that, to let sqlldr run from different servers, a GLOBAL TEMPORARY TABLE could serve as that first TABLE (with the rows then INSERTed from there INTO the "real" TABLE). But it seems sqlldr cannot use this approach: the COMMIT it issues at the end (followed by the end of its session) leaves no temporary rows behind for any other session to move.
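For concreteness, here is a minimal sketch of the permanent staging-table variant; all table and column names (batch_staging, batch_lines, batch_id, line_data) are invented for illustration, and :bid stands for the batch being moved:

```sql
-- Hypothetical staging table; sqlldr loads here, so whatever lock the
-- load takes affects only this table, not the one being processed.
CREATE TABLE batch_staging (
  batch_id  NUMBER,
  line_data VARCHAR2(4000)
);

-- After sqlldr's load has committed, move one batch into the
-- processing table and clear its staging rows in a single transaction.
INSERT INTO batch_lines (batch_id, line_data)
  SELECT batch_id, line_data
    FROM batch_staging
   WHERE batch_id = :bid;

DELETE FROM batch_staging
 WHERE batch_id = :bid;

COMMIT;
```

Note on the GTT idea: with ON COMMIT DELETE ROWS the rows vanish at sqlldr's final COMMIT, and with ON COMMIT PRESERVE ROWS they vanish when sqlldr's session ends, and in any case other sessions never see another session's temporary rows, so a permanent staging table seems to be the workable shape of this.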
2) After the data is INSERTed, an external process SELECTs a COLUMN FROM the TABLE and queries another (non-DB) server for more information (which, I have been told, policy says we are not supposed to cache). It then uses that data in a conditional UPDATE statement, executed once per unique value in the current batches.
I was wondering about saving trips to the DB, either by making one large UPDATE with a CASE expression listing the tens or hundreds of different values, or perhaps by CREATEing a PROCEDURE that accepts an array and does a FORALL bulk bind, or the like. Currently, the Java code uses its own INSERT, once per new value.
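The two ideas above might look roughly like this; every name here (batch_lines, lookup_key, ext_value, the package batch_upd) is made up for illustration, and whether the per-value statement is an UPDATE or an INSERT should of course follow whatever the Java code does today:

```sql
-- Idea 1: one statement, a CASE expression carrying all the
-- key/value pairs fetched from the non-DB server for this batch.
UPDATE batch_lines
   SET ext_value = CASE lookup_key
                     WHEN 'K1' THEN 'V1'
                     WHEN 'K2' THEN 'V2'
                   END
 WHERE lookup_key IN ('K1', 'K2');

-- Idea 2: a procedure taking two parallel arrays; FORALL sends the
-- whole set of binds to the SQL engine in one round trip from PL/SQL.
CREATE OR REPLACE PACKAGE batch_upd AS
  TYPE t_keys IS TABLE OF batch_lines.lookup_key%TYPE INDEX BY PLS_INTEGER;
  TYPE t_vals IS TABLE OF batch_lines.ext_value%TYPE  INDEX BY PLS_INTEGER;
  PROCEDURE apply(p_keys IN t_keys, p_vals IN t_vals);
END batch_upd;
/
CREATE OR REPLACE PACKAGE BODY batch_upd AS
  PROCEDURE apply(p_keys IN t_keys, p_vals IN t_vals) IS
  BEGIN
    FORALL i IN 1 .. p_keys.COUNT
      UPDATE batch_lines
         SET ext_value = p_vals(i)
       WHERE lookup_key = p_keys(i);
  END apply;
END batch_upd;
/
```

From the Java side the same effect can also be had without PL/SQL by adding each parameterized UPDATE to a JDBC batch (addBatch/executeBatch) on one PreparedStatement, which already collapses the per-value round trips.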
Any thoughts?
B.

Received on Tue Oct 23 2007 - 10:27:36 CDT