

Re: Processing batches, requesting comments with sqlldr and prepared UPDATEs.

From: Brian Tkatch <N/A>
Date: Tue, 23 Oct 2007 16:03:32 -0400
Message-ID: <>

On Tue, 23 Oct 2007 12:02:47 -0500, Brian Peasland <> wrote:

>Brian Tkatch wrote:
>Why do you want to avoid "TABLE LOCKs"? Oracle DML locks only the
>affected rows...and this is a good thing. Any lock on the table just
>stops DDL from occurring on the table...again, a good thing. Oracle
>*does not* lock the table to stop others from reading committed data.
>And Oracle *does not* lock the entire table to stop other DML from

Thank you for the explanation. I obviously misunderstood what a TABLE LOCK was.

In our case, the records currently being INSERTed should not be affected by any new DML just yet. Only prior batches would have their records UPDATEd.

>> The proposed solution would be to load the data into a new TABLE
>> first, then INSERT the data into the current TABLE and DELETE it from
>> this new first TABLE.
>> Originally, i thought that to allow sqlldr to run from different
>> servers, a GLOBAL TEMPORARY TABLE could be used for the first TABLE
>> (and then INSERT it from there INTO the "real" TABLE). But, it seems
>> that sqlldr cannot use this approach due to the COMMIT at the end.
>Have you looked at External Tables? This is available in Oracle 9i.
>External Tables use the same SQL*Loader engine so you get the same
>speed. And you can do a simple INSERT..SELECT to read the External
>Table's contents and populate your (internal) table.

No, i have not. I think i shall do so right now though. I appreciate the hint.
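For reference, the External Table approach suggested above might look roughly like the sketch below. This is only an illustration of the idea; the table, column, directory, and file names are all hypothetical, not taken from the thread.

```sql
-- Directory object pointing at the flat files sqlldr would have loaded
-- (name and path are assumptions for this sketch).
CREATE DIRECTORY load_dir AS '/data/incoming';

-- External table: same SQL*Loader engine, but queryable with plain SQL.
CREATE TABLE batch_ext (
  batch_id   NUMBER,
  item_key   VARCHAR2(50),
  item_value VARCHAR2(200)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('batch.dat')
);

-- Populate the real table with one INSERT..SELECT; no staging table
-- and no DELETE step are needed afterwards.
INSERT INTO batch_tab (batch_id, item_key, item_value)
  SELECT batch_id, item_key, item_value
    FROM batch_ext;
```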

>> 2) After the data is INSERTed, an external process SELECTs a COLUMN
>> FROM the TABLE and queries another (non-DB) server for more
>> information (which, i have been told, policy states we are not
>> supposed to cache), which then uses the data in a conditional UPDATE
>> statement. This statement is executed once per unique data in the
>> present batches.
>Any block of data read by any user in an Oracle database must be stored
>in the Buffer Cache. You cannot avoid this. Or do you mean you can't
>"cache" the data by storing it in a local table?

Yes, i mean the latter.

>> I was wondering about saving trips to the DB by making one large
>> UPDATE with a CASE statement listing the tens or hundreds of different
>> values, or perhaps by CREATEing a PROCEDURE to PREPARE an INSERT, get
>> passed an array and do a FOR ALL or the like. Currently, the java code
>> uses its own INSERT, once per new value.
>> Any thoughts?
>> B.
> External Table may be able to help you here. You can craft
>any SQL statement against the External Table. So you can use CASE
>statements here too. Why not write a SELECT statement which reads data
>from the External Table and uses information in the "(non-DB) server".
>This SELECT statement can be used as a basis for your INSERT..SELECT to
>push the final results into your table.

Thanx. I will have to read about this in the documentation. Currently, i do not know what External TABLEs are.
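For the batching question raised earlier, the two ideas (one large UPDATE with a CASE, or a PROCEDURE that accepts arrays and uses FORALL) could be sketched as below. All identifiers here are hypothetical placeholders, not names from the actual system.

```sql
-- Option 1: one round trip, folding the looked-up values into a CASE.
UPDATE batch_tab
   SET item_value = CASE item_key
                      WHEN 'A' THEN 'value_for_a'
                      WHEN 'B' THEN 'value_for_b'
                    END
 WHERE item_key IN ('A', 'B');

-- Option 2: a PROCEDURE taking collections, applied in bulk with FORALL,
-- so the java code makes one call per batch instead of one per value.
CREATE OR REPLACE PROCEDURE update_batch (
  p_keys   SYS.ODCIVARCHAR2LIST,
  p_values SYS.ODCIVARCHAR2LIST
) AS
BEGIN
  FORALL i IN 1 .. p_keys.COUNT
    UPDATE batch_tab
       SET item_value = p_values(i)
     WHERE item_key = p_keys(i);
END;
/
```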

B.

Received on Tue Oct 23 2007 - 15:03:32 CDT
