Re: Loading LOTS of data into Oracle

From: <jl34778_at_corp02.d51.lilly.com>
Date: 20 Apr 94 15:25:12 EST
Message-ID: <1994Apr20.152512.1_at_corp02.d51.lilly.com>


In article <766804562.88snx_at_aleytys.pc.my>, philip_at_aleytys.pc.my (Philip Chee) writes:

> In article <766305579snx_at_hkchan.pc.my> hkchan_at_hkchan.jaring.my writes:

>>
>>In article <1994Apr1.213159.16029_at_PacBell.COM> s4xmakm_at_srv.PacBell.COM writes:
>>
>> > We are loading millions of records into an Oracle
>> > database running on HP-UX on an HP 9000/890.
>> > Using ProC Oracle can only handle about 24 records
>> > addition per second.
>> >
>> > What kind of optimization would you recommend?
>> >
>>I am interested in the answer to this question too. I have tried ODBC
>>call level interface for quick insertion. The quickest speed it can go
>>is only 10 records per second. The loading is done to a NetWare 3.11
>>server on 486/66.
>>
>>Short of using load data module directly, is there any other way to
>>speed this up?
> 
> Is there any particular reason you can't use Oracle Data Loader for
> this purpose?

I would also ask 'why not SQL*Loader?' The latest version of the loader has a direct path option that formats and writes whole database blocks, rather than inserting rows one at a time, which can speed up bulk loads dramatically.
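For what it's worth, a minimal direct path load might look something like this (the table, column, and file names here are made up for illustration):

```sql
-- load.ctl -- hypothetical SQL*Loader control file
LOAD DATA
INFILE 'records.dat'
APPEND
INTO TABLE big_table
FIELDS TERMINATED BY ','
(id, name, amount)
```

Then run it with the direct path option turned on:

```sql
sqlldr userid=scott/tiger control=load.ctl direct=true
```

Note that direct path bypasses the SQL layer, so things like database triggers won't fire during the load.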

You should also size the initial and next extents on the table appropriately, so that you don't run out of extents or spend lots of time allocating new ones.
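Something along these lines, with the sizes picked to match your expected data volume (these numbers are just an example, not a recommendation):

```sql
-- Hypothetical table with explicit extent sizing for a bulk load
CREATE TABLE big_table (
  id      NUMBER,
  name    VARCHAR2(40),
  amount  NUMBER
)
STORAGE (INITIAL 100M NEXT 50M PCTINCREASE 0 MAXEXTENTS 999);
```

Setting PCTINCREASE to 0 keeps the extent sizes uniform instead of growing geometrically.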

You might try disabling constraints and dropping indexes before the load, although rebuilding an index on millions of rows might not be feasible. Anyone have any suggestions on how to do that?
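The general pattern would be something like the following (constraint and index names here are invented; use your own):

```sql
-- Before the load: turn off constraint checking and drop the index
ALTER TABLE big_table DISABLE CONSTRAINT big_table_fk;
DROP INDEX big_table_idx;

-- ... run the bulk load here ...

-- After the load: rebuild the index, then re-enable the constraint
CREATE INDEX big_table_idx ON big_table (id);
ALTER TABLE big_table ENABLE CONSTRAINT big_table_fk;
```

Re-enabling the constraint makes Oracle validate all the loaded rows, so that step can itself take a while on a big table.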

-- 
Bob Swisshelm                | swisshelm_at_Lilly.com     | 317 276 5472
Eli Lilly and Company        | Lilly Corporate Center  | Indianapolis, IN 46285
Received on Wed Apr 20 1994 - 22:25:12 CEST