
Re: Improving the speed of forall ... insert with many columns

From: Richard Kuhler <noone_at_nowhere.com>
Date: Fri, 17 May 2002 21:08:01 GMT
Message-ID: <RieF8.9199$V71.3189013@twister.socal.rr.com>


First, you will never match the performance of the "insert /*+ append */" (especially if the table is NOLOGGING) unless you use an equivalent direct-path load. OCI has a direct path interface, but I'd suggest you read up on its requirements and implications before using it:

http://technet.oracle.com/docs/products/oracle8i/doc_library/817_doc/appdev.817/a76975/oci09adv.htm#444645
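
For the record, on the SQL side a direct-path load is just the APPEND hint against a (preferably NOLOGGING) table -- roughly what you already did with your table copy. A rough sketch, with a made-up server-side staging table name:

alter table hugetest nologging;

-- APPEND makes this a direct-path insert: blocks are formatted above the
-- high-water mark and, with NOLOGGING, generate minimal redo
insert /*+ append */ into hugetest
select * from hugetest_stage;   -- hypothetical staging table

-- a direct-path insert must be committed before the same session
-- can query the table again
commit;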

Barring direct path, the PL/SQL bulk FORALL should be your fastest choice. However, I would suggest you run a more realistic benchmark. It appears you are doing 400 calls to insertHugeTest with 1000 rows each. Why not simulate that completely in PL/SQL?

for i in 1 .. 400 loop
    insertHugeTest(<1000 rows>);
end loop;
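
Something along these lines would do it -- an untested sketch: it assumes the tableHugeTest package you posted is installed, fills the arrays with made-up values, and times the run with dbms_utility.get_time (which counts in hundredths of a second):

set serveroutput on

declare
    l_one       tableHugeTest.number_10Type;
    l_two       tableHugeTest.varchar2_30Type;
    l_three     tableHugeTest.number_10Type;
    l_four      tableHugeTest.number_11_1Type;
    l_five      tableHugeTest.number_11_1Type;
    l_six       tableHugeTest.number_11_1Type;
    l_seven     tableHugeTest.number_11_1Type;
    l_eight     tableHugeTest.number_10Type;
    l_nine      tableHugeTest.number_10Type;
    l_ten       tableHugeTest.varchar2_1Type;
    l_eleven    tableHugeTest.varchar2_1Type;
    l_twelve    tableHugeTest.varchar2_1Type;
    l_thirteen  tableHugeTest.varchar2_1Type;
    l_fourteen  tableHugeTest.number_16_6Type;
    l_fifteen   tableHugeTest.number_16_6Type;
    l_sixteen   tableHugeTest.varchar2_1Type;
    l_seventeen tableHugeTest.varchar2_1Type;
    l_eighteen  tableHugeTest.varchar2_1Type;
    l_nineteen  tableHugeTest.varchar2_1Type;
    l_start     number;
begin
    -- build one representative 1000-row chunk of dummy data
    for j in 1 .. 1000 loop
        l_one(j)      := j;         l_two(j)      := 'some text value';
        l_three(j)    := j;         l_four(j)     := j / 10;
        l_five(j)     := j / 10;    l_six(j)      := j / 10;
        l_seven(j)    := j / 10;    l_eight(j)    := j;
        l_nine(j)     := j;         l_ten(j)      := 'x';
        l_eleven(j)   := 'x';       l_twelve(j)   := 'x';
        l_thirteen(j) := 'x';       l_fourteen(j) := j / 1000;
        l_fifteen(j)  := j / 1000;  l_sixteen(j)  := 'x';
        l_seventeen(j) := 'x';      l_eighteen(j) := 'x';
        l_nineteen(j) := 'x';
    end loop;

    l_start := dbms_utility.get_time;

    -- 400 calls x 1000 rows = 400,000 rows, all inside the database
    for i in 1 .. 400 loop
        tableHugeTest.insertHugeTest(l_one, l_two, l_three, l_four,
            l_five, l_six, l_seven, l_eight, l_nine, l_ten,
            l_eleven, l_twelve, l_thirteen, l_fourteen, l_fifteen,
            l_sixteen, l_seventeen, l_eighteen, l_nineteen);
    end loop;

    dbms_output.put_line('elapsed: ' ||
        (dbms_utility.get_time - l_start) / 100 || ' seconds');
end;
/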

Now, 42s minus that time is a more accurate measure of the external overhead. If that number is low, then look at tuning the database accordingly (maybe start by getting timed statistics of the wait events). If that number is high, then you'll need to re-measure your network/application/other overhead to see where the problem is. I suspect your network overhead estimate is significantly off. Where do you account for round-trip latency, message overhead, etc.?
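
For the wait events, something like this while the load is running would be a start (assuming you can query the v$ views; the sid placeholder is whatever your loading session's sid is):

alter system set timed_statistics = true;   -- or per session / in init.ora

select event, total_waits, time_waited      -- time_waited is in 1/100s
from v$session_event
where sid = <sid of the loading session>
order by time_waited desc;

-- alternatively: alter session set sql_trace = true in the loading
-- session, then run the trace file through tkprof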

Richard

"Sven C. Koehler" wrote:
>
> Hello all!
>
> I am using the Oracle OCI8 interface (namely OCIBindByPos(), and
> OCIBindArrayOfStruct()) to insert (append) data sets of 400,000 rows each
> into a table with 19 columns (the table has no indexes and no
> constraints). On the server side I use a PL/SQL procedure (PL/SQL source
> is appended to the end of this message) that takes 19 array parameters
> corresponding to each column of the table, and inserts them all using
> forall insert. When I run my example application it takes about 42s to
> insert the 400,000 rows (IMHO this is too slow).
>
> I measured/calculated the total time to consist of the following
> time portions:
> - Preparing the c-arrays bound with OCIBindByPos(): <4s;
> - Transferring the data over the network:
> 127 B/row * 400000 rows = ~48MB
> over an otherwise idle 100MBit network
> transferring about 8MB/s: 6s;
>
> So the total time spent on the database is about 32s (42s-10s).
> I also experimented with the size of the chunks that I transfer
> each time with OCIStmtExecute(), ranging from 200 to 5000 rows
> for each chunk and settled at 1000 rows (which gave me the 42s).
>
> I tried to improve the performance further by changing the
> extents of the table, and setting NOLOGGING. However, the 42s
> stayed.
>
> OTOH I can see improvements when I make a schema copy of the
> table and do an
> "insert /*+APPEND*/ into tablecopy select * from origtable".
> This takes about 6s for the 400,000 rows.
>
> So, I assume that my PL/SQL code takes much of the time, but I don't
> yet know a better way than using "forall ... insert".
>
> I greatly appreciate any hints that further improve the
> performance!
>
> Best regards,
>
> Sven C. Koehler
>
> (schween snafu de)
> #########################################################################
> The PL/SQL code I used:
>
> drop table hugetest;
>
> create table hugetest (
> one NUMBER(10),
> two VARCHAR2(30),
> three NUMBER(10),
> four NUMBER(11,1),
> five NUMBER(11,1),
> six NUMBER(11,1),
> seven NUMBER(11,1),
> eight NUMBER(10),
> nine NUMBER(10),
> ten VARCHAR2(1),
> eleven VARCHAR2(1),
> twelve VARCHAR2(1),
> thirteen VARCHAR2(1),
> fourteen NUMBER(16,6),
> fifteen NUMBER(16,6),
> sixteen VARCHAR2(1),
> seventeen VARCHAR2(1),
> eighteen VARCHAR2(1),
> nineteen VARCHAR2(1)
> ) storage (initial 50M next 5m pctincrease 0) tablespace huge;
>
> create or replace package tableHugeTest as
> type number_10Type is table of NUMBER(10) index by binary_integer;
> type number_11_1Type is table of NUMBER(11,1) index by binary_integer;
> type number_16_6Type is table of NUMBER(16,6) index by binary_integer;
> type varchar2_1Type is table of VARCHAR2(1) index by binary_integer;
> type varchar2_30Type is table of VARCHAR2(30) index by binary_integer;
>
> procedure insertHugeTest(p_one in number_10Type,
> p_two in varchar2_30Type, p_three in number_10Type,
> p_four in number_11_1Type, p_five in number_11_1Type,
> p_six in number_11_1Type, p_seven in number_11_1Type,
> p_eight in number_10Type, p_nine in number_10Type,
> p_ten in varchar2_1Type, p_eleven in varchar2_1Type,
> p_twelve in varchar2_1Type, p_thirteen in varchar2_1Type,
> p_fourteen in number_16_6Type, p_fifteen in number_16_6Type,
> p_sixteen in varchar2_1Type, p_seventeen in varchar2_1Type,
> p_eighteen in varchar2_1Type, p_nineteen in varchar2_1Type);
> end;
> /
>
> create or replace package body tableHugeTest as
> procedure insertHugeTest(p_one in number_10Type,
> p_two in varchar2_30Type, p_three in number_10Type,
> p_four in number_11_1Type, p_five in number_11_1Type,
> p_six in number_11_1Type, p_seven in number_11_1Type,
> p_eight in number_10Type, p_nine in number_10Type,
> p_ten in varchar2_1Type, p_eleven in varchar2_1Type,
> p_twelve in varchar2_1Type, p_thirteen in varchar2_1Type,
> p_fourteen in number_16_6Type, p_fifteen in number_16_6Type,
> p_sixteen in varchar2_1Type, p_seventeen in varchar2_1Type,
> p_eighteen in varchar2_1Type, p_nineteen in varchar2_1Type) is
> begin
> if (p_one.first <= p_one.last) then
> forall i in p_one.first .. p_one.last
> insert into hugetest values (
> p_one(i), p_two(i), p_three(i), p_four(i),
> p_five(i), p_six(i), p_seven(i), p_eight(i),
> p_nine(i), p_ten(i), p_eleven(i), p_twelve(i),
> p_thirteen(i), p_fourteen(i), p_fifteen(i),
> p_sixteen(i), p_seventeen(i), p_eighteen(i),
> p_nineteen(i));
> commit;
> end if;
> end;
> end;
> /
>
> show errors;
Received on Fri May 17 2002 - 16:08:01 CDT
