Re: Tom Kyte's book obsolescence

From: Charles Hooper <hooperc2000_at_yahoo.com>
Date: Fri, 1 Feb 2008 11:02:01 -0800 (PST)
Message-ID: <5df964c0-e704-482e-af51-4353dc5dee4b@f10g2000hsf.googlegroups.com>


On Feb 1, 10:02 am, Mladen Gogala <mgog..._at_yahoo.com> wrote:
> This is a longer post and I beg your forgiveness for that.
>
> Recently, I received a complaint that an analytic function was not
> executing very fast. The Oracle versions involved were 10.2.0.3 and
> 11.1.0.6. I wrote the same query as a self-join and, to my infinite
> surprise, the query ran faster than with the analytic function. I
> remembered a book that says it shouldn't be that way. The book was
> "Effective Oracle by Design", by Tom Kyte. I created the structures,
> ran the queries, and the book was wrong on versions 10g and 11g. It
> was correct on version 9i.
>
> In other words, in Tom Kyte's book "Effective Oracle By Design", the
> second query on page 517 now executes faster than the first one, the
> one with the analytic functions. The plans look very similar to what
> is in the book, but the query that Tom claims should execute faster
> now executes slower.
>
> This discovery is loaded with implications:
> 1) Oracle does I/O in versions 10g and 11g in a different way than it
>    did in versions 9i and below. I am not sure what exactly the
>    difference is, but there is a difference.
> 2) Analytic functions are now just a neat trick, no longer a performance
>    improvement. The cost of the window buffer sort is relatively high
>    compared to the cost of a simple join.
> 3) Some information in "Effective Oracle by Design" is obsolete. How much,
>    exactly?
>
> Below is the script needed to set up the infamous BIG_TABLE used in all of
> the queries. I hope Tom Kyte will not mind and I hope he will read this.
> He used to follow this group once upon a time, in a version far, far away.
>
> create table bigtab
> as
> select rownum id, a.*
>   from all_objects a
>  where 1=0;
>
> alter table bigtab nologging;
> -- Fill 1'000'000 rows into the table
> declare
>     l_cnt  number;
>     l_rows number := 1000000;
> begin
>     -- Copy ALL_OBJECTS
>     insert /*+ append */
>     into bigtab
>     select rownum, a.*
>       from all_objects a;
>     l_cnt := sql%rowcount;
>     commit;
>
>     -- Generate Rows
>     while (l_cnt < l_rows)
>     loop
>         insert /*+ APPEND */ into bigtab
>         select rownum+l_cnt,
>                          OWNER, OBJECT_NAME, SUBOBJECT_NAME,
>                          OBJECT_ID, DATA_OBJECT_ID,
>                          OBJECT_TYPE, CREATED, LAST_DDL_TIME,
>                          TIMESTAMP, STATUS, TEMPORARY,
>                          GENERATED, SECONDARY
>           from bigtab
>          where rownum <= l_rows-l_cnt;
>         l_cnt := l_cnt + sql%rowcount;
>         commit;
>     end loop;
> end;
> /
>
> alter table bigtab add constraint
>   bigtab_pk primary key(id);
>
> --
> Mladen Gogala
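
One side note on the setup script: if the timings were taken right after
loading BIGTAB, it is probably worth gathering statistics first so that
both versions of the query are optimized from the same inputs. A standard
DBMS_STATS call would do (adjust the parameters to your environment):

begin
  dbms_stats.gather_table_stats(
    ownname => user,
    tabname => 'BIGTAB',
    cascade => true);
end;
/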

Mladen, can you post one pair of the analytic and non-analytic SQL statements that use the bigtab table? I have the books "Expert One-on-One" and "Expert Oracle Database Architecture", but neither includes SQL statements that reference the bigtab table.
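
In the meantime, is the comparison along these lines? The following is
only my guess at the general shape of the two queries (finding the top
OBJECT_ID per OWNER), not the actual statements from page 517 of the book:

-- Analytic version: one pass over BIGTAB, MAX() OVER a window per OWNER
select owner, object_name, object_id
  from (select owner, object_name, object_id,
               max(object_id) over (partition by owner) max_id
          from bigtab)
 where object_id = max_id;

-- Self-join version: aggregate in an inline view, then join back to BIGTAB
select b.owner, b.object_name, b.object_id
  from bigtab b,
       (select owner, max(object_id) max_id
          from bigtab
         group by owner) m
 where b.owner = m.owner
   and b.object_id = m.max_id;

If the real queries are something different (running totals, RANK(),
LAG(), etc.), the relative cost of the WINDOW SORT step against the extra
table access could change quite a bit, which is why the exact statements
would help.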

Charles Hooper
IT Manager/Oracle DBA
K&M Machine-Fabricating, Inc.
