
Re: CBO optimizer

From: Niall Litchfield <>
Date: Thu, 2 Dec 2004 22:40:08 -0000
Message-ID: <41af99c2$0$19154$>

"qwerty" <> wrote in message news:conuif$l3b$
> Hi,
> I'm working with a 9.2 DB server. On this server we have a small number of
> static tables (i.e. containing configuration values) and a large number of
> dynamic tables.
> Such dynamic tables may be very small (fewer than 100 records) and some
> minutes later can be large (more than 100,000 records), and then again
> can become small, and so on...
> With this kind of table I find it very difficult to work with CBO
> optimization; in fact, if the analyze (using the dbms_stats package) runs
> while the tables are empty, the CBO will use a full table scan, which obviously
> is not good when the tables are full of data (more than 100,000
> records).
> I know that Oracle suggests calculating statistics often, and whenever it is
> known that the data has changed a lot, but on my DB those statistics
> take more than an hour to run (calculating them only on the dynamic
> tables), which is too long... I cannot wait that long before letting the
> application work.
> I'm considering different solutions (I cannot gather statistics when the
> application loads the tables, because I cannot change its code):
> 1) Estimate the statistics in order to reduce the execution time, and
> run them very often; this means sometimes the application performs
> well and sometimes not.
> 2) Compute statistics only when the table is full of data, one time, and
> never (or rarely) recalculate them; this means the application can
> perform well if the actual data is quite "similar" to the data on which
> the statistics were calculated. This method should probably give the
> better solution.
> Are there other solutions to this problem? Can you give me suggestions?

I have a question, a suggestion and a comment.

The question is: do your tables really change drastically in row count downwards as well as upwards? As a sweeping generalisation, large tables tend to stay large and increase in size, and small tables tend to stay small. The behaviour you describe would, in my experience, be a little unusual.

The suggestion is: you know that you can set stats using dbms_stats? If so, then your second option seems likely to make sense. Calculate the stats when the data is 'worst' for performance and then stick with them. Get your worst-performing time to perform adequately, and the chances are (though oddities do arise) that lighter-load times will not be as good as they could be, but still OK.
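By way of a sketch (the table name and the figures below are made up for illustration, not taken from your system), you could either gather stats once while the table is at its fullest and then leave them alone, or set representative values by hand with dbms_stats.set_table_stats so no scan of the table is needed at all:

    -- gather once while the table is at its fullest, then stop re-gathering
    begin
      dbms_stats.gather_table_stats(
        ownname => user,
        tabname => 'MY_DYNAMIC_TABLE',  -- hypothetical table name
        cascade => true);               -- gather index stats as well
    end;
    /

    -- or set plausible values directly, without scanning the table
    begin
      dbms_stats.set_table_stats(
        ownname => user,
        tabname => 'MY_DYNAMIC_TABLE',  -- hypothetical table name
        numrows => 100000,              -- expected row count at peak
        numblks => 2000,                -- assumed block count, adjust to suit
        avgrlen => 120);                -- assumed average row length in bytes
    end;
    /

Either way the optimizer then plans as if the table were large, which is the safe assumption for your worst-performing periods.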

The comment: I don't see anywhere that Oracle suggests 'calculating statistics often'.

Niall Litchfield
Oracle DBA 
Received on Thu Dec 02 2004 - 16:40:08 CST
