Re: Stale statistics

From: John Hurley <>
Date: Wed, 1 Jul 2009 05:58:42 -0700 (PDT)
Message-ID: <>

On Jun 30, 8:29 am, Nag <> wrote:


> Hi,
> We have an OLTP system on 10gR2 with RAC (3 nodes) with a lot of active
> tables, meaning lots of deletes, inserts, and updates. We are only gathering
> 1% sample statistics. After the system has been up for a couple of hours,
> all the statistics become stale. All our queries use bind variables
> and the cursor_sharing parameter is set to EXACT.
> Because of this, we only hard parse the first time, and the plan is
> set then. After that, even though the statistics are stale, it still uses
> the same plan.
> What kind of impact will stale statistics have on the system?
> Any input will be appreciated.

Don't forget about the changes version 10 introduced to how newly gathered statistics take effect.

By default now ... unless you invoke dbms_stats with NO_INVALIDATE set to FALSE ( backwards logic if any ever existed ... shouldn't it be INVALIDATE = TRUE? ) ... already-parsed active SQL statements do not get invalidated right away ... they stay around even though new statistics are in place.
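A minimal sketch of forcing immediate invalidation ( the schema and table names here are just placeholders, and the 1% sample matches what the original poster described ):

```sql
-- Gather stats and invalidate dependent cursors immediately.
-- SCOTT / EMP are placeholder names; the 10g default for
-- no_invalidate is DBMS_STATS.AUTO_INVALIDATE, not FALSE.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'SCOTT',
    tabname          => 'EMP',
    estimate_percent => 1,      -- the 1% sample from the question
    no_invalidate    => FALSE   -- invalidate cursors right away
  );
END;
/
```

With the default ( AUTO_INVALIDATE ) left in place, dependent cursors are instead invalidated gradually, which is where the behavior below comes from.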

This is a huge change in the behavior of 10 and above but many people seem to still be unaware of it.

If you run dbms_stats without setting the NO_INVALIDATE parameter ( i.e. like 99.9 percent of the world ), then you start running into the effects of another undocumented oracle parameter, new with 10: _optimizer_invalidation_period ...

So even when SQL statements logically should be re-parsed right away ... you wait ... a variable length of time, depending on how often a specific SQL statement keeps getting issued ( or not ) ... before the new statistics are reflected in the execution plan.
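If you want to see the parameter for yourself, something like this works when run as SYS ( a sketch only: the x$ fixed tables are undocumented and version-specific, and the commonly reported 10g default is 18000 seconds, i.e. a 5-hour rolling window within which cursors get invalidated ):

```sql
-- Show the hidden parameter's current value (run as SYS).
SELECT i.ksppinm  AS parameter_name,
       v.ksppstvl AS current_value
FROM   x$ksppi  i
JOIN   x$ksppcv v ON i.indx = v.indx
WHERE  i.ksppinm = '_optimizer_invalidation_period';
```

Lowering it shortens the window, but as with any underscore parameter, don't change it without Oracle Support's blessing.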

There's a Metalink doc id worth looking at on this, but geez, I am getting login errors right now.
