ANALYZE TABLE Questions

From: Charles D. Marcotte <cdm_at_idm>
Date: Wed, 8 Dec 1993 22:21:02 GMT
Message-ID: <CHqM33.770_at_idm.com>


We are going to use statistical optimization in Oracle 7. How often should we analyze our tables? (ANALYZE TABLE table_name ESTIMATE STATISTICS)
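For reference, these are the statement forms we are weighing (the table name "orders" is just an example):

```sql
-- Full scan of the table; exact statistics, but expensive on big tables
ANALYZE TABLE orders COMPUTE STATISTICS;

-- Sampled statistics; much cheaper (the sample size here is arbitrary)
ANALYZE TABLE orders ESTIMATE STATISTICS SAMPLE 20 PERCENT;

-- Remove the statistics, which sends queries on this table back to
-- rule-based optimization
ANALYZE TABLE orders DELETE STATISTICS;
```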

Once a week? Once a month? Every night?

I know it depends on the growth/shrinkage rate of the table, but are there any good general rules to follow ?

Should we re-analyze tables when they grow/shrink by 10% ? 20% ? 50% ?

Once a table is analyzed, will the optimizer make decisions based ONLY on the frozen statistics? What happens if the actual table triples in size, or an index is dropped?

If I analyze a table right after its creation (zero rows), and never analyze it again, will this screw up future query optimizations using this table? If I analyze it with just one row, will this also screw up the optimization?

Can a table's statistics be exported and imported?

What are the decisions and rules that the statistical optimizer uses? If you can't give out the details, how 'bout some general rules?

        Any help would be greatly appreciated!

                Chuck (cdm_at_idm.com)

Below is an explanation of why we are using statistical optimization:

We have found that we need to use the statistical optimizer to get proper nested loop plans with many of our queries using distributed joins. With the rule-based optimizer, we would always get merge/sort joins in distributed queries. When the remote table is over 70 Megs, a merge/sort join will try to copy a huge chunk of the remote table into the local table space. It just takes forever... We call situations like these 'Black Holes'. We have found many, many queries in our product that work fine in a single database but turn into 'Black Holes' when we distribute the tables across 2 or more systems.
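A sketch of the kind of query that goes wrong for us (all table, column, and database link names here are invented for illustration). With statistics in place, or with explicit hints like the ones below, the optimizer will drive a nested loop into the remote table instead of dragging it across the network:

```sql
-- cust is local; orders lives on the far side of a database link.
-- ORDERED fixes the join order to the FROM-clause order, and USE_NL
-- asks for a nested loop into the remote table, probing it row by row
-- rather than copying it locally for a sort/merge join.
SELECT /*+ ORDERED USE_NL(o) */
       c.cust_id, o.order_date
FROM   cust c,
       orders@remote_db o          -- database link name is made up
WHERE  o.cust_id = c.cust_id
AND    c.region  = 'EAST';
```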
