
Re: partitioning

From: Howard J. Rogers <howardjr2000_at_yahoo.com.au>
Date: Wed, 30 Oct 2002 05:57:06 +1100
Message-ID: <WJAv9.65255$g9.183272@newsfeeds.bigpond.com>

"Pablo Sanchez" <pablo_at_dev.null> wrote in message news:Xns92B661070A14Cpingottpingottbah_at_209.189.89.243...
> "Peter" <depend3_at_yahoo.com> wrote in
> news:wtyv9.184779$8o4.29325_at_afrodite.telenet-ops.be:
>
> > I know companies that never use CBO, just because statistics slow
> > down performance if they aren't taken regularly. If only one table
> > of a user has statistics taken from it, it greatly reduces the speed
> > of queries on all the other tables under RBO. In general an Oracle
> > database can perform very well without statistics and CBO. The
> > example I'm talking about is a system with tables of fewer than
> > 100,000 rows, some with queries joining tables on a date basis. I
> > saw a query improve from seven seconds to less than 1 or 2 secs just
> > by partitioning the tables and the indexes, without statistics and CBO.
>
> The ironic part is that in your example above, it's an excellent
> candidate for RBO: relatively small datasets so analyzes can be run
> periodically without affecting the online system.
>
> The 'issues' (and I intend that in quotes) with RBO come when your
> datasets span into the giga- and terabyte range, if not greater. Some
> RDBMS vendors allow one to output the statistics from one database to
> a file, which can then be imported to another database. The beauty
> here is that if you are mirroring your instances, you can run the
> statistics on your backup, and upload them into your primary.
>
> I expect Oracle to mimic Sybase's behavior relatively soon -- they
> 'borrow' everything else, so why not? <g>

Well, they've been doing it since at least 8.1.6. Check out dbms_stats. There are export_<xxx>_stats and import_<xxx>_stats procedures. Grab statistics from one table or schema into a 'stats table' using the right export_... procedure, use boring old 'exp' to pipe the statistics table into a traditional dump file, transfer the dump file to another database, use 'imp' to import the statistics table into the new database, and use the import_<xxx>_... procedure to transfer the statistics out of the statistics table into the data dictionary.

Voila.
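
For anyone who wants to see it end to end, something like this should do it (an untested sketch - the SCOTT schema, BIG_TABLE, STATS_TAB and stats.dmp are just placeholder names I've picked, not anything from the thread).

On the source database, in SQL*Plus:

  -- somewhere to park the statistics
  exec dbms_stats.create_stat_table(ownname => 'SCOTT', stattab => 'STATS_TAB');

  -- copy the dictionary stats for one table into the stats table
  -- (cascade => TRUE picks up its index and column stats too)
  exec dbms_stats.export_table_stats(ownname => 'SCOTT', tabname => 'BIG_TABLE', stattab => 'STATS_TAB', cascade => TRUE);

From the operating system, dump the stats table and move it across:

  exp scott/tiger tables=STATS_TAB file=stats.dmp

and on the target side:

  imp scott/tiger file=stats.dmp tables=STATS_TAB

Then, back in SQL*Plus on the target database:

  -- push the transferred statistics into the data dictionary
  exec dbms_stats.import_table_stats(ownname => 'SCOTT', tabname => 'BIG_TABLE', stattab => 'STATS_TAB', cascade => TRUE);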

Even more interestingly, you can use the dbms_stats.set_<xxx>_stats procedures to completely make up statistics for objects - to claim, for example, that a 14-row table (mentioning no names) actually has 14 billion rows in it, and to get the optimiser to behave accordingly.
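
For example (again only a sketch - the table name and the figures are invented):

  begin
    dbms_stats.set_table_stats(
      ownname => 'SCOTT',
      tabname => 'TINY_TABLE',
      numrows => 14000000000,  -- claim 14 billion rows
      numblks => 2000000);     -- and a block count to match
  end;
  /

Run an explain plan on a query against that table before and after, and watch the optimiser change its mind about access paths and join methods.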

DBMS_STATS is a seriously sexy package (and I probably need to get a life).

Regards
HJR
> --
> Pablo Sanchez, High-Performance Database Engineering
> http://www.hpdbe.com
Received on Tue Oct 29 2002 - 12:57:06 CST
