Re: Estimating effectivity of performance improvement measures
Date: Thu, 27 Nov 2008 02:07:16 -0800 (PST)
Message-ID: <bceba3fa-1529-4447-b1a5-b28da028af23@w34g2000yqm.googlegroups.com>
On Nov 26, 2:32 pm, Palooka <nob..._at_nowhere.com> wrote:
snip
> > I think you have a better handle on the issues than most people. It's
> > difficult to accept that the problem may have so many variables that
> > reactive empirical testing is the only way to go. Jonathan Lewis has
> > published some on attempting to figure out if the data layout would
> > benefit from reorganization, sometimes the data quickly goes back to
> > its old shape. I have at least one table/report combination that
> > runs noticeably slower (7 hours v. 5 hours) on newly added data
> > apparently because of the strangely skewed distribution on entry.
>
> You have reports which take 5-7 hours? Reorganising tables and
> rebuilding indexes is not going to help much.
>
> Redesign, introduce some redundancy through mviews, set up a data mart
> or something.
Well, actually that was Joel throwing in the long-running reports (not the OP, who is considering some reorgs/rebuilds).
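
On the "would the data layout actually benefit from a reorg" question, one quick sanity check before testing anything is to compare an index's clustering_factor against the table's block and row counts. Rough sketch only, and the table name is just a placeholder, not anything from the OP's schema:

-- Assumes statistics are reasonably current.
SELECT i.index_name,
       i.clustering_factor,
       t.blocks,
       t.num_rows
FROM   user_indexes i
       JOIN user_tables t ON t.table_name = i.table_name
WHERE  i.table_name = 'BIG_REPORT_TABLE';

If clustering_factor sits close to num_rows rather than close to blocks, rows for a given key value are scattered across the table, and a reorg (or a different load order) might be worth an empirical test -- which loops back to the point that you often just have to try it.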
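
And if the OP does go down the mview/data mart road Palooka suggests, the general shape is something like the below -- entirely made-up table and column names, and the refresh strategy would depend on the load pattern, so treat it as a sketch rather than a recommendation:

-- Pre-aggregated summary to take the repeated work out of the report query.
CREATE MATERIALIZED VIEW sales_daily_mv
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
  ENABLE QUERY REWRITE
AS
SELECT TRUNC(sale_date) AS sale_day,
       product_id,
       SUM(amount)      AS total_amount,
       COUNT(*)         AS txn_count
FROM   sales
GROUP  BY TRUNC(sale_date), product_id;

Whether you refresh complete or fast, and on demand or on commit, depends entirely on how the data arrives and how stale the report is allowed to be.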