Re: Estimating effectivity of performance improvement measures
Date: Mon, 1 Dec 2008 14:10:03 -0800 (PST)
On Nov 27, 2:07 am, hpuxrac <johnbhur..._at_sbcglobal.net> wrote:
> On Nov 26, 2:32 pm, Palooka <nob..._at_nowhere.com> wrote:
> > > I think you have a better handle on the issues than most people. It's
> > > difficult to accept that the problem may have so many variables that
> > > reactive empirical testing is the only way to go. Jonathan Lewis has
> > > published some on attempting to figure out if the data layout would
> > > benefit from reorganization, sometimes the data quickly goes back to
> > > its old shape. I have at least one table/report combination that
> > > runs noticeably slower (7 hours v. 5 hours) on newly added data
> > > apparently because of the strangely skewed distribution on entry.
> > You have reports which take 5-7 hours? Reorganising tables and
> > rebuilding indexes is not going to help much.
> > Redesign, introduce some redundancy through mviews, set up a data mart
> > or something.
> Well actually that was Joel throwing in the long running reports ( not
> the OP who is considering some reorgs/rebuilds ).
And of course, he was responding to me ;-)
I had indeed done what Palooka suggested on some related reports that were not finishing in a workday; with better hardware, they now run in a few minutes. For this one, the person can just wait. I could probably get it down to under an hour, or maybe a few minutes, but the business has more pressing concerns. I just wanted to point out the noticeable slowness with new data due to physical data distribution. I did have to spend time figuring out whether it had something to do with the upgrade, since at first it looked like what used to take 5 hours on the older, slower hardware on 9i now took 7 hours on newer, faster hardware on 10g; other considerations were confusing the issue.
--
@home.com is bogus. http://www.platform21.nl/page/382/en