
RE: normalization

From: Steve Adams <steve.adams_at_ixora.com.au>
Date: Wed, 2 Nov 2005 15:25:51 +1100
Message-ID: <002c01c5df65$8221c7b0$0a0a0a0a@ixora.com.au>


Hi Ryan,

Just test it at scale and measure it.
Of course, CPU usage is affected by many things, so it will not be easy to express the results in a way that might be meaningful to management. Nevertheless, in cases like this, I think that showing a non-trivial difference is enough.
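
For example, a rough SQL*Plus sketch along these lines could be used (the table emp_test, the trigger trg_emp_test and the 100,000-row loop are just placeholder assumptions; 'CPU used by this session' is the session CPU statistic in v$mystat, reported in centiseconds):

  rem Run the same workload twice, once with the trigger disabled and
  rem once with it enabled, and compare session CPU from v$mystat.

  column cpu_cs new_value cpu_before
  select s.value as cpu_cs
  from   v$mystat s, v$statname n
  where  n.statistic# = s.statistic#
  and    n.name = 'CPU used by this session';

  begin
    for i in 1 .. 100000 loop                 -- scale to a realistic volume
      insert into emp_test (id, val) values (i, 'x');
    end loop;
    commit;
  end;
  /

  select s.value - &cpu_before as cpu_used_cs -- CPU cost of this run
  from   v$mystat s, v$statname n
  where  n.statistic# = s.statistic#
  and    n.name = 'CPU used by this session';

  rem Then alter trigger trg_emp_test enable (or disable) and repeat.
  rem The difference between the two runs is the trigger's CPU cost for
  rem that many firings; divide by the row count for a per-firing figure.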

@ Regards,
@ Steve Adams
@ http://www.ixora.com.au/ - For DBAs
@ http://www.christianity.net.au/ - For all

-----Original Message-----
From: Ryan [mailto:ryan_gaffuri_at_comcast.net]
Sent: Wednesday, 2 November 2005 3:17 PM
To: steve.adams_at_ixora.com.au
Cc: 'Oracle-L Freelists'
Subject: Re: normalization

Steve,

Do you know of a way to take the data you presented and run a test showing how much additional CPU this will require under specific conditions?

If the trigger fires 'X' times in a given period, how much additional CPU will I need?
How much will this hurt response time and/or scalability?

If I were to present this to a manager, I would typically need to show something more tangible. Just because certain data elements that I gather from the data dictionary increase does not mean they will cause a problem for us, so I would need to show realistic estimates.

Any ideas on how to do this type of test?

--
http://www.freelists.org/webpage/oracle-l
Received on Tue Nov 01 2005 - 22:28:04 CST
