Oracle FAQ: Your Portal to the Oracle Knowledge Grid


RE: Question re statistics

From: Kerber, Andrew W. <>
Date: Fri, 30 Nov 2007 07:10:25 -0600
Message-ID: <>

You might try getting a good set of statistics for the largest size the table is likely to be, storing them in a stat table so you have a copy, and then locking the statistics on the table of concern.
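A minimal sketch of that approach using the DBMS_STATS package, assuming the ORG.REQUESTS table from the original message; the stat-table name SAVED_STATS is hypothetical:

-- Gather statistics while the table is at (or near) its maximum size
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ORG', tabname => 'REQUESTS');

-- Create a user stat table and keep a copy of the good statistics
EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => 'ORG', stattab => 'SAVED_STATS');
EXEC DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'ORG', tabname => 'REQUESTS', -
     stattab => 'SAVED_STATS');

-- Lock the statistics so scheduled jobs no longer overwrite them
EXEC DBMS_STATS.LOCK_TABLE_STATS(ownname => 'ORG', tabname => 'REQUESTS');

With the statistics locked, the every-few-minutes analyze job can be dropped, and the saved copy can be restored with DBMS_STATS.IMPORT_TABLE_STATS if they are ever lost.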

-----Original Message-----
From: [] On Behalf Of William Wagman
Sent: Thursday, November 29, 2007 7:12 PM
To:
Subject: Question re statistics


I have come up against this situation before and have never been quite sure how to handle it. We are running 10gR2 on RHEL 4 EE. In a home-grown application there is a series of tables in which records are either inserted or deleted, so the number of records in each table is constantly changing. It is not a large table; the maximum number of records at any one time is 5000, but it may vary anywhere between 0 and 5000. What they are seeing is that as changes to the table proceed, performance degrades, allegedly because the statistics become old and unhelpful. So what they have done is create a job which runs every few minutes and analyzes these tables so that the statistics stay fresh and performance stays up to their standards. Here is a description of one of the tables in question:

SQL> desc org.requests

 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 REQUESTOID                                NOT NULL RAW(16)
 REQUESTGROUPOID                           NOT NULL RAW(16)
 REQUEST_TYPE                              NOT NULL CHAR(1)
 REQUEST_PERSONOID                         NOT NULL RAW(16)
 OBJECT_TABLE_CODE                                  CHAR(2)
 ORIGINAL_OID                                       RAW(16)
 PENDING_OID                                        RAW(16)
 REQUEST_ACTION                                     CHAR(1)
 REQUEST_DATE                                       DATE

I haven't yet traced a session. I've looked at awr reports but am not seeing anything helpful there, yet.

So, my question, and I realize it is an open-ended one: can someone offer suggestions for resolving this issue, and things to look at to track down the problem? It just strikes me as very inefficient, silly almost, to analyze these tables every five minutes, but I don't know how else to approach a solution.


Bill Wagman
Univ. of California at Davis
IET Campus Data Center
(530) 754-6208




Received on Fri Nov 30 2007 - 07:10:25 CST
