
Re: HELP! --- Best way to delete large amounts of data.

From: Shawn Ellinger <sellinge_at_csac.com>
Date: Wed, 20 Dec 2000 02:26:25 GMT
Message-ID: <3A401644.A791668E@csac.com>

First of all, good luck. I know custom code takes a while to write and you feel good about it once it works, but processing 25M records is simply going to take a while. Still, there are a few areas you can look at to speed it up.

Find out how long it takes to process one main record and whether there are any slow spots. Since you are processing 25M of them, even a small delay per record adds up. Turn SQL tracing on (you can do this without shutting down the database) and analyze the trace file.
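
As a minimal sketch of that: tracing can be switched on for your own session, or for the running session from another one (the SID and SERIAL# below are placeholders you would look up in V$SESSION first), and the trace file is then formatted with tkprof. The file name and sort option are just examples.

    -- trace the current session
    ALTER SESSION SET sql_trace = TRUE;

    -- or trace another session (look up SID and SERIAL# in V$SESSION)
    EXECUTE sys.dbms_system.set_sql_trace_in_session(12, 345, TRUE);

Then from the OS prompt, something like:

    tkprof ora_12345.trc ora_12345.txt sort=exeela

The tkprof report shows which statements are burning the time.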

Set db_block_buffers to a higher value. This lets Oracle keep more data blocks cached in memory, so fewer reads have to go to disk.
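
Before bumping it, it is worth checking whether the cache really is the problem. A rough sketch, using the classic buffer cache hit ratio from V$SYSSTAT (1 - physical reads / (db block gets + consistent gets)); the setting itself lives in init.ora and takes effect at the next instance restart, and the value shown is only an example:

    -- current setting
    SHOW PARAMETER db_block_buffers

    -- raw numbers for the hit ratio
    SELECT name, value
      FROM v$sysstat
     WHERE name IN ('db block gets', 'consistent gets', 'physical reads');

    -- then in init.ora, for example:
    -- db_block_buffers = 20000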

How many records are you committing per transaction? If you commit too many at once, log_buffer may need to be increased; otherwise commit a smaller number of records at a time.
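
Here is a minimal sketch of deleting in batches with periodic commits; the table name, the archiving condition, and the batch size of 10000 are all placeholders you would replace with your own:

    BEGIN
      LOOP
        DELETE FROM big_table
         WHERE created_date < SYSDATE - 365   -- hypothetical archive condition
           AND ROWNUM <= 10000;               -- batch size, adjust to taste
        EXIT WHEN SQL%ROWCOUNT = 0;           -- stop when nothing left to delete
        COMMIT;
      END LOOP;
      COMMIT;
    END;
    /

    -- a nonzero, growing value here suggests log_buffer is too small
    SELECT value FROM v$sysstat
     WHERE name = 'redo buffer allocation retries';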

Where are the records being archived to? The same disk, a different disk, the same I/O channel? Distribute the load as evenly as possible.
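
To see which datafiles (and therefore which disks) the source and archive tables actually sit on, something along these lines works; the two table names are placeholders:

    -- where each tablespace's datafiles live
    SELECT tablespace_name, file_name
      FROM dba_data_files;

    -- which tablespaces the two tables are in
    SELECT table_name, tablespace_name
      FROM dba_tables
     WHERE table_name IN ('BIG_TABLE', 'BIG_TABLE_ARCHIVE');

If both tables end up on the same spindle, the delete and the archive insert will fight each other for I/O.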

Sometimes when you have looked at all this and it still takes three days... it still takes three days.

Shawn
