Oracle FAQ Your Portal to the Oracle Knowledge Grid
 

Home -> Community -> Usenet -> c.d.o.server -> Re: Most efficient large update?

Re: Most efficient large update?

From: Daniel A. Morgan <dmorgan_at_exesolutions.com>
Date: Sat, 26 May 2001 19:45:40 -0700
Message-ID: <3B106A54.64822210@exesolutions.com>

broom wrote:

> I'm trying to figure out the most efficient
> method of populating a field from one table
> to another. I've got to update about 100 million
> records, drawing from about 20 million.
>
> From_table:
>
> key - char(10)
> from_data - char(10)
>
> To_table:
> key - char(10)
> about 100 fields.
> to_data - char(10)
> 50 more fields.
>
> Goal is to populate the to_data
> with the from_data, by 'key'.
>
> I figure my best case would be a
> pl/sql loop, select bulk gathering
> 100,000 records at a time. For each
> of these loops, doing a bulk update
> of the target table, committing in
> batches of 100,000 records.
>
> I'm blathering based on wandering through
> some Oracle books, being a complete newbie
> to pl/sql.
>
> Would anybody be able to point me to an example
> of this type of code?
>
> Thanks.
>
> If possible, please email and I'll summarize.
>
> Barry
>
> broom-nospam_at_voicenet.com

If you try to update 100,000 records in one shot, you will likely not succeed because of rollback segment space.

The first step should be to configure the largest rollback segment extent size you can, and then determine, roughly, how many transactions it will hold. Something less than that is your commit point.
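As a starting point for that sizing exercise, the DBA views show how each rollback segment is currently configured. A minimal sketch (requires access to the DBA views):

```sql
-- Inspect rollback segment extent sizing
SELECT segment_name, tablespace_name,
       initial_extent, next_extent, max_extents, status
  FROM dba_rollback_segs;
```

From the extent sizes and the average undo generated per updated row, you can estimate how many rows fit in one transaction and pick a commit interval comfortably below that.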

Then write a stored proc to perform the updates with a counter determining how often to commit.
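A sketch of such a procedure, using the table and column names from the original post (from_table, to_table, key, from_data, to_data) and an assumed batch size of 10,000 — tune the LIMIT to your rollback capacity:

```sql
CREATE OR REPLACE PROCEDURE copy_from_data AS
  CURSOR c IS SELECT key, from_data FROM from_table;
  TYPE key_tab  IS TABLE OF from_table.key%TYPE       INDEX BY BINARY_INTEGER;
  TYPE data_tab IS TABLE OF from_table.from_data%TYPE INDEX BY BINARY_INTEGER;
  keys key_tab;
  vals data_tab;
BEGIN
  OPEN c;
  LOOP
    -- fetch the next batch of source rows
    FETCH c BULK COLLECT INTO keys, vals LIMIT 10000;
    EXIT WHEN keys.COUNT = 0;
    -- apply the whole batch as one bulk-bound UPDATE
    FORALL i IN 1 .. keys.COUNT
      UPDATE to_table
         SET to_data = vals(i)
       WHERE key = keys(i);
    COMMIT;  -- commit per batch to keep rollback usage bounded
  END LOOP;
  CLOSE c;
END;
/
```

One caveat: committing while the driving cursor is still open means later fetches read from an older snapshot, so a long run can hit ORA-01555 (snapshot too old) if the rollback segments are too small for the read side as well.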

Daniel A. Morgan

Received on Sat May 26 2001 - 21:45:40 CDT

