Re: What is the best way to do a 100,000 row update?

From: Fuzzy <grant@towersoft.com.au>
Date: 1996/04/26
Message-ID: <4lppfn$309@red.interact.net.au>


dixon2@ix.netcom.com (Mike Dixon) wrote:

>I have a stored procedure that will read through an item file and if
>the item changed then go change all the sales history that goes with
>that item. This could be done with one command but 100,000 rows would
>be just a little too big for the rollback segment. The table I will
>be changing will have over 12 million rows but it does have an index
>on the item column. What would be the most efficient way to process
>this?
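
For reference, I take it the "one command" version would be something like this (the table and column names are only guesses at your schema):

    UPDATE sales_history sh
       SET sh.item_no = (SELECT i.new_item_no
                           FROM item i
                          WHERE i.old_item_no = sh.item_no)
     WHERE sh.item_no IN (SELECT old_item_no
                            FROM item
                           WHERE new_item_no IS NOT NULL);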

Create a large rollback segment and force your transaction to use it. Try it with a 5, 10 or even 20 MB rollback segment and see what happens.
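
A minimal sketch of that, assuming a segment called big_rbs in a tablespace called rbs_big (both names are made up; use whatever suits your database):

    CREATE ROLLBACK SEGMENT big_rbs
      TABLESPACE rbs_big
      STORAGE (INITIAL 5M NEXT 5M MINEXTENTS 2);

    ALTER ROLLBACK SEGMENT big_rbs ONLINE;

    -- SET TRANSACTION must be the first statement of the transaction
    SET TRANSACTION USE ROLLBACK SEGMENT big_rbs;

    -- now run the big update (or your stored procedure), then COMMIT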

Alternatively, if you don't like forcing the transaction onto a particular segment, you could take all the other rollback segments offline, put the large segment online, perform the operation, and then restore the segments to their original status.
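
Again, only a sketch: rbs01 and rbs02 stand in for whatever your existing segments are really called. Note that you can't take the SYSTEM rollback segment offline, and a segment with active transactions won't go fully offline until they finish:

    ALTER ROLLBACK SEGMENT rbs01 OFFLINE;
    ALTER ROLLBACK SEGMENT rbs02 OFFLINE;
    ALTER ROLLBACK SEGMENT big_rbs ONLINE;

    -- ... perform the big update and commit ...

    ALTER ROLLBACK SEGMENT big_rbs OFFLINE;
    ALTER ROLLBACK SEGMENT rbs01 ONLINE;
    ALTER ROLLBACK SEGMENT rbs02 ONLINE;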

There are probably many more ways, and better ones too :-) :-) :-)

Ciao
Fuzzy
:-)


Grant Allen                    | "Woo Hoo!"
grant@towersoft.com.au            | H. Simpson
granta@student.canberra.edu.au    |

The contents of this post are my opinions only.

      If swallowed, seek medical advice.
