Oracle FAQ Your Portal to the Oracle Knowledge Grid

Re: Large set-based UPDATE

From: Billy Verreynne <vslabs_at_onwe.co.za>
Date: Wed, 12 Feb 2003 14:51:36 +0000
Message-ID: <b2dg1v$8oh$1@ctb-nnrp2.saix.net>


Paul wrote:

> I am reviewing an Oracle stored procedure (BULK_UPDATE) that makes a
> large set-based update to a large table (ACCOUNTS, 5 million rows).
>
> The problem we are encountering with this is that, because the UPDATE
> statement applies changes to a large number of rows at a time to the
> ACCOUNTS table (around 150,000), a web layer that also allows updates
> to this table is being regularly locked out for an unacceptable length
> of time.

<snipped>

> I was thinking a possible solution to this is to break the 150,000 row
> set based update into 150,000 updates to single rows by processing
> them through a cursor. I would then COMMIT within the cursor every
> 1000 rows.

Cursor-based processing, especially when a bulk update is broken into smaller chunks like that, will be (significantly, I think) slower.
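For reference, the approach being proposed would look roughly like the sketch below. This is a hypothetical illustration only: the column names (NEEDS_UPDATE, BALANCE) and the actual update logic are assumptions, not the OP's schema.

```sql
-- Hypothetical sketch of the proposed row-by-row approach.
-- NEEDS_UPDATE, BALANCE and the SET clause are placeholders.
DECLARE
  l_count PLS_INTEGER := 0;
BEGIN
  FOR r IN (SELECT rowid AS rid
            FROM   accounts
            WHERE  needs_update = 'Y') LOOP
    UPDATE accounts
    SET    balance = balance * 1.1
    WHERE  rowid = r.rid;

    l_count := l_count + 1;
    IF MOD(l_count, 1000) = 0 THEN
      COMMIT;  -- committing inside an open cursor loop risks
               -- ORA-01555 (snapshot too old), and a failure here
               -- leaves the table half-updated
    END IF;
  END LOOP;
  COMMIT;
END;
/
```

Besides being slower than a single set-based UPDATE, fetching across commits like this is exactly what sets up the failure and restart questions below.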

Will it solve the problem?

What guarantee do you have, if you lock 150k rows, that the web-based front-end will not still wait when it wants to update one of those locked rows?

What about data integrity? What happens when your loop fails after the 3rd iteration? Do you need to roll back the updates already made? (Not really possible once they are committed.) How do you determine where to restart your update loop? How do you stop the next batch update from running on a partially updated table and making an even bigger mess of the existing mess?
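If chunked commits are used at all, one way to answer the restart question is to record progress in the same transaction as the data change. The sketch below is one hedged possibility, not a recommendation: the UPDATE_STATUS column and its values are invented for illustration and would have to be added to the schema.

```sql
-- One possible restartable-chunk design: flag each row in the same
-- transaction as its update, so a failed run can simply be re-run.
-- UPDATE_STATUS ('PENDING'/'DONE') is an assumed column, not the
-- OP's actual schema.
DECLARE
  CURSOR c IS
    SELECT rowid AS rid
    FROM   accounts
    WHERE  update_status = 'PENDING';
  TYPE rid_tab IS TABLE OF ROWID;
  l_rids rid_tab;
BEGIN
  LOOP
    OPEN c;
    FETCH c BULK COLLECT INTO l_rids LIMIT 1000;
    CLOSE c;
    EXIT WHEN l_rids.COUNT = 0;

    FORALL i IN 1 .. l_rids.COUNT
      UPDATE accounts
      SET    balance       = balance * 1.1,  -- placeholder change
             update_status = 'DONE'          -- progress marker
      WHERE  rowid = l_rids(i);
    COMMIT;  -- data change and progress marker commit together,
             -- so a crash never leaves them out of step
  END LOOP;
END;
/
```

Opening and closing the cursor on each pass avoids fetching across a commit, and because DONE rows drop out of the driving query, a re-run after a failure resumes where the last committed chunk left off. None of this removes the locking conflict with the web layer, though.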

How do the web update and batch update relate to one another? Are they updating the same columns? What happens if one updates before the other?

An update should be made to fit the business requirements. It should be made to ensure data integrity. *That* is what a database must protect, before any consideration is given to making things faster. What will a 3-second response mean to the business if its data is suspect and integrity is not enforced?

I suggest dealing with this issue first at a higher level, before trying to solve it technically. Besides, breaking a large batch update up into smaller updates is not the only solution.

You need to look at business issues & requirements such as data integrity first. In particular, what business requirement gave rise to this update conflict? In general, it is a Very Bad Idea (tm) to have batch processing and OLTP processing on the same table.

Only then can you decide how best to address this particular conflict.

--
Billy
Received on Wed Feb 12 2003 - 08:51:36 CST
