Re: Massive Update commit every 1000 records
Date: Sat, 14 Nov 2009 07:11:40 +1100
Fetching in an array operation (FETCH .. BULK COLLECT INTO .. LIMIT ..) and then updating as an array operation (FORALL .. UPDATE ..) would be a lot more efficient than how you're proposing here.
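A minimal sketch of that pattern, using the table and column from the quoted post and the OP's 1000-row batch size (the cursor fetches ROWIDs rather than using FOR UPDATE, so the per-batch COMMIT does not invalidate the cursor, though fetching across commits still carries some ORA-01555 snapshot-too-old risk):

```sql
declare
  cursor c is
    select rowid from sales where ssn is not null;
  type rid_t is table of rowid index by pls_integer;
  rids rid_t;
begin
  open c;
  loop
    -- array fetch: up to 1000 rowids per round trip
    fetch c bulk collect into rids limit 1000;
    exit when rids.count = 0;
    -- array update: one bulk-bound statement per batch
    forall i in 1 .. rids.count
      update sales
         set ssn = to_number(substr(to_char(ssn), -4))
       where rowid = rids(i);
    commit;  -- commit per batch, as the OP wanted
  end loop;
  close c;
end;
/
```

This does the same work in roughly one UPDATE execution per 1000 rows instead of one per row, which is where most of the saving comes from.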
On Sat, Nov 14, 2009 at 6:50 AM, Roger Xu <wellmetus_at_gmail.com> wrote:
> Hi List,
> Background: 9i (184.108.40.206.0) HP-UX B.11.31 U ia64
> We need to shrink SSN to the last 4 digits.
> update sales set ssn=to_number(substr(to_char(ssn),-4));
> But the sales table is quite large and we want to commit every 1000 records.
> Does anyone have a better idea than the approach below?
> declare
>   cursor c12 is select ssn from sales for update of ssn;
>   n number := 0;
> begin
>   for eachc12 in c12 loop
>     if eachc12.ssn is not null then
>       n := n + 1;
>       update sales set ssn = to_number(substr(to_char(ssn), -4))
>         where current of c12;
>       if n = 1000 then
>         -- note: committing inside a FOR UPDATE cursor loop releases the
>         -- row locks and typically raises ORA-01002 on the next fetch
>         commit;
>         n := 0;
>       end if;
>     end if;
>   end loop;
>   commit;
> end;
> /