Faster way to update large amount of data [message #411141] Thu, 02 July 2009 02:50
carillpower
Messages: 48
Registered: April 2009
Location: malaysia
Member

Hi all gurus... I have a doubt here.

I have a procedure that updates data in every table that has a certain column. It looks like this:

CREATE OR REPLACE
PROCEDURE process_A IS

  CURSOR c1 IS
    SELECT DISTINCT table_name t_name FROM ...;

BEGIN

  FOR rec IN c1 LOOP

    EXECUTE IMMEDIATE
      'UPDATE ' || rec.t_name ||
      ' SET data_A = TRANSLATE(data_A, ''0123456789'', ''1234567890'')';

  END LOOP;

END process_A;


The process is quite simple, but there is so much data that it takes a very long time. Could anyone give me tips on how to make the procedure run faster?

Thanks a lot
Re: Faster way to update large amount of data [message #411142 is a reply to message #411141] Thu, 02 July 2009 02:56
Michel Cadot
Messages: 64103
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
There is none.
You can execute several procedures in parallel... if you have the I/O bandwidth for it.
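One way to run the updates in parallel is to submit several background jobs, each working on its own subset of tables. A minimal sketch, assuming process_A were rewritten to take a group number (a hypothetical parameter) so each job handles a different group of tables:

DECLARE
  l_job BINARY_INTEGER;
BEGIN
  FOR i IN 1 .. 4 LOOP
    -- each submitted job runs in its own server session
    DBMS_JOB.SUBMIT(l_job, 'process_A(' || i || ');');
  END LOOP;
  COMMIT;  -- the jobs only start running after the commit
END;
/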

Regards
Michel
Re: Faster way to update large amount of data [message #411148 is a reply to message #411141] Thu, 02 July 2009 03:03
JRowbottom
Messages: 5933
Registered: June 2006
Location: Sunny North Yorkshire, ho...
Senior Member
You could use the PARALLEL hint and make each update statement run in parallel.
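For the hint to apply to the DML itself, parallel DML must be enabled in the session first. A hedged sketch (the table name is a placeholder and 8 is an arbitrary degree of parallelism):

ALTER SESSION ENABLE PARALLEL DML;

UPDATE /*+ PARALLEL(t, 8) */ some_table t
   SET data_A = TRANSLATE(data_A, '0123456789', '1234567890');

COMMIT;

Note that a parallel DML transaction must be committed (or rolled back) before the session can query the modified table again.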
Re: Faster way to update large amount of data [message #411176 is a reply to message #411141] Thu, 02 July 2009 05:03
_jum
Messages: 508
Registered: February 2008
Senior Member
If there is an INDEX on data_A, the UPDATE could be faster with the index dropped or marked UNUSABLE. See the sample below. I simply measured the time difference; maybe the gurus have something to remark, especially for production?
DROP TABLE test_107 CASCADE CONSTRAINTS;

CREATE TABLE test_107 AS
        SELECT     TRUNC(SYSDATE - DBMS_RANDOM.VALUE * 365) datum,
                   DBMS_RANDOM.STRING('x', 20) col1
              FROM DUAL
        CONNECT BY LEVEL <= 100000;

SET TIMING ON;

UPDATE test_107
   SET col1 = TRANSLATE(col1, '0123456789', '1234567890');

CREATE INDEX indx_test_107 ON test_107(col1);

UPDATE test_107
   SET col1 = TRANSLATE(col1, '0123456789', '1234567890');

ALTER INDEX indx_test_107 UNUSABLE;

UPDATE test_107
   SET col1 = TRANSLATE(col1, '0123456789', '1234567890');

ALTER INDEX indx_test_107 REBUILD;

UPDATE test_107
   SET col1 = TRANSLATE(col1, '0123456789', '1234567890');

Table dropped.
Table created.
100000 rows updated.
Elapsed: 00:00:01.46
Index created.
Elapsed: 00:00:00.18
100000 rows updated.
Elapsed: 00:00:07.12
Index altered.
Elapsed: 00:00:00.00
100000 rows updated.
Elapsed: 00:00:01.70
Index altered.
Elapsed: 00:00:01.29
100000 rows updated.
Elapsed: 00:00:07.31

Re: Faster way to update large amount of data [message #411182 is a reply to message #411176] Thu, 02 July 2009 05:26
JRowbottom
Messages: 5933
Registered: June 2006
Location: Sunny North Yorkshire, ho...
Senior Member
The big disadvantage with this method is that it affects other users.

If you just perform the update and commit the changes when they're finished, the table remains completely usable to other users at all times.

If you make the indexes unusable, you run the risk of severely impacting other people's sessions.
Re: Faster way to update large amount of data [message #411506 is a reply to message #411182] Sat, 04 July 2009 07:48
rleishman
Messages: 3724
Registered: October 2005
Location: Melbourne, Australia
Senior Member
It will be many, many times faster to rebuild the table using CREATE TABLE ... AS SELECT....
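A hedged sketch of the CTAS approach for one table (table and column names are placeholders; grants, constraints, indexes and triggers on the original table must be recreated on the new one afterwards):

CREATE TABLE some_table_new NOLOGGING AS
  SELECT TRANSLATE(data_A, '0123456789', '1234567890') AS data_A,
         other_col1, other_col2   -- list the remaining columns explicitly
    FROM some_table;

DROP TABLE some_table;
RENAME some_table_new TO some_table;

This is faster largely because a direct-path CREATE TABLE ... AS SELECT avoids undo generation (and most redo with NOLOGGING), whereas a conventional UPDATE must log every changed row.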

Ross Leishman