moving data to another table [message #575233] Mon, 21 January 2013 10:05
kamilp
Messages: 10
Registered: August 2012
Junior Member
Hello !


I have imported data into the database with SQL*Loader into a flat staging table. Now I need to move the data from this table to another table. This is a production system and I must keep it online, so I decided to write a script that moves the data in small chunks and commits frequently, to avoid long waits and table locks.
I have a question about the script. I can do the bulk collect of rowids. Is it possible to optimize the insert and delete in a similar way, instead of doing the insert/delete in a loop for each rowid?

Regards,
Thanks

declare
  type t_rowids is table of rowid;
  rowids t_rowids;
begin
  loop
    -- fetch the next chunk of up to 999 rowids from the source table
    select rowid bulk collect into rowids
      from ims_old.values_f2
     where rownum < 1000;

    exit when rowids.count = 0;

    -- copy and remove each row one at a time -- can this be bulked too?
    for indx in 1 .. rowids.count loop
      insert into ims.values_f
        select * from ims_old.values_f2 where rowid = rowids(indx);
      delete from ims_old.values_f2 where rowid = rowids(indx);
    end loop;

    insert into status values (0);  -- progress marker
    commit;
  end loop;
end;
/
Re: moving data to another table [message #575234 is a reply to message #575233] Mon, 21 January 2013 10:10
BlackSwan
Messages: 22470
Registered: January 2009
Senior Member
A COMMIT inside a LOOP results in a LONGER elapsed time to completion and increases the odds that ORA-01555 (Snapshot Too Old) will get thrown.

A single INSERT INTO VALUES_F SELECT * FROM VALUES_F2 is the best solution!
Re: moving data to another table [message #575239 is a reply to message #575233] Mon, 21 January 2013 10:28
Michel Cadot
Messages: 58477
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
DBMS_PARALLEL_EXECUTE package was built for this need.
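[Editor's note: a minimal sketch of how DBMS_PARALLEL_EXECUTE could be applied to the tables in this thread. The task name, chunk size, and parallel level are illustrative, not taken from the thread; the package binds :start_id and :end_id per chunk and commits each chunk independently.]

```sql
begin
  dbms_parallel_execute.create_task(task_name => 'move_values_f2');

  -- split the source table into rowid ranges of ~1000 rows each
  dbms_parallel_execute.create_chunks_by_rowid(
    task_name   => 'move_values_f2',
    table_owner => 'IMS_OLD',
    table_name  => 'VALUES_F2',
    by_row      => true,
    chunk_size  => 1000);

  -- each chunk runs this block in its own transaction,
  -- committed on success; failed chunks can be retried
  dbms_parallel_execute.run_task(
    task_name      => 'move_values_f2',
    sql_stmt       => 'begin
                         insert into ims.values_f
                           select * from ims_old.values_f2
                            where rowid between :start_id and :end_id;
                         delete from ims_old.values_f2
                          where rowid between :start_id and :end_id;
                       end;',
    language_flag  => dbms_sql.native,
    parallel_level => 4);

  dbms_parallel_execute.drop_task(task_name => 'move_values_f2');
end;
/
```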

Regards
Michel
Re: moving data to another table [message #575265 is a reply to message #575233] Mon, 21 January 2013 16:08
kamilp
Messages: 10
Registered: August 2012
Junior Member
I found it: the FORALL statement.
Re: moving data to another table [message #575299 is a reply to message #575265] Tue, 22 January 2013 01:31
Michel Cadot
Messages: 58477
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
FORALL applies to only one statement, not two.
So you have to do two FORALL "loops".
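[Editor's note: a sketch of the original script rewritten with the two FORALLs described here, using the same table names as the opening post; untested.]

```sql
declare
  type t_rowids is table of rowid;
  rowids t_rowids;
begin
  loop
    -- next chunk of up to 999 rowids
    select rowid bulk collect into rowids
      from ims_old.values_f2
     where rownum < 1000;

    exit when rowids.count = 0;

    -- first FORALL: copy the chunk
    forall indx in 1 .. rowids.count
      insert into ims.values_f
        select * from ims_old.values_f2 where rowid = rowids(indx);

    -- second FORALL: delete the chunk
    forall indx in 1 .. rowids.count
      delete from ims_old.values_f2 where rowid = rowids(indx);

    commit;
  end loop;
end;
/
```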

Anyway, I maintain that you should use the DBMS_PARALLEL_EXECUTE package, as it is built for exactly this and supports the case where errors are raised: it can restart from where it stopped, not to mention that it can do the work in parallel if you want.

Regards
Michel
Re: moving data to another table [message #575386 is a reply to message #575299] Tue, 22 January 2013 15:55
Bill B
Messages: 1066
Registered: December 2004
Senior Member
Inserting data does not generate any table locks, other than preventing changes to the table structure. Inserts generate very little redo. Your users can't access the new data until it is committed. Just do:

insert into my_table
select * from test_table;

commit;
Re: moving data to another table [message #575387 is a reply to message #575386] Tue, 22 January 2013 16:03
BlackSwan
Messages: 22470
Registered: January 2009
Senior Member
> Inserts generate very little redo.

The above is NOT correct.
The whole INSERT statement, including all values of the new row, must exist in the REDO; to UNDO the INSERT, only the ROWID to be deleted is required.

With regard to DML and REDO, from largest to smallest:
INSERT
UPDATE
DELETE

With regard to DML and UNDO, from largest to smallest:
DELETE
UPDATE
INSERT