Big data Inserts [message #188465] Fri, 18 August 2006 12:47 Go to next message
Messages: 97
Registered: June 2005
Dear ALL,

Here is my concern. We have a huge table (>130 GB) and we need to keep a one-time copy of it with exactly the same data; we will call this copy old_table. It has to be done within the database, meaning no export/import is possible, since the user does not have access to the OS to build the dump files.

Is INSERT INTO old_table SELECT * FROM ... with APPEND/parallel hints the only option for this data copy? I am wondering whether the insert could create trouble for the undo segments, since only one commit would be done after the last rows are successfully inserted. However, my feeling is that direct-path inserts generate very little undo. What other options can you think of to work this out?

Has anyone done any work with bulk inserts, and do they have any advantage over a serial APPEND-hinted insert? Any help would be highly appreciated.
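For reference, a minimal sketch of the direct-path parallel approach being asked about (table names and the degree of parallelism are placeholders, not a tested recommendation):

```sql
-- Parallel DML must be enabled per session, or the PARALLEL hint
-- on the insert side is silently ignored.
ALTER SESSION ENABLE PARALLEL DML;

-- APPEND does a direct-path insert: rows are written above the
-- high-water mark and table data generates minimal undo.
INSERT /*+ APPEND PARALLEL(t, 4) */ INTO old_table t
SELECT /*+ PARALLEL(s, 4) */ *
FROM   big_table s;

-- Until this commit, the newly loaded segment cannot be queried
-- in the same session.
COMMIT;
```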

Re: Big data Inserts [message #188483 is a reply to message #188465] Fri, 18 August 2006 13:29 Go to previous messageGo to next message
Messages: 460
Registered: July 2005
Location: Stockton, California - US...
Senior Member
How about a flashback query, so you get a consistent image of the table as of a known point in time:

CREATE TABLE old_table AS
SELECT * FROM your_table
AS OF TIMESTAMP TIMESTAMP '2002-02-05 14:15:00';
Re: Big data Inserts [message #191827 is a reply to message #188465] Fri, 08 September 2006 05:03 Go to previous messageGo to next message
Messages: 201
Registered: August 2006
Location: Denmark
Senior Member

No details about your system and the table, so I assume 10gR2 and no LOBs etc...

You have many options. If this is really huge, parallel could be very nice.

Using flashback would require that your database has enough undo to hold all changes from the start to the end of your load.

Another approach could be to LOCK the table, to prevent any changes from occurring, and then insert say 1 million rows at a time, if much undo/redo is generated.

But why not try it out, to see what is feasible on your system?
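A rough sketch of the lock-and-batch idea, assuming a numeric key column id that can drive the batching (column and table names are hypothetical):

```sql
-- Block writers so the batches together form a consistent image.
LOCK TABLE big_table IN EXCLUSIVE MODE;

DECLARE
  v_lo    NUMBER;
  v_hi    NUMBER;
  c_batch CONSTANT NUMBER := 1000000;  -- ~1 million rows per chunk
BEGIN
  SELECT MIN(id), MAX(id) INTO v_lo, v_hi FROM big_table;
  WHILE v_lo <= v_hi LOOP
    INSERT INTO old_table
    SELECT * FROM big_table
    WHERE  id >= v_lo AND id < v_lo + c_batch;
    -- Caveat: COMMIT releases the table lock, so it must be
    -- re-acquired each iteration if writers must stay blocked.
    COMMIT;
    EXECUTE IMMEDIATE 'LOCK TABLE big_table IN EXCLUSIVE MODE';
    v_lo := v_lo + c_batch;
  END LOOP;
END;
/
```

Keying the batches on a range predicate rather than ROWNUM keeps each chunk a simple range scan instead of repeatedly re-reading skipped rows.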

Re: Big data Inserts [message #191868 is a reply to message #191827] Fri, 08 September 2006 07:48 Go to previous messageGo to next message
Messages: 3727
Registered: October 2005
Location: Melbourne, Australia
Senior Member
It's been nearly 3 weeks. I reckon @nmacdanalds' solution would probably have finished by now. Wink
Re: Big data Inserts [message #191896 is a reply to message #188465] Fri, 08 September 2006 09:17 Go to previous message
Messages: 52
Registered: August 2006
Location: India


You can use the UTL_FILE package to write the complete set of rows from the table out to an OS-level flat, text, or CSV file. All the records will come out of the Oracle table into the OS-level file.

You can also insert the rows into Oracle tables from a flat file. If you want your table partitioned, with the data stored physically and logically separately, you have to use SQL*Loader.

If you want, we have a solution.
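A minimal UTL_FILE sketch of the export idea (the directory object, file name, and columns are placeholders; a DIRECTORY object must already exist and the file is written on the database server, not the client):

```sql
-- Assumes: CREATE DIRECTORY dump_dir AS '/some/server/path';
-- and that the session owner has been granted WRITE on it.
DECLARE
  v_file UTL_FILE.FILE_TYPE;
BEGIN
  v_file := UTL_FILE.FOPEN('DUMP_DIR', 'old_table.csv', 'w');
  -- Write one comma-separated line per row.
  FOR r IN (SELECT col1, col2 FROM big_table) LOOP
    UTL_FILE.PUT_LINE(v_file, r.col1 || ',' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(v_file);
END;
/
```

Note that this only sidesteps the original poster's constraint if "no OS access" meant no client-side access: UTL_FILE writes server-side through the directory object, so no OS login is needed.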


Mohan Reddy G.