Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Loading 300 million records

From: AlexK <alexk84_at_hotmail.com>
Date: 6 Jun 2005 08:40:47 -0700
Message-ID: <1118072447.344657.22030@g49g2000cwa.googlegroups.com>


Hi All,

  I'm trying to improve the performance of a procedure that loads approximately 300 million records into a table. Currently the process uses 'INSERT /*+ APPEND */ ... SELECT' and takes about 10 hours to run on 10g. The SELECT joins about 5 tables, most of which are small except the driving table, which has about 300M records in it. I believe the indexes are good, as the explain plan shows only the main table getting a full table scan.

From what I've read online, 'INSERT ... SELECT' is the fastest and most efficient way to load data from one table into another. Unfortunately, the only examples I've seen that mention the number of records inserted deal with about 1 million records at most. Is this still the best approach to take when loading 300M records? Would a bulk collect or something else be better, since so many records are being processed? Any information would be greatly appreciated. Thanks in advance.
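For reference, a minimal sketch of the direct-path load pattern the post describes, extended with parallel DML (a common companion technique on Oracle 10g). All object names (big_target, big_source, dim1) and the degree of parallelism are hypothetical, not from the original post:

```sql
-- Parallel DML must be enabled at the session level before the insert,
-- or the PARALLEL hint on the insert side has no effect.
ALTER SESSION ENABLE PARALLEL DML;

-- APPEND requests a direct-path insert: rows are written above the
-- high-water mark, bypassing the buffer cache and conventional undo.
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO big_target t
SELECT /*+ PARALLEL(s, 8) */
       s.id, s.amount, d.description
FROM   big_source s
JOIN   dim1 d ON d.id = s.dim1_id;

-- The segment loaded with APPEND cannot be queried in this session
-- until the transaction commits.
COMMIT;
```

Other techniques often paired with this pattern for very large loads, assuming the table can tolerate them, are marking the target table NOLOGGING during the load and disabling or dropping indexes beforehand and rebuilding them afterwards, since index maintenance row by row is a large part of the cost.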

Alex

Received on Mon Jun 06 2005 - 10:40:47 CDT

