Re: Performance issue while loading large amount of data

From: Wanderley <wces123_at_yahoo.com>
Date: Sat, 11 Jan 2003 18:50:19 GMT
Message-ID: <LHZT9.34340$Pb.923348@twister.austin.rr.com>


Noons wrote:
> "EP" <ep_at_plusnet.pl> wrote in news:avopck$e4h$1_at_news.tpi.pl and I quote:
>
>>I am supposed to load a large amount of data from SQL scripts and I have
>>been wondering what is the best option:
>>- frequent commits and no rollback segment growth
>>- one commit and large rollback segments
>>What do you suggest?
>
> In-between.

I agree. As always, you have to experiment: find the sweet spot between no commits at all (which would require very large rollback segments) and committing too often (which would slow down your job).
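As a rough sketch of the "commit every N rows" approach (the stg_orders/orders table names and the 10000-row batch size are placeholders, and the record-based FORALL needs 9iR2 or later):

DECLARE
  CURSOR c_src IS SELECT * FROM stg_orders;
  TYPE t_tab IS TABLE OF stg_orders%ROWTYPE;
  l_rows  t_tab;
  c_batch CONSTANT PLS_INTEGER := 10000;  -- rows per commit; tune to your rollback sizing
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT c_batch;
    EXIT WHEN l_rows.COUNT = 0;
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO orders VALUES l_rows(i);
    COMMIT;  -- one commit per batch keeps rollback usage bounded
  END LOOP;
  CLOSE c_src;
END;
/

Raise the batch size until rollback segment growth (not commit frequency) becomes the limiting factor.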

Some methods are faster than others, though. For instance, depending on your version of Oracle and the kind of data you are loading (from flat files, from binary files, from other databases, etc.), you could use a direct path load with SQL*Loader.
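For the flat-file case, a minimal SQL*Loader control file might look like this (file name and column list are made up for illustration):

LOAD DATA
INFILE 'orders.dat'
APPEND
INTO TABLE orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(order_id, customer_id, order_date DATE "YYYY-MM-DD", amount)

Run it with direct=true on the sqlldr command line (e.g. sqlldr userid=scott/tiger control=orders.ctl direct=true) to get the direct path load, subject to the usual direct-load restrictions on indexes, triggers and constraints.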
