Re: data pump

From: rjamya <rjamya_at_gmail.com>
Date: Fri, 5 Apr 2013 07:48:42 -0400
Message-ID: <CAGurbTOSEJDxM9Jk4Bx+f_Dh-FJom4WvsE7NABb9sKM-gBv1-w_at_mail.gmail.com>



Oh yes, as Nigel said, if you create a proper delimited file, you can build independent SQL*Loader control files and fire them off in parallel. Much, much better than individual INSERT statements. Use the direct path option and load even faster.
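A minimal sketch of the delimited-file-plus-control-file approach. The table and column names (EMP, EMPNO, ENAME) and the pipe delimiter are assumptions for illustration; the sqlldr call is shown commented since it needs an Oracle client and credentials:

```shell
# Assumed sample data: pipe-delimited rows for a hypothetical EMP table.
cat > emp.dat <<'EOF'
1001|SMITH
1002|JONES
EOF

# A matching SQL*Loader control file describing the file layout.
cat > emp.ctl <<'EOF'
LOAD DATA
INFILE 'emp.dat'
APPEND
INTO TABLE EMP
FIELDS TERMINATED BY '|'
(EMPNO, ENAME)
EOF

# Direct-path load bypasses conventional INSERT processing
# (requires an Oracle client; shown for reference only):
# sqlldr userid=scott/tiger control=emp.ctl direct=true
```

Several such control files, each pointing at its own data file, can be launched as separate sqlldr sessions to load in parallel.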
If you want to apply data transformations while loading the raw data, see if you can use the external table interface. For large files, split them (I use the Unix split command) and then use the parallel option.
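The split step above can be sketched like this. The file name, chunk size, and the commented loader loop are assumptions for illustration:

```shell
# Generate a stand-in "large" delimited file (1000 rows) for the sketch.
seq 1 1000 | awk '{print $1 "|row" $1}' > big.dat

# Split into 250-line chunks: produces chunk_aa, chunk_ab, chunk_ac, chunk_ad.
split -l 250 big.dat chunk_

# Each chunk can then be fed to its own sqlldr session running in the
# background, so the chunks load concurrently (shown for reference only):
# for f in chunk_*; do
#   sqlldr userid=scott/tiger control=emp.ctl data="$f" direct=true &
# done
# wait
```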

If you are loading into an empty database, it might make sense to drop indexes and disable constraints first, then re-create the indexes and re-enable the constraints after the load. And don't forget stats :)
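A sketch of the post-load script that sequence implies. The object names (EMP, EMP_FK, EMP_PK_IDX) are hypothetical, and the sqlplus invocation is commented since it needs a running database:

```shell
# Write a post-load SQL script: re-enable constraints, rebuild indexes,
# and gather optimizer statistics. All object names are assumptions.
cat > post_load.sql <<'EOF'
ALTER TABLE EMP ENABLE CONSTRAINT EMP_FK;
CREATE INDEX EMP_PK_IDX ON EMP (EMPNO);
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'EMP')
EOF

# Run it once the load finishes (requires an Oracle client):
# sqlplus scott/tiger @post_load.sql
```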

Raj

On Fri, Apr 5, 2013 at 2:42 AM, jo <jose.soares_at_sferacarta.com> wrote:

> my real problem is to upload a database from postgres into Oracle, because
> it takes about 11 hours (using cx_Oracle)
>

> It takes about 11 hours to load a file of INSERT statements into Oracle.
>
> Is there a faster way to upload a db into Oracle from a text file ?
> something similar to postgres' COPY...
>

--
http://www.freelists.org/webpage/oracle-l
Received on Fri Apr 05 2013 - 13:48:42 CEST