Re: cursor loops
From: rjamya <rjamya_at_gmail.com>
Date: Wed, 30 May 2012 12:15:43 -0400
Message-ID: <CAGurbTPX+r1wqh4PHVwOFLfjGpYLoRdn-HOQt1vO1mFVwS-vVA_at_mail.gmail.com>
I second Jared's recommendation for using external tables. You can read them in parallel if large files can be split into multiple pieces. Plus you can use the magic of SQL to do the necessary transformations as well. I have had great success with them. In one case the preferred ETL tool couldn't do the job properly for a 6M-20M row file in the allotted time. We managed to split the file into pieces, used parallel processing (one thread for each file piece) against a single external table, and then used DBMS_ERRLOG to capture invalid data while loading into staging tables.
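A rough sketch of that approach, for anyone who hasn't tried it. All object names, file names, and the directory path here are made up for illustration; adjust column lists and access parameters to your data:

```sql
-- Directory object pointing at where the split file pieces live
CREATE OR REPLACE DIRECTORY etl_dir AS '/data/etl';

-- One external table over all the pieces; listing multiple files in
-- LOCATION plus the PARALLEL clause lets Oracle read them concurrently
CREATE TABLE ext_orders (
  order_id  NUMBER,
  cust_id   NUMBER,
  amount    NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('orders_1.csv', 'orders_2.csv', 'orders_3.csv')
)
PARALLEL 3
REJECT LIMIT UNLIMITED;

-- Create the error log table (ERR$_STG_ORDERS) for the staging target
BEGIN
  DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'STG_ORDERS');
END;
/

-- Parallel direct-path load; invalid rows land in the error log
-- instead of failing the whole statement
ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND PARALLEL(s, 3) */ INTO stg_orders s
SELECT order_id, cust_id, amount      -- SQL transformations go here
FROM   ext_orders
LOG ERRORS INTO err$_stg_orders ('batch1') REJECT LIMIT UNLIMITED;

COMMIT;
```

After the load, query ERR$_STG_ORDERS for the rejected rows and the Oracle error each one hit.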
Raj
On Wed, May 30, 2012 at 11:44 AM, Jared Still <jkstill_at_gmail.com> wrote:
> You may want to consider external tables and skip a lot of the code. It
> can be done much simpler with a SQL statement.
>
-- http://www.freelists.org/webpage/oracle-l
Received on Wed May 30 2012 - 11:15:43 CDT