Re: SQL Loader Vs DBLINK to Migrate Data

From: Tim Gorman <tim.evdbt@gmail.com>
Date: Thu, 3 Nov 2016 06:28:51 -0600
Message-ID: <5c0a683b-855a-3995-a885-b67f558e508e@gmail.com>



William,

Is this a one-time migration or a repeated process pipeline?

If it is a one-time migration, then a command like "INSERT /*+ APPEND PARALLEL */ INTO local-table SELECT /*+ PARALLEL */ ... FROM remote-table@remote-db" can transfer a remarkable amount of data in a very short time.
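To make that concrete, here is a minimal sketch of the pull-across-a-dblink approach, run from the target database. The table, link, and login names (local_tab, remote_tab, remote_db, scott/tiger) are all hypothetical placeholders; note that the PARALLEL hint on the insert only takes effect after parallel DML is enabled in the session:

    #!/bin/sh
    # One-shot pull of a remote table across a database link.
    # local_tab, remote_tab, remote_db, and scott/tiger are placeholders.
    sqlplus -s scott/tiger <<'EOF'
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(l) */ INTO local_tab l
    SELECT /*+ PARALLEL(r) */ * FROM remote_tab@remote_db r;
    COMMIT;
    EXIT
    EOF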

The fastest database migration I ever did involved using Perl to dump data into a UNIX FIFO (a.k.a. named pipe), then using "bbcp" to read from the FIFO across the network into another FIFO on the remote database server, then using SQL*Loader direct-path to load the data on the remote side. Breaking the job into separate stages let us tune each step independently, and "bbcp" transparently multi-threads and compresses the network transfer so that you can achieve near line-speed. For example, in the "dump" phase we started off using SQL*Plus but found that a simple Perl program kicks its butt by a factor of 10. Likewise, in the "network transfer" phase we were able to fiddle with TCP window sizes and TCP "jumbo frames" and re-test "bbcp" repeatedly until we found the ideal settings for throughput. If you need a good Perl program for dumping data in CSV format, check the four scripts comprising the DUMP_DATA package posted at http://evdbt.com/scripts.
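A rough sketch of how those stages fit together follows. Every file, host, table, control-file, and account name here is a hypothetical placeholder (dump_data.pl stands in for whatever CSV dumper you use), and the bbcp flags are the usual stream/window/compression/named-pipe options, worth re-checking against your bbcp version's help before relying on them:

    #!/bin/sh
    # --- target side (run first, on remotedb): create the receiving FIFO
    #     and start a direct-path load that reads from it.
    #   mkfifo /tmp/mytab.fifo
    #   sqlldr userid=scott/tiger control=mytab.ctl \
    #          data=/tmp/mytab.fifo direct=true &

    # --- source side: dump the table into a local FIFO ...
    mkfifo /tmp/mytab.fifo
    perl dump_data.pl MYTAB > /tmp/mytab.fifo &

    # ... and stream the FIFO to the remote FIFO with bbcp:
    #   -s 4  = four parallel TCP streams
    #   -w 8m = TCP window size
    #   -c    = compress in flight
    #   -N io = treat both source and target as named pipes
    bbcp -s 4 -w 8m -c -N io /tmp/mytab.fifo remotedb:/tmp/mytab.fifo

    wait

Because each stage is its own process, you can time and re-tune the dump, the transfer, and the load in isolation, which is exactly why the approach scales so well.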

However, that is a solution for a one-time migration. If you are building a repeated movement of data as an ongoing process, then I'm going to put my Delphix "vendor hat" on and suggest that you investigate data virtualization, which is designed for exactly this. Cloning data by repeatedly pushing and storing terabytes, as we have for the past 30+ years, is no longer the best solution, especially when cloning large datasets to support agile development or testing for continuous integration and continuous delivery. Data virtualization enables the fast provisioning and refresh of many full, independent read-write copies of data while using far less storage. If you need more information, please contact me off-list or check docs.delphix.com.

Hope this helps.

Thanks!

-Tim

On 11/1/16 07:37, William Ndolo Intertek wrote:
>
> I am setting up a process of moving data from UAT to Test environment
> and eventually to production.
>
> The databases are very small at this time (about 10G each for both
> Test and UAT).
>
> We expect to use the same method in production but the production
> databases are projected to grow rapidly into hundreds of terabytes.
>
> At this point, DBLINK seems to be doing the job, but we are
> considering SQL*Loader as an alternative for when the databases get
> large.
>
> There are many other tools/methods; however, we are looking for
> something simple that can be automated.
>
> Can anyone share their experience with both, and with other Oracle tools?
>
> Can anyone point me to Oracle documentation that makes that kind of
> comparison, or perhaps recommends one over the other?
>
> Thanks and best regards,
>
> Bill
>
