Re: SQL Loader Vs DBLINK to Migrate Data

From: Jack van Zanen <jack_at_vanzanen.com>
Date: Wed, 9 Nov 2016 10:11:08 +1100
Message-ID: <CAFeFPA8gmGA=cN1JgU-ZPy1i5FGLTnDC9hmDbhbywd3WTR5YQw_at_mail.gmail.com>



Well, you don't have to use port 1521.
You can run another listener on another port and, with some sqlnet.ora parameters, block everything but the production box from connecting. Or
use the firewall to allow traffic only between the production and non-production boxes on the new port (probably the better solution).
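
A minimal sketch of the sqlnet.ora approach (host names, IP address, and the port number below are placeholders, not from the original post):

```
# sqlnet.ora on the NON-production host: valid node checking rejects
# connections from every client except the ones listed as invited
TCP.VALIDNODE_CHECKING = YES
TCP.INVITED_NODES = (prodbox.example.com, 192.0.2.10)

# listener.ora: a second listener on a non-default port (1526 here),
# so the firewall rule can be scoped to just this port
LISTENER_MIG =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = nonprodbox.example.com)(PORT = 1526)))
```

Both the listener and any database registering with it must be restarted/re-registered for the changes to take effect.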

Jack van Zanen



This e-mail and any attachments may contain confidential material for the sole use of the intended recipient. If you are not the intended recipient, please be aware that any disclosure, copying, distribution or use of this e-mail or any attachment is prohibited. If you have received this e-mail in error, please contact the sender and delete all copies. Thank you for your cooperation

On Wed, Nov 9, 2016 at 7:07 AM, William Ndolo Intertek < william.ndolo_at_intertek.com> wrote:

> Thanks all for the response.
>
> It was a one-time activity, but now we are working towards a nightly
> migration process.
>
> Currently it is a small database (60 GB), migrating just a select group of tables.
>
> My choice is between dblink or customized stored procedures.
>
> DBLINK seems to be my preferred choice at this time.
>
> SQL*Loader will work, but we are mindful of non-English (multi-byte) characters.
>
> We are having some discussion about DBLINK security concerns,
> especially when port 1521 is open between prod and non-prod.
>
> Can anyone chime in on that kind of concern?
>
>
>
> Tim,
>
> Let’s have a chat sometime about data virtualization. Maybe this can be
> our long-term solution.
>
>
> Thanks and best regards,
>
>
>
>
>
> *From:* oracle-l-bounce_at_freelists.org [mailto:oracle-l-bounce_at_freelists.org]
> *On Behalf Of* Tim Gorman
> *Sent:* Thursday, November 03, 2016 7:29 AM
> *To:* ORACLE-L <oracle-l_at_freelists.org>
> *Subject:* Re: SQL Loader Vs DBLINK to Migrate Data
>
>
>
> William,
>
>
> Is this a one-time migration or a repeated process pipeline?
>
> If it is a one-time migration, then a command like "INSERT /*+ APPEND
> PARALLEL */ INTO local-table SELECT /*+ PARALLEL */ ... FROM
> remote-table_at_remote-db" can transfer a remarkable amount of data in a
> very short time.
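>
> As a concrete sketch of that pattern (table names, the database link name,
> and the degree of parallelism below are placeholders, not from Tim's post):

```sql
-- Assumes a database link MIGDB to the remote database already exists.
-- Parallel DML must be enabled in the session for the INSERT to run in parallel.
ALTER SESSION ENABLE PARALLEL DML;

-- Direct-path (APPEND), parallel insert pulling rows over the link
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO local_table t
SELECT /*+ PARALLEL(s, 8) */ *
FROM   remote_table@migdb s;

-- A direct-path insert must be committed before the session can
-- query the loaded table again
COMMIT;
```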
>
> The fastest database migration I ever did involved using Perl to dump data
> into a UNIX FIFO (a.k.a. named pipe), then using "bbcp" to read from the
> FIFO across the network to another FIFO on a remote database server, then
> using SQL*Loader direct-path to insert the data on the remote side.
> Breaking things up into multiple parts enabled us to tune each step
> separately, plus "bbcp" transparently multi-threads and compresses the
> network transfer so that you can achieve near line-speed. For example, in
> the "dump" phase, we had started off using SQL*Plus but found that a simple
> Perl program kicks its butt by a factor of 10. Also, in the "network
> transfer" phase, we were able to fiddle with TCP window sizes and TCP
> "jumbo frames" and re-test "bbcp" repeatedly until we found the ideal
> settings for throughput. If you need a good Perl program for dumping in
> CSV format, check the four scripts comprising the DUMP_DATA package posted
> at http://evdbt.com/scripts.
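>
> The three-stage pipeline above can be sketched roughly as follows; host
> names, file paths, and the exact bbcp flag spellings here are illustrative
> (check the bbcp man page before relying on them):

```shell
#!/bin/sh
# Stage 1: dump table rows as CSV into a local FIFO. "dump.pl" stands in
# for the DUMP_DATA scripts posted at evdbt.com/scripts.
mkfifo /tmp/dump.pipe
perl dump.pl --table BIG_TABLE > /tmp/dump.pipe &

# Stage 2: ship the FIFO's contents across the network to a FIFO on the
# remote database server; the stream count (-s) and TCP window size (-w)
# are the knobs worth re-testing, as described above.
bbcp -s 8 -w 8m /tmp/dump.pipe remotehost:/tmp/load.pipe

# Stage 3 (run on remotehost): SQL*Loader direct-path load reading
# straight from the remote FIFO:
#   sqlldr userid=scott control=big_table.ctl data=/tmp/load.pipe direct=true
```

> Splitting the work this way is what lets each stage be tuned independently.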
>
> However, that is a solution for a one-time migration. If you are building
> a repeated movement of data as an ongoing process, then I'm going to put my
> Delphix "vendor hat" on and suggest that you investigate data
> virtualization, which is designed for exactly this. Cloning data by
> pushing and storing terabytes repetitively as we have the past 30+ years is
> no longer the best solution, especially when cloning large datasets to
> support agile development or testing for continuous integration and
> continuous delivery. Data virtualization enables the fast provisioning and
> refresh of many full and independent read-write copies of data using less
> storage. If you need more information, please contact me off-list or check
> docs.delphix.com.
>
> Hope this helps.
>
> Thanks!
>
> -Tim
>
>
> On 11/1/16 07:37, William Ndolo Intertek wrote:
>
> I am setting up a process of moving data from UAT to Test environment and
> eventually to production.
>
> The databases are very small at this time (about 10 GB each for Test and
> UAT).
>
> We expect to use the same method in production but the production
> databases are projected to grow rapidly into hundreds of terabytes.
>
> At this point, DBLINK seems to be doing the job, but we are considering
> SQL*Loader as an alternative for when the databases get large.
>
> There are many other tools/methods; however, we are looking for something
> simple that can be automated.
>
> Can anyone share their experience with both and other oracle tools?
>
> Can anyone point me to Oracle documentation that makes that kind of
> comparison, or maybe recommends one over the other?
>
>
>
> Thanks and best regards,
>
>
>
> Bill
>
>
> Valued Quality. Delivered.
> ------------------------------
> CONFIDENTIALITY NOTICE
> This e-mail may contain confidential or privileged information, if you are
> not the intended recipient, or the person responsible for delivering the
> message to the intended recipient then please notify us by return e-mail
> immediately. Should you have received this e-mail in error then you should
> not copy this for any purpose nor disclose its contents to any other person.
>
> Intertek is dedicated to Customer Service and welcomes your feedback.
> Please visit http://www.intertek.com/email-feedback/ to send us your
> suggestions or comments. We thank you for your time.
>
> Except where explicitly agreed in writing, all work and services performed
> by Intertek is subject to our Standard Terms and Conditions of Business
> which can be obtained at our website: http://www.intertek.com/terms/
> Should you have any difficulty obtaining these from the web site, please
> contact us immediately and we will send a copy by return.
>

--
http://www.freelists.org/webpage/oracle-l
Received on Wed Nov 09 2016 - 00:11:08 CET
