Re: copy large amounts of data to MSSQL

From: Tim Gorman <tim.evdbt_at_gmail.com>
Date: Fri, 6 Nov 2020 09:12:44 -0800
Message-ID: <cd0bb9bf-0f4f-1df1-ec6c-6f8686232e54_at_gmail.com>



For the part about dumping to a delimited file, I posted a DUMP_DATA package with a CSV function to unload an entire table, a partition, or a tablespace's worth of partitions to CSV format.  Example scripts for calling it from Perl and from SQL*Plus are also posted.

Look at the bottom of the page https://evdbt.com/scripts...

Hope this helps...
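If you'd rather roll a quick unload by hand, the core of any such dumper is just fetch-and-write. A minimal Python sketch follows -- the literal rows, file name, and column names are made up for illustration, and a real cursor source plus the posted package's quoting/partition handling are far more complete:

```python
import csv

def dump_rows(rows, path, header=None):
    """Write an iterable of row tuples (e.g. a DB-API cursor) to a CSV file."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        if header:
            w.writerow(header)
        w.writerows(rows)

# Example with literal rows standing in for a cursor:
dump_rows([(1, "widget"), (2, "gadget, large")], "orders.csv",
          header=("id", "name"))
# orders.csv now holds delimited output that bcp or BULK INSERT can load.
```

The csv module's default minimal quoting handles embedded delimiters (note the second row), which is the usual trap with hand-rolled unloads.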

On 11/6/2020 5:10 AM, Shane Borden (Redacted sender sborden76 for DMARC) wrote:
> Just do a dump to a delimited file and use the BCP tool to load it.
>
> Shane Borden
> sborden76_at_yahoo.com
> Sent from my iPhone
>
>> On Nov 6, 2020, at 7:39 AM, Jeff Chirco <backseatdba_at_gmail.com> wrote:
>>
>> 
>> Thanks for the ideas. We want to avoid giving the cloud instance a
>> direct connection to our database, which is why we want to push
>> instead. That also works out better for our users: when they are done
>> processing orders, they initiate the send to the cloud vendor. This
>> will be a multi-day event to push to MSSQL.
>> I've been testing with 5000 rows; right now it is taking 2.5 minutes,
>> where before it was 6-8 minutes.  I believe the latency to OCI is
>> fluctuating. Currently I am getting 14ms ping times, but it is early
>> in the morning here in California as I am working on a 19c upgrade.
>> We've had ping latency in the upper-60ms range, and we've experienced
>> a lot worse around 8:30am, with packet loss.
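Those ping times line up with the row-by-row timing: each single-row insert over the link pays at least one network round trip. A back-of-the-envelope check (the 14 ms and 5000-row figures are from the message above; the batch size is an assumption, not from the thread):

```python
# Rough latency floor for row-by-row vs. batched inserts over a WAN link.
ROWS = 5000
RTT_S = 0.014  # 14 ms ping time quoted above

# Row-by-row: one round trip per insert (ignoring all server-side work).
row_by_row_s = ROWS * RTT_S

# Array/bulk insert: one round trip per batch (500 rows/batch is assumed).
BATCH = 500
batched_s = (ROWS / BATCH) * RTT_S

print(f"row-by-row network floor: {row_by_row_s:.0f} s")
print(f"batched (500/batch) floor: {batched_s:.2f} s")
```

At 14 ms the network alone accounts for roughly 70 of the 150 seconds observed; at the 60+ ms latencies mentioned, round trips alone exceed 5 minutes, which matches the earlier 6-8 minute runs.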
>>
>> I was thinking the same thing, that a Perl or Python script "might"
>> be faster.  Anyone happen to have a sample of that?
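The usual shape of such a script is: fetch from Oracle with a large cursor arraysize, then executemany() into SQL Server so each round trip carries hundreds of rows instead of one. A rough sketch -- cx_Oracle and pyodbc are the obvious driver choices, but the connect strings, table, and column names below are made up, and the DB wiring is left as comments; the batching helper is the part that matters:

```python
import itertools

def batches(iterable, size):
    """Yield lists of up to `size` items from any iterable (e.g. a cursor)."""
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical wiring -- the driver names are real, everything else is
# illustrative:
#
#   import cx_Oracle, pyodbc
#   src = cx_Oracle.connect("user/pw@onprem").cursor()
#   src.arraysize = 5000                      # fetch in large chunks
#   src.execute("select id, name from orders")
#   dst = pyodbc.connect("DSN=mssql_cloud")
#   cur = dst.cursor()
#   cur.fast_executemany = True               # pyodbc bulk-bind mode
#   for b in batches(src, 1000):
#       cur.executemany("insert into orders (id, name) values (?, ?)", b)
#   dst.commit()
```

With batching, the per-row network round trip that dominates the PL/SQL loop is amortized over each batch, which is where the speedup would come from.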
>>
>>
>> On Fri, Nov 6, 2020 at 1:35 AM Frank Gordon <frankagordon_at_gmail.com
>> <mailto:frankagordon_at_gmail.com>> wrote:
>>
>> Hello,
>>
>> Have you tried doing a pull from the MS-SQL-Server side?
>> How much data in the table?
>> How "far" apart are the Oracle and MS sides?
>> Have you tried sending one row and timing that?
>> Look at the MS-SQL-Server function OPENQUERY.
>>
>> Regards,
>> Frank
>>
>>
>> On Fri, Nov 6, 2020 at 12:31 AM Jeff Chirco
>> <backseatdba_at_gmail.com <mailto:backseatdba_at_gmail.com>> wrote:
>>
>> We have a new project where we need to copy a large dataset
>> from Oracle (on prem) to SQL Server (cloud).
>>
>> I found that just a
>> insert into table@mssql
>> select * from table;
>>
>> Returns ORA-02025: all tables in the SQL statement must be at
>> the remote database
>>
>> I am trying out the SQL Server Gateway driver from Oracle
>> because the documentation looks like it would work with the
>> Remote Insert Rowsource
>> https://docs.oracle.com/en/database/oracle/oracle-database/12.2/gmswn/database-gateway-for-sqlserver-features.html#GUID-513FBA3C-3458-4129-93E4-38DB2DF97F7A
>>
>> However I get the same error. Does anyone know if this should
>> work?
>>
>> We've converted it to a PL/SQL loop inserting row by row but
>> 5000 rows is taking 8 minutes. Oh and the funny thing is that
>> this SQL Server instance is running on Oracle Cloud
>> Infrastructure.
>>
>> Any other ways you can think of to do this quick? 8 minutes
>> is killer.
>>
>> Thanks,
>> Jeff
>>
>>
>>
>> --
>> +353-86-0695383
>>

--
http://www.freelists.org/webpage/oracle-l
Received on Fri Nov 06 2020 - 18:12:44 CET
