Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
Home -> Community -> Usenet -> c.d.o.server -> Re: Java/JDBC for dumping MS SqlServer rows to a text file ?
Alberto Dell'Era wrote:
<snipped>
> We must migrate just the data;
> Our task is complicated by us being located in Italy, where
> accented, non-ASCII characters are the norm. Our database has
> been of course already prepared with the correct character sets,
> and so on.
>
> We have evaluated different options (using the routines in sqlserver
> for direct migration; using the same routines for producing an
> intermediate dump file to be loaded by sql*loader; using oracle
> migration workbench; using the Oracle transparent gateways).
The size you mention, 3,500,000 rows at 1.5 KB per row (roughly 5 GB in total), is IMO fairly small.
SQL*Loader can easily deal with that. Here's a snip from a SQL*Loader log
file of data I loaded a week ago:
45,592,357 Rows successfully loaded.
Elapsed time was: 01:48:20.25
CPU time was:     00:10:05.93
The file loaded was 1,771 MB of compressed (i.e. zipped) data. Uncompressed, I think it is about 4 to 5 GB worth of data.
Thus, when it comes to getting data into Oracle, very little IMO beats SQL*Loader in terms of performance and flexibility.
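For reference, a minimal SQL*Loader control file for loading a delimited dump might look like the sketch below. The table name, column names, file names, and pipe delimiter are my assumptions for illustration, not details from the original post; WE8ISO8859P1 is Oracle's Western European (Latin-1) character set, which covers Italian accented characters.

```
-- customers.ctl (hypothetical control file; table and columns are assumed)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE 'customers.dat'
INTO TABLE customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  cust_id    INTEGER EXTERNAL,
  cust_name  CHAR,
  city       CHAR
)
```

It would then be run with something like `sqlldr userid=scott/tiger control=customers.ctl log=customers.log`, adding `direct=true` for a direct-path load if the table allows it.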
Getting data out of SQL-Server.. well, I can not recall having any unpleasant experiences with BCP on SQL-Server 4.x and 6.x.
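On the export side, a BCP invocation for this kind of dump might look like the sketch below. The server, database, table, and login are placeholders I made up; only the flags (`-c` for character mode, `-t` and `-r` for field and row terminators) are standard BCP options.

```
REM Hypothetical export; database, table, server and login are assumed.
bcp MyDb..customers out customers.dat -c -t "|" -r "\n" -S MYSERVER -U sa -P secret
```

The field terminator chosen here must of course match the `FIELDS TERMINATED BY` clause on the SQL*Loader side.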
You mentioned non-standard character sets. I have in the past loaded raw binary EBCDIC data on a Unix platform into Oracle, using SQL*Loader. EBCDIC to ASCII conversion was done in parallel with the loading via SQL*Loader, with the data being loaded as RAW (a column data type in Oracle) for further manipulation via SQL.
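To illustrate the kind of character-set conversion involved (outside SQL*Loader itself), here is a small Python sketch. The choice of cp500 (Python's "EBCDIC International" codec) and the sample Italian string are my assumptions, not details of the original load.

```python
# Sketch: round-tripping accented text through EBCDIC (code page 500).
# cp500 covers the Latin-1 repertoire, so Italian accented characters
# such as è survive the conversion intact.

def ebcdic_to_latin1(raw: bytes) -> bytes:
    """Decode EBCDIC (cp500) bytes and re-encode as ISO-8859-1 (Latin-1)."""
    return raw.decode("cp500").encode("latin-1")

# Hypothetical sample: an Italian word with an accent, as it would
# arrive in an EBCDIC dump file.
ebcdic_bytes = "caffè".encode("cp500")
print(ebcdic_to_latin1(ebcdic_bytes).decode("latin-1"))  # caffè
```

In the actual load described above, this step was done inside the database after loading the bytes as RAW; the sketch just shows that the mapping itself is a straightforward codec conversion.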
I suggest sticking to the standard tools, in this case BCP and SQL*Loader. Purchasing and/or testing other tools for a once-off data transfer... that IMO is a waste of time that could rather be spent on doing the transfer, dealing with the issues, and getting it done.
-- Billy
Received on Fri Jan 17 2003 - 09:36:04 CST