
Re: oracle table to flat file?

From: Mark Bole <makbo_at_pacbell.net>
Date: Fri, 06 Aug 2004 02:10:29 GMT
Message-ID: <pMBQc.865$wG2.340@newssvr29.news.prodigy.com>


Joel Garry wrote:

> "Howard J. Rogers" <hjr_at_dizwell.com> wrote in message news:<411020f1$0$25458$afc38c87_at_news.optusnet.com.au>...
>

>>"Chris" <cs123._no_spam__at_telstra.com> wrote in message
>>news:zfTPc.31685$K53.7730_at_news-server.bigpond.net.au...
>>
>>

[...]
>>
>>>>that you'll embed trailing spaces into your data.
>>>>
>>>>Regards
>>>>HJR
>>>>

[...]
>
> Don't take it personal, Chris, but I was trying to knock colsep. Any
> quick and dirty solution is going to have to cope with ampersands and
> commas and anything you can set colsep to, at least, and 8-bit
> characters and who-knows-what at worst. So you wind up eating dirt.
> Even old solid tools like exp can be befuddled by the odd stuff - I
> had a scare once when an import stripped the bit off a string that was
> actually being used as a numeric counter by a strange app.
>
> So I do another silly thing, and concatenate in separators that are
> _extremely_ unlikely to be in _my_ data, like @@@@@, and use awk and
> sed to get it in whatever format I want. Try putting some bizarro
> data in an Excel spreadsheet and export it to csv file and load it
> with sql*loader and then go the other way to get handy with the
> gotchas involved.
>
> And of course it is silly, because I actually do have some data that
> has @@@@@ in it, and didn't even notice it until I wrote this. It
> wasn't something that would have shot me in the foot until I did this
> on 1000 tables, but there you go. "Howard's way" is better, and the
> topic of Q&D is worth discussion.
>
> jg
> --

[...]
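For what it's worth, the quick-and-dirty approach Joel describes above (spool with an unlikely separator like @@@@@, then post-process) can be sketched as a small filter. I've used Perl here in place of awk/sed; the delimiter, query, and column names are only examples, not anyone's production script:

#!/usr/bin/perl
# Quick-and-dirty filter in the spirit of Joel's awk/sed step (Perl used
# here instead of awk/sed).  Assumes the data was spooled from SQL*Plus
# with a query along the lines of:
#   SELECT empno ||'@@@@@'|| ename ||'@@@@@'|| comm FROM emp;
# (table and column names made up for the example).
use strict;
use warnings;

while (my $line = <>) {
    chomp $line;
    # -1 keeps trailing empty fields when the last column is NULL
    my @fields = split /\@\@\@\@\@/, $line, -1;
    for (@fields) {
        s/^\s+|\s+$//g;                 # trim padding SQL*Plus may add
        s/"/""/g;                       # double any embedded quotes
        $_ = qq("$_") if /[",]/;        # quote fields with commas/quotes
    }
    print join(',', @fields), "\n";
}
# Joel's caveat still applies: this breaks as soon as the data itself
# contains @@@@@ or an embedded newline, which is exactly his point.

Usage would be something like: sqlplus -s scott/tiger @dump.sql | perl quickcsv.pl > emp.csv (names made up).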

Here's a challenge: convert any table (common relational data types, excluding the more exotic ones such as RAW, LONG, xLOB, BFILE, and ROWID; I haven't tried TIMESTAMP) from Oracle into a CSV file that can, in turn, be imported into MS Access without error, where the data may include commas, newlines (Un*x or Windows style), quotes, punctuation, and whatever else the Oracle 9i and MS Access character sets support for my locale.

I know it can be done using Perl DBI and other Perl modules (slightly off-topic here), and that approach works for what I think is a high percentage of real-world data.
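A minimal sketch of that kind of DBI approach (not my exact script) might look like the following. It assumes DBD::Oracle and Text::CSV are installed; the connect string, credentials, and table name (scott/tiger, EMP) are placeholders only:

#!/usr/bin/perl
# Sketch of the DBI route: let Text::CSV do the quoting so embedded
# commas, quotes, and newlines come out as legal CSV.
use strict;
use warnings;
use DBI;
use Text::CSV;

my $dbh = DBI->connect('dbi:Oracle:orcl', 'scott', 'tiger',
                       { RaiseError => 1, LongReadLen => 4000 });

# binary => 1 lets fields carry embedded newlines and 8-bit data
my $csv = Text::CSV->new({ binary => 1, eol => "\r\n" })
    or die Text::CSV->error_diag;

my $sth = $dbh->prepare('SELECT * FROM emp');
$sth->execute;

open my $out, '>', 'emp.csv' or die $!;
$csv->print($out, $sth->{NAME});            # header row from column names
while (my $row = $sth->fetchrow_arrayref) {
    $csv->print($out, $row);                # Text::CSV handles the quoting
}
close $out;
$dbh->disconnect;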

Bold claim: I don't think it's possible to code a SQL*Plus query that will correctly handle arbitrary VARCHAR2(4000) data with the above-mentioned edge cases and convert it into a CSV file that MS Access (or Excel, for that matter) can read without error.

Don't neglect byte length semantics vs. character semantics.
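With byte semantics, VARCHAR2(4000) means 4000 bytes, so a multi-byte character set holds fewer than 4000 characters; on the SQL side, compare LENGTH() with LENGTHB(). A tiny Perl illustration of the gap (UTF-8 assumed purely for the example):

#!/usr/bin/perl
# Why length semantics matter: a string of N characters can be well over
# N bytes once encoded, so buffers and column sizes counted in characters
# don't line up with limits counted in bytes.
use strict;
use warnings;
use Encode qw(encode);

my $s     = "\x{00e9}" x 4000;            # 4000 characters of e-acute
my $bytes = encode('UTF-8', $s);          # each one is 2 bytes in UTF-8

printf "characters: %d   bytes: %d\n", length($s), length($bytes);
# characters: 4000   bytes: 8000 -- too big for VARCHAR2(4000) under
# byte semantics, and a classic source of silent truncation.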

--Mark Bole
