Re: Convert CSV data held in a CLOB into columns

From: Mark D Powell <Mark.Powell_at_eds.com>
Date: Fri, 7 Dec 2007 17:50:38 -0800 (PST)
Message-ID: <93697bc7-3f2d-4230-992e-db710f35ae57@y43g2000hsy.googlegroups.com>


On Dec 7, 5:05 pm, jeremy <jeremy0..._at_gmail.com> wrote:
> Oracle 10gR2
>
> Would like to be able to support upload of a CSV file and then populate
> a "staging" table with its contents.
>
> - use first line of file as column names
> - load lines 2 to end of file as rows into a table, populating only
> those columns identified in the header row
>
> The thought was we could load the file into a BLOB (I should mention
> we're using OHS / mod_plsql so this is a standard file upload), convert
> it to a CLOB and then (somehow) parse the content.
>
> Anybody have any pointers to examples where this has been done before?
>
> Other suggestions? Have thought about external tables - perhaps that is
> a better approach?
>
> --
> jeremy

What is the primary reason you want to parse the column names out of the same file that carries the data? For the target, why can't missing data values simply be missing, i.e. two delimiters in a row with nothing between them?

One idea to consider would be to define a generic external load table where every column is VARCHAR2(4000), and then just use SQL to convert each column to the proper target table type.
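
A minimal sketch of such a generic external table, assuming a directory object called CSV_DIR pointing at the upload area and an arbitrary five-column layout (the directory, file, and column names here are just placeholders):

  -- Generic staging external table; every field lands as text and is
  -- converted later with SQL. Names and column count are placeholders.
  CREATE TABLE csv_stage_ext (
    col1  VARCHAR2(4000),
    col2  VARCHAR2(4000),
    col3  VARCHAR2(4000),
    col4  VARCHAR2(4000),
    col5  VARCHAR2(4000)
  )
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY csv_dir
    ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('upload.csv')
  )
  REJECT LIMIT UNLIMITED;

You would then move the data with a plain INSERT ... SELECT that applies TO_NUMBER, TO_DATE, etc. to the columns you actually need.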

If you open a select on the external table as a cursor, you could fetch the first row and use its column names (and maybe the target table name) to build a dynamic SQL insert statement string with bind variables. Then, for cursor rows 2 through end of file, you just fire the dynamic insert.
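
A rough PL/SQL sketch of that idea, assuming the generic external table above, a fixed three-column layout, and that a serial scan returns rows in file order; all names are placeholders, and real code should validate the header names (e.g. with DBMS_ASSERT) before concatenating them into SQL:

  DECLARE
    l_sql   VARCHAR2(4000);
    l_first BOOLEAN := TRUE;
  BEGIN
    FOR r IN (SELECT col1, col2, col3 FROM csv_stage_ext) LOOP
      IF l_first THEN
        -- Header row: its values become the target column list.
        l_sql := 'INSERT INTO target_tab ('
                 || r.col1 || ',' || r.col2 || ',' || r.col3
                 || ') VALUES (:1, :2, :3)';
        l_first := FALSE;
      ELSE
        -- Data rows: fire the dynamic insert with bind variables.
        EXECUTE IMMEDIATE l_sql USING r.col1, r.col2, r.col3;
      END IF;
    END LOOP;
    COMMIT;
  END;
  /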

Unless the data source does a good job of formatting and verifying the data, error handling can be a problem with overly generic approaches.

You should also consider just defining a SQL*Loader (sqlldr) control file for each expected external file that you are going to load into a target or holding (processing) table.
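
For example, a control file for one expected layout might look something like this (table, file, and column names are placeholders; SKIP=1 discards the header row):

  OPTIONS (SKIP=1)
  LOAD DATA
  INFILE 'upload.csv'
  APPEND
  INTO TABLE stage_tab
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  TRAILING NULLCOLS
  (
    col1,
    col2,
    col3
  )

and be run from the OS with something like sqlldr control=load_stage.ctl, supplying your own connect string and log file.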

Perl or Pro*C could be used at the OS level to parse the file and submit the records to the database via inserts if you need to control the timing of the task from outside the database. The external table approach would be good when you want to trigger the load from within a database task.

HTH -- Mark D Powell --
