How to manipulate data read from file using utl_file package - 10g - xp sp3 [message #409129] Fri, 19 June 2009 10:17
adusur
Messages: 36
Registered: June 2009
Location: Bhubaneswar
Member
Hi,
There is a requirement for me:

I will get a file containing data about a table (its columns). I need to read the data from that file, generate a CREATE TABLE script for that data, and append that script to the same file.
I am new to Oracle technology. I know how to read and write data to files, but I don't know how to manipulate the data using UTL_FILE.

Please give me some idea of how to manipulate the data read with the UTL_FILE package.
Thanks in advance,
Ravindra.
Re: How to manipulate data read from file using utl_file package [message #409132 is a reply to message #409129] Fri, 19 June 2009 10:36
Kevin Meade
Messages: 2103
Registered: December 1999
Location: Connecticut USA
Senior Member
What is there to know? Open up the manual and read about it. Do a Google search and look for examples.

The good news is that if you read/write files in any other language, it is the same with UTL_FILE. Instead of a COBOL OPEN command, you use the UTL_FILE.FOPEN command, and so on. Since you know how to do this in other languages, you should be all set. Sequential file processing is sequential file processing.
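
For instance, a minimal read loop might look like this (the directory object DATA_DIR and the file name table_cols.txt are made-up placeholders; substitute your own):

DECLARE
   f      UTL_FILE.FILE_TYPE;
   line   VARCHAR2 (4000);
BEGIN
   f := UTL_FILE.FOPEN ('DATA_DIR', 'table_cols.txt', 'R', 4000);   -- 'R' = read mode
   LOOP
      BEGIN
         UTL_FILE.GET_LINE (f, line);   -- raises NO_DATA_FOUND at end of file
      EXCEPTION
         WHEN NO_DATA_FOUND THEN EXIT;
      END;
      -- manipulate the line here: parse it, build your CREATE TABLE text, etc.
      DBMS_OUTPUT.PUT_LINE (line);
   END LOOP;
   UTL_FILE.FCLOSE (f);
END;
/

To append the generated script to the same file, reopen it with FOPEN in 'A' (append) mode and write with UTL_FILE.PUT_LINE.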

The one thing to remember is that Oracle will need access to the directories where the files are stored. So you may sometimes get a file-open error, and the issue will not be that the file does not exist, but rather that you have not set Oracle up to be allowed to read the directory where the file resides. For more info, contact your DBA, who should know how to do this.
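
On 10g that normally means a DIRECTORY object created by a privileged user; a sketch (the path and the grantee are just examples):

-- the OS path must already exist on the database server
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/oracle/files';
GRANT READ, WRITE ON DIRECTORY data_dir TO your_schema;   -- your schema here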

Good luck, Kevin
Re: How to manipulate data read from file using utl_file package [message #409145 is a reply to message #409129] Fri, 19 June 2009 12:18
andrew again
Messages: 2577
Registered: March 2000
Senior Member
This sounds like a really fuzzy requirement. Assuming you know what field delimiter is used, you probably wouldn't have any guarantee of what the datatypes for each field are without testing every single value in each column. One way you could approach it is to treat everything as character data and load it into generic tables with enough columns to cater for the maximum number of fields you would get in any file.

After that, you can easily "profile" each generic column to see what its datatype is, its nullability, maximum string lengths, etc. Based on the outcome of that you could derive a table with the minimum number of columns to satisfy that file's data, along with the datatypes and nullability of the columns. You'd probably test in this order: date --> number --> string.

-- flex format date converter
http://examples.oreilly.com/oraclep3/individual_files/datemgr.pkg

-- is number? (returns 'Y'/'N' rather than BOOLEAN so it can be called from SQL)
CREATE OR REPLACE FUNCTION tonumber (p_value IN VARCHAR2)
   RETURN VARCHAR2
IS
   val NUMBER;
BEGIN
   val := TO_NUMBER (p_value);   -- succeeds only if the string is numeric
   RETURN 'Y';
EXCEPTION
   WHEN VALUE_ERROR
   THEN
      RETURN 'N';
END tonumber;
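
A quick sanity check from SQL, with the expected results in the comments:

SELECT tonumber ('123.45') FROM DUAL;   -- 'Y'
SELECT tonumber ('12x45')  FROM DUAL;   -- 'N'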

-- you can even test for XML or whatever
CREATE OR REPLACE FUNCTION is_xml (p_str IN VARCHAR2)
   RETURN VARCHAR2
IS
   l_xml_dummy   XMLTYPE;
   bad_xml       EXCEPTION;
   PRAGMA EXCEPTION_INIT (bad_xml, -31011);   -- ORA-31011: XML parsing failed
BEGIN
   l_xml_dummy := SYS.XMLTYPE.createxml (p_str);
   RETURN 'Y';
EXCEPTION
   WHEN bad_xml
   THEN
      RETURN 'N';
END is_xml;
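
Putting it together, a rough profiling query might look like this (stage_tab and its generic VARCHAR2 column col1 are made up for illustration):

SELECT COUNT (*)                                               AS total_rows,
       COUNT (col1)                                            AS non_null_rows,
       MAX (LENGTH (col1))                                     AS max_len,
       SUM (CASE WHEN tonumber (col1) = 'N' THEN 1 ELSE 0 END) AS non_numeric
  FROM stage_tab;

If non_numeric is 0 the column could be declared NUMBER, and if non_null_rows equals total_rows it could be NOT NULL; otherwise fall back through the date --> number --> string order above.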


