UTL_FILE vs External Tables (merged by bb) [message #379868] Thu, 08 January 2009 03:41
wmgonzalbo
Messages: 98
Registered: November 2008
Member
Hi Experts,

Please bear with me with my lengthy explanation, just want to make things clear.

I need to read a flat file (I am not yet sure about the format, etc.) from a directory, then take the fields from the data, process them, do computations, etc., and write the results to an output file.

Initially I'm planning to:

* read them line by line
* store them in a collection or array or variable etc.
* process them, do computations etc.
* write/append to a file output.

I tried it using UTL_FILE.FOPEN and UTL_FILE.GET_LINE. Here's a sample code.

set serveroutput on;

DECLARE

 targetFile utl_file.file_type; 
 fileRaw4  VARCHAR2(50 BYTE) := 'test_file1.csv';  -- or test_file2.csv
 v_buffer  VARCHAR2(32767 BYTE);

BEGIN

  targetFile := utl_file.fopen ('TEST_DIR1_ORA', fileRaw4 ,'R');
  utl_file.get_line (targetFile,v_buffer);
  dbms_output.put_line (v_buffer);   
  utl_file.fclose(targetFile);
  
END;


It works when my test_file1.csv is small (71 bytes), but when my test input is test_file2.csv (101001 bytes) I encounter this error:

ORA-29284: file read error
ORA-06512: at "SYS.UTL_FILE", line 98
ORA-06512: at "SYS.UTL_FILE", line 656
ORA-06512: at line 18


Is my input file too large for the buffer, or did some other error occur? If it is too large, any ideas on how I can read my file part by part (an average file is about 10 MB or more)?

/*****/

After doing some research in this forum, I found several posts stating that for reading large data it is faster and easier to use external tables (I haven't tried them yet).

Will it really be faster to import my data into a table first and then process it, rather than reading the file line by line and processing it immediately?

And how can I access and process individual fields in these external tables?

The use of external tables is still not clear to me; I would appreciate any useful information or links on this.

I am using TOAD. It has a tool for reading Excel and .csv files, but I need to read and process all of this inside a procedure.

I hope my explanation is clear; alternative ideas or tips would be really appreciated. Good day.

Thanks in advance,
Wilbert
Re: UTL_FILE vs External Tables [message #379871 is a reply to message #379868] Thu, 08 January 2009 03:47
Michel Cadot
Messages: 64107
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Quote:
reading large data it is faster and easier to use External tables

Yes it is.

Quote:
Will it really be faster to import my data into a table first and then process it, rather than reading the file line by line and processing it immediately?

An external table is a pointer to the external file, it does not load anything in an Oracle table.
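For illustration only (hypothetical table name, not one of your objects): the file is parsed each time you query the external table, and nothing is stored inside the database.

SELECT COUNT(*) FROM my_ext_table;   -- reads the file as it is right now
-- replace or edit the file on the server's file system, then simply query again:
SELECT COUNT(*) FROM my_ext_table;   -- new contents are visible, no load or refresh step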

Quote:
The use of external tables is still not clear to me; I would appreciate any useful information or links on this.

http://www.oracle.com/pls/db102/search?remark=quick_search&format=ranked&word=external+table&tab_id=

Regards
Michel
Re: UTL_FILE vs External Tables [message #379873 is a reply to message #379868] Thu, 08 January 2009 03:51
S.Rajaram
Messages: 1027
Registered: October 2006
Location: United Kingdom
Senior Member
My thoughts on this.

Check this link for more information about utl_file

http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABDEJDH

Quote:

ORA-29284: file read error


Since you have not specified a linesize when opening the file, it defaults to 1024. If any line is longer than 1024 characters you will get this error. To overcome it, specify max_linesize in FOPEN (the maximum allowed is 32767).
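For example, a minimal sketch based on your own block, with the max_linesize argument added to FOPEN (directory and file names unchanged from your post):

DECLARE
   targetFile utl_file.file_type;
   v_buffer   VARCHAR2(32767 BYTE);
BEGIN
   -- fourth argument is max_linesize; without it the default of 1024 applies
   targetFile := utl_file.fopen('TEST_DIR1_ORA', 'test_file2.csv', 'R', 32767);
   utl_file.get_line(targetFile, v_buffer);
   dbms_output.put_line(v_buffer);
   utl_file.fclose(targetFile);
END;
/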

Moreover, for your requirement I would be inclined to use an external table. It is simpler and easier to use: you can treat your file just like any other table and harness the power of SQL. Since you want to write the enriched information out to another file, I would use UTL_FILE for the output. If you are really technically inclined, you could use Pro*C instead; the reason is that it is very fast and efficient.
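A rough sketch of that external-table-plus-UTL_FILE combination (assuming an external table like the TEST_TABLE1 you create further down this thread, and a hypothetical output file name 'output.txt'):

DECLARE
   out_file utl_file.file_type;
BEGIN
   out_file := utl_file.fopen('TEST_DIR1_ORA', 'output.txt', 'W', 32767);
   FOR rec IN (SELECT id, a_party_id FROM test_table1) LOOP
      -- do the computations / enrichment here, then write the result
      utl_file.put_line(out_file, rec.id || ',' || rec.a_party_id);
   END LOOP;
   utl_file.fclose(out_file);
EXCEPTION
   WHEN OTHERS THEN
      IF utl_file.is_open(out_file) THEN
         utl_file.fclose(out_file);
      END IF;
      RAISE;
END;
/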


Hope this helps.

Regards

Raj
Re: UTL_FILE vs External Tables (merged) [message #379874 is a reply to message #379868] Thu, 08 January 2009 03:54
boukerker
Messages: 11
Registered: January 2008
Location: Paris
Junior Member
Limit your buffer size using GET_LINE(file, buffer, len);
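Something like this (the value 32767 is just an illustration; len defaults to the max_linesize given in FOPEN):

utl_file.get_line(targetFile, v_buffer, 32767);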
Regards.
Re: UTL_FILE and External Tables [message #380007 is a reply to message #379868] Thu, 08 January 2009 19:18
BlackSwan
Messages: 25036
Registered: January 2009
Location: SoCal
Senior Member
>* read them line by line
>* store them in a collection or array or variable etc.
>* process them, do computations etc.
>* write/append to a file output.

OK, at no point in this task/effort/procedure does the data ever reside in any permanent form within the database.

So, forgive me, but why are you (ab)using Oracle to manipulate the data?

IMO, you are using the wrong tool (Oracle) to diddle the data.
You need to read OS files, munch the data, & create new OS files.
At no point does any inherent RDBMS capability come into play.
This type of processing could be done with any shell script, Perl, etc., purely at the operating system level.

Re: UTL_FILE and External Tables [message #380013 is a reply to message #379868] Thu, 08 January 2009 19:58
wmgonzalbo
Messages: 98
Registered: November 2008
Member
When I posted this thread it said my session had timed out; I never noticed that it had created two posts of this topic when I refreshed my session.

Anyway, yeah, you have a point. Actually this is just an external program for the newly bought billing system (using Oracle, Java and UNIX) and for another assigned project.

I was asked to use PL/SQL, so for now I'm learning how to efficiently read large data from a directory in UNIX (I am still a newbie in PL/SQL). From our last meeting it is still not final whether I have to dump the results to a table, generate another file in a directory, or both; logging of transactions to a table is of course needed.

Still working on this, with the help of other suggestions in my 'other' thread. All of your suggestions are very helpful.

Regards and Thanks,
Wilbert
Re: UTL_FILE and External Tables [message #380027 is a reply to message #379868] Thu, 08 January 2009 21:04
BlackSwan
Messages: 25036
Registered: January 2009
Location: SoCal
Senior Member
IMO, Java would be a better language & it could run from either inside the database or as an external (standalone) program.

IMO, UTL_FILE is not a very efficient implementation & may not be able to scale sufficiently.


Re: UTL_FILE vs External Tables (merged) [message #380031 is a reply to message #379868] Thu, 08 January 2009 21:16
wmgonzalbo
Messages: 98
Registered: November 2008
Member
Hi,

Thanks for all of the suggestions and comments. I tried specifying the maximum bytes (size) to read for UTL_FILE, but I still get the same read error with my very large test input (Test_File2.csv).

I am now trying to use external tables (thanks for the link, Michel C.). I have already created an external table and a directory path, yet I'm having some problems. Here's my sample code:


CREATE TABLE TEST_TABLE1
(
  ID                  VARCHAR2(50 BYTE),
  A_PARTY_ID          VARCHAR2(50 BYTE),
  .
  .
  .
 )
ORGANIZATION EXTERNAL
  (  TYPE ORACLE_LOADER
     DEFAULT DIRECTORY TEST_DIR1_ORA
     ACCESS PARAMETERS 
       ( records delimited by newline  
         fields terminated by ',' 
         missing field values are null 
        )
     LOCATION (TEST_DIR1_ORA:'Test_File2.csv')
  )
REJECT LIMIT UNLIMITED
PARALLEL ( DEGREE DEFAULT INSTANCES DEFAULT )
NOMONITORING;


Directory:
CREATE OR REPLACE DIRECTORY 
TEST_DIR1_ORA AS 
'/home/.... my path here ..../TEST_DIR1';

GRANT READ, WRITE ON DIRECTORY  TEST_DIR1_ORA TO MY_SHEMA WITH GRANT OPTION;

GRANT READ, WRITE ON DIRECTORY  TEST_DIR1_ORA TO SYSTEM WITH GRANT OPTION;


Sample Code:

set SERVEROUTPUT ON;

DECLARE

 v_buffer varchar2 (50 BYTE);

BEGIN

 select A_PARTY_ID into v_buffer from TEST_TABLE1 where rownum = 1;
 dbms_output.put_line (v_buffer);

END;

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file TEST_TABLE1_25454.log
 OS error Permission denied
ORA-06512: at "SYS.ORACLE_LOADER", line 19
ORA-06512: at line 7


It seems like I am not doing this properly. OS error: permission denied? Do I need further access grants in Oracle and UNIX? Again, thanks for the time and suggestions.

Regards,
Wilbert

Re: UTL_FILE vs External Tables (merged) [message #380055 is a reply to message #380031] Thu, 08 January 2009 23:36
Barbara Boehmer
Messages: 8627
Registered: November 2002
Location: California, USA
Senior Member
The error message indicates that Oracle has insufficient operating system privileges to open the log file. There are two potential issues here.

Sometimes, in some versions, even though you have specified a default directory, it tries to write the log file to a different directory. You can try using NOLOGFILE, but then you have no log file, or you can specify the logfile directory explicitly in the same manner that you specified the directory for the data file location.

You need to make sure that the operating system user that Oracle was installed under has sufficient operating system privileges to the directory, which must be on your server, not a client machine. As a test, see if you can access that directory using the host command from SQL*Plus.
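For example, on a UNIX server that check might look something like this (the path is a placeholder for your actual directory, and SQL*Plus must be running on the server itself):

HOST ls -ld /path/to/TEST_DIR1
HOST ls -l /path/to/TEST_DIR1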

Getting past initial problems with operating system privileges to the directory is usually the biggest problem that people have with external tables. After that it is easy. You can just select from the external table as you would from any other table. This should be a lot easier than utl_file.
Re: UTL_FILE vs External Tables (merged) [message #380082 is a reply to message #380031] Fri, 09 January 2009 01:08
Barbara Boehmer
Messages: 8627
Registered: November 2002
Location: California, USA
Senior Member
There is one more thing that occurred to me that might need clarification. When you create an Oracle directory object, it accepts anything for the path and does not attempt to verify it until you try to use it. Creating an Oracle directory object with the "create or replace directory" syntax does not create a directory on your operating system; you have to do that separately. So your /home/.... my path here ..../TEST_DIR1 operating system directory must exist for your Oracle TEST_DIR1_ORA directory object to point to that path.


Re: UTL_FILE vs External Tables (merged) [message #380085 is a reply to message #380031] Fri, 09 January 2009 01:23
Barbara Boehmer
Messages: 8627
Registered: November 2002
Location: California, USA
Senior Member
Here is an example for you:

-- create directory in operating system:
SYS@orcl_11g> HOST MD c:\test_dir1


-- create Oracle directory object and grant privileges:
SYS@orcl_11g> CREATE OR REPLACE DIRECTORY TEST_DIR1_ORA AS 'c:\TEST_DIR1'
  2  /

Directory created.

SYS@orcl_11g> GRANT READ, WRITE ON DIRECTORY TEST_DIR1_ORA TO scott
  2  /

Grant succeeded.


-- contents of c:\test_dir1\test_file2.csv:
id1,a_party_id1
id2,a_party_id2


-- create table:
SYS@orcl_11g> CONNECT scott/tiger
Connected.
SCOTT@orcl_11g> 
SCOTT@orcl_11g> CREATE TABLE TEST_TABLE1
  2  (
  3    ID		   VARCHAR2(10),
  4    A_PARTY_ID	   VARCHAR2(15)
  5   )
  6  ORGANIZATION EXTERNAL
  7    (  TYPE ORACLE_LOADER
  8  	  DEFAULT DIRECTORY TEST_DIR1_ORA
  9  	  ACCESS PARAMETERS
 10  	    ( records delimited by newline
 11  	      fields terminated by ','
 12  	      missing field values are null
 13  	     )
 14  	  LOCATION (TEST_DIR1_ORA:'Test_File2.csv')
 15    )
 16  REJECT LIMIT UNLIMITED
 17  /

Table created.


-- select from table:
SCOTT@orcl_11g> SELECT * FROM test_table1
  2  /

ID         A_PARTY_ID
---------- ---------------
id1        a_party_id1
id2        a_party_id2

SCOTT@orcl_11g>

Re: UTL_FILE vs External Tables (merged by bb) [message #380106 is a reply to message #379868] Fri, 09 January 2009 03:14
wmgonzalbo
Messages: 98
Registered: November 2008
Member
Hi all,

First of all, big thanks to Barbara for the comments and samples! Thanks for the clarifications. Yes, my path does exist in my OS. So far, my test program using UTL_FILE with the TEST_DIR1_ORA directory works. I am just having problems with large data: even with my size parameter set to 32767, a file read error occurs.

I tried creating external tables using the same TEST_DIR1_ORA directory for both my small and large data samples (.csv), but to no avail; I get the same error.

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file TEST_TABLE1_2846.log
 OS error Permission denied
ORA-06512: at "SYS.ORACLE_LOADER", line 19
ORA-06512: at line 7


So I'm still checking the UNIX and Oracle privileges...

Regards,
Wilbert
Re: UTL_FILE vs External Tables (merged by bb) [message #380121 is a reply to message #380106] Fri, 09 January 2009 03:50
Michel Cadot
Messages: 64107
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Quote:
First of all, big thanks to Barbara for the comments and samples! Thanks for the clarifications.

But you didn't read or understand the following:
Quote:
Sometimes, in some versions, even though you have specified a default directory, it tries to write the log file to a different directory. You can try using NOLOGFILE...

Regards
Michel
Re: UTL_FILE vs External Tables (merged by bb) [message #380254 is a reply to message #380106] Fri, 09 January 2009 14:05
Barbara Boehmer
Messages: 8627
Registered: November 2002
Location: California, USA
Senior Member
You can use NOLOGFILE to test if writing to the logfile directory is the problem:

CREATE TABLE TEST_TABLE1
(
  ID                  VARCHAR2(10),
  A_PARTY_ID          VARCHAR2(15) 
 )
ORGANIZATION EXTERNAL
  (  TYPE ORACLE_LOADER
     DEFAULT DIRECTORY TEST_DIR1_ORA
     ACCESS PARAMETERS 
       ( records delimited by newline  
         NOLOGFILE
         fields terminated by ',' 
         missing field values are null 
        )
     LOCATION (TEST_DIR1_ORA:'Test_File2.csv')
  )
REJECT LIMIT UNLIMITED
/


If that gets you past that error, then you can change it to explicitly specify the logfile and logfile directory, so that you have a log file to use to diagnose any further problems:

CREATE TABLE TEST_TABLE1
(
  ID                  VARCHAR2(10),
  A_PARTY_ID          VARCHAR2(15) 
 )
ORGANIZATION EXTERNAL
  (  TYPE ORACLE_LOADER
     DEFAULT DIRECTORY TEST_DIR1_ORA
     ACCESS PARAMETERS 
       ( records delimited by newline  
         logfile TEST_DIR1_ORA:'log.txt'
         fields terminated by ',' 
         missing field values are null 
        )
     LOCATION (TEST_DIR1_ORA:'Test_File2.csv')
  )
REJECT LIMIT UNLIMITED
/

Re: UTL_FILE vs External Tables (merged by bb) [message #380408 is a reply to message #379868] Sun, 11 January 2009 20:50
wmgonzalbo
Messages: 98
Registered: November 2008
Member
Hi Experts,

Thanks a lot, NOLOGFILE worked. But I guess I still have a permissions problem when I need to generate a log file; I encounter this error when performing a simple query:

CREATE TABLE TEST_TABLE1
.
.
.

ORGANIZATION EXTERNAL
  (  TYPE ORACLE_LOADER
     DEFAULT DIRECTORY TEST_DIR1_ORA
     ACCESS PARAMETERS 
       ( records delimited by newline  
         logfile TEST_DIR1_ORA:'log.txt'
         fields terminated by ',' 
         missing field values are null 
        )
     LOCATION (TEST_DIR1_ORA:'Test_File2.csv')
  )
REJECT LIMIT UNLIMITED
/

-----

select * from TEST_TABLE1

-----

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file log.txt
 OS error Permission denied
ORA-06512: at "SYS.ORACLE_LOADER", line 19
ORA-06512: at line 7


Again, Thank you.

Regards,
Wilbert

Re: UTL_FILE vs External Tables (merged by bb) [message #380438 is a reply to message #380408] Sun, 11 January 2009 23:27
Barbara Boehmer
Messages: 8627
Registered: November 2002
Location: California, USA
Senior Member
One of the things that I like to do is create the operating system directory from SQL*Plus using the HOST command. That way, since the directory was created through Oracle, you know Oracle can access the directory.
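On a UNIX server that might look something like this (placeholder path; run SQL*Plus on the server as the operating system user that owns the Oracle software):

HOST mkdir /path/to/TEST_DIR1
HOST ls -ld /path/to/TEST_DIR1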

Re: UTL_FILE vs External Tables (merged by bb) [message #380557 is a reply to message #380408] Mon, 12 January 2009 05:49
S.Rajaram
Messages: 1027
Registered: October 2006
Location: United Kingdom
Senior Member
Well, it's a pure guess: could you check whether the Oracle/listener process has write privilege on the directory in which you are trying to create the log?
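A couple of checks that might help narrow it down (directory name as used earlier in the thread; the last line assumes SQL*Plus is running on the server and uses a placeholder path):

-- Where does the Oracle directory object actually point?
SELECT directory_path FROM all_directories WHERE directory_name = 'TEST_DIR1_ORA';

-- Which database users have been granted privileges on it?
SELECT grantee, privilege FROM all_tab_privs WHERE table_name = 'TEST_DIR1_ORA';

-- Can the Oracle software owner write there at the OS level?
HOST ls -ld /path/to/TEST_DIR1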

Regards

Raj
Re: UTL_FILE vs External Tables (merged by bb) [message #380652 is a reply to message #379868] Mon, 12 January 2009 21:19
wmgonzalbo
Messages: 98
Registered: November 2008
Member
I can now access my data through the external table, generate logs or not as needed, read the data from the external table, process it, and write it to an output file.

Thank you all for your guidance and time. God bless and good day!

Regards,
Wilbert