Oracle FAQ Your Portal to the Oracle Knowledge Grid

UTL_FILE performance

From: Niall Litchfield <n-litchfield_at_audit-commission.gov.uk>
Date: Tue, 17 Feb 2004 16:52:12 -0000
Message-ID: <CFECD6EA683C524BB5FD22C0F1F34ACB06B11E@bristol43.audit-commission.gov.uk>


Looping through a 36MB logfile and inserting some parsed information into a table seems somewhat slow on my laptop. I'm hoping that this is because of my poor coding (not being a developer) rather than because UTL_FILE is the wrong tool.

I have a log file that looks like

====

Some header info which I am not interested in yet

TIME: ROWS: DESCRIPTION:



11:10:00 36 > some text
.
.
.
.
.

Time used: an incorrect elapsed time figure

Disconnection information

And I'm only interested in storing the timestamp, rows and description columns in my table. The first two columns are nullable.

My code looks like

create or replace procedure split_logline(
	p_logline	in  varchar2,
	p_tstamp	out varchar2,
	p_rows_affected	out varchar2,
	p_description	out varchar2)
as
begin
	p_tstamp := substr(p_logline, 1, 8);
	p_rows_affected := substr(p_logline, 9, 6);
	p_description := substr(p_logline, 16, 255);
end;
/
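(For anyone wanting to sanity-check the offsets outside the database, here is a minimal Python sketch of the same fixed-width split. The sample line and its exact column widths are made up for illustration; SUBSTR is 1-based while Python slices are 0-based.)

```python
# Hypothetical sample line -- the real log's spacing is an assumption here.
logline = "11:10:00    36 > some text"

# Mirror split_logline's SUBSTR calls:
#   substr(line, 1, 8)   -> line[0:8]
#   substr(line, 9, 6)   -> line[8:14]
#   substr(line, 16, 255) -> line[15:270]
tstamp = logline[0:8]
rows = logline[8:14]
description = logline[15:270]

print(tstamp.strip())       # 11:10:00
print(rows.strip())         # 36
print(description.strip())  # > some text
```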

show errors

create or replace procedure read_asql_log(p_filename IN VARCHAR2, p_location IN VARCHAR2)
as
/*
purpose:	procedure to read asql log

requirements:	Directory object created for the log directory
		Read permissions on this directory.
		Table called asql_log, created as above

parameters:	name of asql log, name of bdump directory.

Version: NL 17/02/2004 - created from alert.log code
*/
	fHandle		UTL_FILE.FILE_TYPE;
	strTstamp	varchar2(8);
	strRows		varchar2(6);
	strDescription	varchar2(255);
	v_logline	VARCHAR2(4000);
	v_tstamp	date;
	v_rows_affected	number;
	v_description	varchar2(255);

	file_error	EXCEPTION;

	PRAGMA EXCEPTION_INIT(file_error, -20100);

BEGIN
	DBMS_OUTPUT.ENABLE(1000000);
	begin
		fHandle := UTL_FILE.FOPEN(p_location, p_filename, 'r', 4000);
	exception
		-- probably specified wrong location or file name
		when UTL_FILE.INVALID_OPERATION then
			RAISE_APPLICATION_ERROR(-20100, 'Check asql log location and existence');
		when others then
			raise;
	end;

	-- scroll to start of data
	loop
		UTL_FILE.GET_LINE(fHandle, v_logline);
		split_logline(v_logline, strTstamp, strRows, strDescription);
		exit when strTstamp = '--------';
	end loop;

	loop
		begin
			UTL_FILE.GET_LINE(fHandle, v_logline);
			split_logline(v_logline, strTstamp, strRows, v_description);
			exit when strTstamp = 'Time Use';
			v_tstamp := to_date(trim(strTstamp), 'HH24:MI:SS');
			v_rows_affected := to_number(trim(strRows));
			insert into asql_log(tstamp, rows_affected, description)
			values (v_tstamp, v_rows_affected, v_description);
		exception -- reached end of file
			when no_data_found then
				exit;
		end;
	end loop;

	commit;
	UTL_FILE.FCLOSE(fHandle);
end;
/

For those of you following c.d.o, this is indeed very similar to my alert log example.

Now this works, but for 600k lines in a 36MB file it takes nearly 3 minutes to complete. Does this seem reasonable, or have I just betrayed my PL/SQL incompetence?
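(For scale, the figures quoted work out as below -- a quick back-of-envelope check, nothing more; "nearly 3 minutes" is taken as 180 seconds.)

```python
# Figures from the post: 600k lines, 36MB, ~3 minutes.
lines = 600_000
size_mb = 36
seconds = 3 * 60  # "nearly 3 minutes", assumed exactly 180s

# Each data line is one single-row INSERT, so this is also the insert rate.
print(round(lines / seconds))            # 3333 lines (and inserts) per second
print(round(size_mb * 1024 / seconds))   # 205 KB of file read per second
```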

Niall Litchfield
Oracle DBA
Audit Commission
+44 117 975 7805





Please see the official ORACLE-L FAQ: http://www.orafaq.com

To unsubscribe send email to: oracle-l-request_at_freelists.org put 'unsubscribe' in the subject line.
--
Archives are at http://www.freelists.org/archives/oracle-l/
FAQ is at http://www.freelists.org/help/fom-serve/cache/1.html
-----------------------------------------------------------------
Received on Tue Feb 17 2004 - 10:52:12 CST

