Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: UTL_FILE maximum record size problems

From: Steve B <steve.bright_at_capgemini.co.uk>
Date: Fri, 4 Jun 1999 09:07:42 +0100
Message-ID: <7j81gi$9ug$1@taliesin.netcom.net.uk>


This can be addressed by using the overloaded version of UTL_FILE.FOPEN (in Oracle8i, at least):

UTL_FILE.FOPEN (
   location     IN VARCHAR2,
   filename     IN VARCHAR2,
   open_mode    IN VARCHAR2,
   max_linesize IN BINARY_INTEGER)
  RETURN file_type;

This allows lines of up to 32767 characters to be read.

I'm not sure which other versions of Oracle support the overloaded function, though.
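For illustration, here is a minimal sketch of reading long lines with the overloaded FOPEN. The directory path and file name are assumptions for the example, and it assumes the UTL_FILE_DIR parameter (or equivalent) permits access to that directory:

```sql
DECLARE
   -- '/tmp' and 'data.txt' are placeholder values for this sketch
   l_file  UTL_FILE.FILE_TYPE;
   l_line  VARCHAR2(32767);
BEGIN
   -- The fourth parameter sets max_linesize; 32767 is the maximum
   l_file := UTL_FILE.FOPEN('/tmp', 'data.txt', 'r', 32767);
   LOOP
      BEGIN
         UTL_FILE.GET_LINE(l_file, l_line);
         -- process l_line here
      EXCEPTION
         WHEN NO_DATA_FOUND THEN
            EXIT;  -- end of file reached
      END;
   END LOOP;
   UTL_FILE.FCLOSE(l_file);
END;
/
```

With max_linesize set this way, the 1200-plus character records described below should no longer raise an exception on GET_LINE.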

--
Cheers

Steve Bright

steve.bright_at_capgemini.co.uk

smfrank_at_my-deja.com wrote in message <7j7ef2$vs$1_at_nnrp1.deja.com>...
>I am developing a package that reads in a flat file and puts the
>contents into one of two different tables. We have tried SQL*Loader to
>load the files into the tables with no success, because of the bizarre
>formatting of the datafile(multiple record types with no unique record
>identifiers). So I am parsing the data with UTL_FILE. I ran into a
>problem today. Most of the records are under 100 characters.
>Occasionally there will be an exceptionally long record with 1200 plus
>characters. UTL_FILE.GET_LINE will only read 1022 characters. Whenever
>the program hits one of these records an EXCEPTION is raised and
>processing of the file ceases. Can anybody suggest some way around
>this? I originally wanted to just send the long records to a log, but
>I can't even do that, because without GET_LINE I can't read them. Please help.
>
>Thank you
>Steve Marshmallo
>
>
>Sent via Deja.com http://www.deja.com/
>Share what you know. Learn what you don't.
Received on Fri Jun 04 1999 - 03:07:42 CDT

