Re: UTL_FILE maximum record size problems
This can be addressed by using the overloaded version of UTL_FILE.FOPEN (in Oracle8i at least):
UTL_FILE.FOPEN (
  location     IN VARCHAR2,
  filename     IN VARCHAR2,
  open_mode    IN VARCHAR2,
  max_linesize IN BINARY_INTEGER)
  RETURN file_type;
This allows up to 32767 characters to be read per line.
I'm not sure which versions of Oracle support the overloaded function, though.
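Something like the following should do it (a minimal sketch; the directory path and file name are placeholders, and on 8i the directory has to be listed in the utl_file_dir init.ora parameter):

DECLARE
  f   UTL_FILE.FILE_TYPE;
  buf VARCHAR2(32767);
BEGIN
  -- the fourth argument (max_linesize) raises the per-line limit
  -- from the ~1K default up to 32767 bytes
  f := UTL_FILE.FOPEN('/tmp', 'data.txt', 'r', 32767);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, buf);
      -- parse buf and insert into the appropriate table here
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        EXIT;  -- GET_LINE raises NO_DATA_FOUND at end of file
    END;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/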
--
Cheers
Steve Bright
steve.bright_at_capgemini.co.uk
smfrank_at_my-deja.com wrote in message <7j7ef2$vs$1_at_nnrp1.deja.com>...
>I am developing a package that reads in a flat file and puts the
>contents into one of two different tables. We have tried SQL*Loader to
>load the files into the tables with no success, because of the bizarre
>formatting of the datafile (multiple record types with no unique record
>identifiers). So I am parsing the data with UTL_FILE. I ran into a
>problem today. Most of the records are under 100 characters.
>Occasionally there will be an exceptionally long record with 1200-plus
>characters. UTL_FILE.GET_LINE will only read 1022 characters. Whenever
>the program hits one of these records an EXCEPTION is raised and
>processing of the file ceases. Can anybody suggest some way around
>this? I originally wanted to just send the long records to a log, but
>I can't even do that, because without GET_LINE I can't read them. Please help.
>
>Thank you
>Steve Marshmallo
Received on Fri Jun 04 1999 - 03:07:42 CDT