Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
UTL_FILE maximum record size problems (c.d.o.server)
I am developing a package that reads a flat file and puts the
contents into one of two different tables. We tried SQL*Loader to
load the files into the tables with no success, because of the bizarre
formatting of the datafile (multiple record types with no unique record
identifiers). So I am parsing the data with UTL_FILE. I ran into a
problem today. Most of the records are under 100 characters, but
occasionally there is an exceptionally long record of 1200-plus
characters. UTL_FILE.GET_LINE will only read 1022 characters, so
whenever the program hits one of these records an exception is raised
and processing of the file stops. Can anybody suggest a way around
this? Originally I just wanted to send the long records to a log, but
I can't even do that, because without GET_LINE I can't read them.
Please help.
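For reference, here is a minimal sketch of the read loop I'm describing. It assumes an Oracle release whose UTL_FILE has the four-argument FOPEN overload (added around 8.0.5), which accepts a max_linesize of up to 32767 and so avoids the default buffer limit; the directory path, file name, and record handling are placeholders, not my real code:

```sql
DECLARE
  f    UTL_FILE.FILE_TYPE;
  line VARCHAR2(32767);
BEGIN
  -- Fourth argument is max_linesize: lets GET_LINE read lines longer
  -- than the default buffer that was raising the exception.
  f := UTL_FILE.FOPEN('/tmp', 'data.txt', 'r', 32767);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, line);
      -- parse the record and insert into the appropriate table here
    EXCEPTION
      WHEN VALUE_ERROR THEN
        NULL;  -- line still too long for the buffer: log or skip it
      WHEN NO_DATA_FOUND THEN
        EXIT;  -- end of file
    END;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
```

Wrapping GET_LINE in its own block is what keeps one bad record from aborting the whole file: the inner handler catches VALUE_ERROR per line instead of letting it kill the outer loop.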
Thank you
Steve Marshmallo
Sent via Deja.com http://www.deja.com/
Share what you know. Learn what you don't.
Received on Thu Jun 03 1999 - 21:42:44 CDT