Memory Problem? UTL_File

From: Kathy Graham <kathy_graham_at_hpl.hp.com>
Date: Sat, 06 Mar 1999 17:11:43 -0800
Message-ID: <36E1D24F.3043C1EE_at_hpl.hp.com>



Hi Folks;

    I am in the middle of an application cutover this weekend and am running into some serious problems trying to load one of my tables. Several months ago I wrote a PL/SQL procedure that uses UTL_FILE to read data from a text file, do some minor conversion, and then load it into the resultant table. I had run it successfully multiple times back then. The text data file is about 130,000 records.

    The first time I ran it, it bombed approximately 55,000 records in. No big deal, throw some debug statements in and move on, right?!!!! Well, here is the deal: EVERY time I run it, it bombs on a different record. Sometimes it makes it farther than 55,000; other times it cannot get past 1,000 records. If I keep pushing it, I then get UTL_FILE errors saying that it does not have write access to the log file. (That wasn't a problem the last ten runs, and it goes away when I stop and restart SQL*Plus.) This is obviously getting rather frustrating to pin down.

    Has anyone ever run into a problem like this? I put the procedure together using TOAD and am executing it from SQL*Plus. (Is there a way to execute it from TOAD and get the resultant DBMS_OUTPUT messages?)
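For reference, a minimal sketch of the kind of UTL_FILE read loop described above (the directory, file name, and table logic are hypothetical stand-ins; the conversion/INSERT step is omitted). One detail worth checking in this situation: if a run dies with the file still open, the handle stays allocated in the session, so closing handles in an exception handler, e.g. with UTL_FILE.FCLOSE_ALL, can keep reruns in the same SQL*Plus session from hitting spurious access errors.

```sql
DECLARE
  fh      UTL_FILE.FILE_TYPE;
  linebuf VARCHAR2(1022);   -- GET_LINE default line-length limit in older releases
BEGIN
  -- directory must be listed in the utl_file_dir init parameter (Oracle 8 era)
  fh := UTL_FILE.FOPEN('/data/in', 'load.txt', 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(fh, linebuf);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        EXIT;               -- GET_LINE raises NO_DATA_FOUND at end of file
    END;
    -- minor conversion + INSERT into the target table would go here
  END LOOP;
  UTL_FILE.FCLOSE(fh);
EXCEPTION
  WHEN OTHERS THEN
    UTL_FILE.FCLOSE_ALL;    -- release any open handles so a rerun starts clean
    RAISE;
END;
/
```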

Any thoughts would be appreciated. (I am going to split the text file into smaller ones and see if I can make it work for smaller files. That *should* get the data initially loaded, but I still need to track this down.) Thanks in advance.....

Kathy Graham
kathy_graham_at_nospam.hpl.hp.com

Received on Sun Mar 07 1999 - 02:11:43 CET
