BULK Processing Process "Expires" without completing

From: Dereck L. Dietz <dietzdl_at_ameritech.net>
Date: Mon, 21 Jan 2008 02:16:10 GMT
Message-ID: <KfTkj.4356$jJ5.3021@newssvr11.news.prodigy.net>


Oracle 10.2.0.3.0
Windows Server 2003
Enterprise Edition

I have a process with a cursor that returns approximately 36 million rows. Each row has to be checked against certain criteria and loaded into one of 25 possible target tables (so far).

I'm doing bulk reads (I've used LIMIT values of both 50,000 and 25,000) and FORALL inserts. I've experienced this problem with both limits - a limit of 25,000 seems to take a bit longer to complete (when it does complete) than 50,000, but with both the process has disappeared.

After the bulk inserts I have 25 lines similar to nested_table_name.DELETE to free up the memory before the loop retrieves the next 25,000 or 50,000 rows.
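In outline the loop looks something like this (all the names here are placeholders, only one of the 25 staging collections is shown, and it assumes the target table has the same columns as the source):

DECLARE
  CURSOR src_cur IS
    SELECT * FROM source_table;

  TYPE src_tab_t IS TABLE OF source_table%ROWTYPE;

  l_src    src_tab_t;
  l_stage1 src_tab_t := src_tab_t();
  c_limit  CONSTANT PLS_INTEGER := 25000;   -- 50000 has also been tried
BEGIN
  OPEN src_cur;
  LOOP
    FETCH src_cur BULK COLLECT INTO l_src LIMIT c_limit;
    EXIT WHEN l_src.COUNT = 0;

    -- check each row against the criteria and stage it for the
    -- matching target table (placeholder criterion shown)
    FOR i IN 1 .. l_src.COUNT LOOP
      IF l_src(i).record_type = 'T1' THEN
        l_stage1.EXTEND;
        l_stage1(l_stage1.LAST) := l_src(i);
      END IF;
    END LOOP;

    IF l_stage1.COUNT > 0 THEN
      FORALL i IN 1 .. l_stage1.COUNT
        INSERT INTO target_table1 VALUES l_stage1(i);
    END IF;

    -- free the staging collections before the next fetch
    l_stage1.DELETE;
  END LOOP;
  CLOSE src_cur;
  COMMIT;
END;
/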

It seems that whenever my process has worked through about 30 million rows it just stops processing.

I'm going to have more tables to load. I don't want to do what the current process does (which takes over a day to run) and re-read the source table once for every target table, when reading once and writing many times should be much quicker.
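(For what it's worth, if the criteria were simple enough to express in straight SQL, a multi-table INSERT FIRST would be exactly the read-once-write-many shape I'm after - something like the sketch below, with made-up table and column names. I don't think mine are that simple, which is why I'm doing it in PL/SQL.)

INSERT FIRST
  WHEN record_type = 'T1' THEN
    INTO target_table1 VALUES (id, record_type, payload)
  WHEN record_type = 'T2' THEN
    INTO target_table2 VALUES (id, record_type, payload)
  -- ... one WHEN clause per target table ...
SELECT id, record_type, payload
FROM   source_table;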

Is it possible that I'm somehow running out of memory and not completely releasing it with the nested_table_name.DELETE calls I'm doing?
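Is there a good way to confirm that? I was thinking of watching the job's PGA statistics from a second session while it runs, with something like this (assuming I can get SELECT access to the v$ views, and substituting the job's SID):

SELECT sn.name, ss.value
FROM   v$sesstat ss, v$statname sn
WHERE  sn.statistic# = ss.statistic#
AND    ss.sid = :sid   -- SID of the session running the load
AND    sn.name IN ('session pga memory', 'session pga memory max');

If the DELETEs were really giving the memory back, I'd expect 'session pga memory' to level off between fetches instead of climbing until the process dies.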

I know this may be construed as a possible double post, since it ties in with my other post about processes expiring, but I'm wondering if there is anything I can do in my package to reduce memory usage.

I should also point out that up to now nobody has tried to write packages against this database, and I'm not completely sure the database is configured for running them. Other people who are writing PL/SQL code (rather than just SQL scripts) are experiencing the same disappearing processes I am.

Any help would be appreciated.
