Memory consumption when processing many records
We're using Oracle 10g and 11g on Windows. We have a client-server app
written in PowerBuilder, with the logic split between the PowerBuilder
side and PL/SQL.
We have a process that loops through many records (about a
thousand). For each record, it must "process" it, meaning run
additional queries across various tables with it, and so on. Once a
record is processed, the loop moves on to the next one.
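Roughly, the loop is shaped like the sketch below (ORDERS, STATUS, and PROCESS_ONE_ORDER are invented names for illustration; the real logic is split between PowerBuilder and PL/SQL):

  -- Hypothetical sketch of the shape of the loop; names are made up.
  DECLARE
    CURSOR c_work IS
      SELECT order_id FROM orders WHERE status = 'PENDING';
  BEGIN
    FOR rec IN c_work LOOP
      -- "process it": additional queries across various tables, etc.
      process_one_order(rec.order_id);
    END LOOP;
  END;
  /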
We notice that as each record is processed, the memory consumption of the Oracle database process (oracle.exe) keeps increasing. Eventually it kills the session, either by disconnecting it or by raising something similar to an ORA-04031 error (usually ORA-04031 itself).
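For what it's worth, the growth can be watched per session rather than just in oracle.exe. Assuming access to the V$ views, a query like the one below (with :sid taken from V$SESSION for the PowerBuilder connection) shows the session's UGA/PGA memory between batches:

  -- Session memory statistics; :sid is the session being watched.
  SELECT sn.name,
         ROUND(ss.value / 1024 / 1024, 1) AS mbytes
  FROM   v$sesstat ss
         JOIN v$statname sn ON sn.statistic# = ss.statistic#
  WHERE  ss.sid = :sid
  AND    sn.name IN ('session uga memory', 'session pga memory',
                     'session uga memory max', 'session pga memory max');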
My guess is that as each record is processed, the session consumes memory to do it but never releases it before moving to the next record. Is there a way to force it to release that memory?
I know this is a vague description of the scenario, but I'm hoping it's all that is needed. I hope the solution doesn't involve rewriting our code or adding memory to the machine running the server. From what I can see, it processes one record fine (actually many more than one; the effects don't bog down the Oracle machine until it hits roughly the 300th record). I just want the memory to "start fresh" with each record or so, but it just keeps consuming more and more.
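One candidate, hedged because it only returns memory the session is no longer using (it won't help if open cursors or growing PL/SQL collections are still holding on to it): Oracle's DBMS_SESSION.FREE_UNUSED_USER_MEMORY can be called between records or batches to hand unused session memory back:

  -- Release unused session memory. Has no effect on memory that is
  -- still referenced (open cursors, growing collections, etc.).
  BEGIN
    DBMS_SESSION.FREE_UNUSED_USER_MEMORY;
  END;
  /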
Something we noticed: if we simply disconnect the SQLCA object (the transaction object the PowerBuilder application uses to connect to the database) and then reconnect, say, every 100 records or so, the problem goes away. We disconnect, reconnect, and pick up where we left off. This tells me the memory gets flushed every time the session is disconnected.
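That behavior is consistent with resources that are released only at logoff; open cursors are the classic example. Assuming the V$ views are available, a check like this between batches would show whether cursors are accumulating (a steadily climbing count points at statements that are never closed):

  -- Open cursors held by the session; :sid from V$SESSION.
  SELECT COUNT(*) AS cursor_count, sql_text
  FROM   v$open_cursor
  WHERE  sid = :sid
  GROUP  BY sql_text
  ORDER  BY COUNT(*) DESC;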
This is the effect I want: for the memory to be flushed every so many records, so the loop can continue through the result set as if each record were the first. I understand there may be a performance impact from flushing the memory every record (or every hundred or so), but I'm willing to sacrifice that to keep it from running out of memory altogether.

Received on Wed Nov 14 2007 - 16:19:20 CST