
Developer 2000

From: Colt5 <Colt5_at_interlog.com>
Date: 2000/05/02
Message-ID: <zspP4.101414$2D6.2582224@news20.bellglobal.com>#1/1

I don't know if this is server-related enough, but here's my problem.

We recently deployed a new billing application, using the Developer 2000 Forms and Reports runtime, to 700 machines. The database resides on a nice HP V-Class server; the deployment essentially runs all Forms from an NT server.

When clients connect, each window/form they open consumes more memory. When a window/form closes, it does not release as much memory as it first consumed, so after a period of time the memory is used up. Only closing out of the application completely releases all the memory the Forms/Reports windows have consumed since it originally launched.

What we are trying to figure out is whether it's the way the connection is made, the way the Forms were designed, or something on the network/client side.
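
If it is the form design, one thing worth checking (my assumption, not something we've confirmed) is how the forms invoke each other. In Forms PL/SQL, CALL_FORM and OPEN_FORM both leave the calling form resident in memory, while NEW_FORM exits the calling form and releases its resources. A minimal sketch, with a made-up form name:

    -- In a WHEN-BUTTON-PRESSED trigger ('billing_detail' is hypothetical):

    -- These keep the calling form in memory:
    -- OPEN_FORM('billing_detail');   -- caller stays active alongside the new form
    -- CALL_FORM('billing_detail');   -- caller is suspended but still resident

    -- This replaces the calling form and releases its memory:
    NEW_FORM('billing_detail');

If every navigation in the app uses OPEN_FORM or CALL_FORM, memory would grow exactly the way we're seeing, and only a full exit would give it back.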

(please reply online if possible, I keep my email address out of newsgroups due to spam)

Can anyone offer some suggestions?

Received on Tue May 02 2000 - 00:00:00 CDT
