Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 

Oracle recordset too big to place in MS Access 97 table

From: Jay McAnally <jaymac3_at_aol.com>
Date: Tue, 16 Jun 1998 20:38:44 -0400
Message-ID: <6m73h5$3oq@news2.snet.net>


I am having a problem populating a local table with a large data set returned through a SQL Pass-Through query against an Oracle 7.3.4 database over an NT 4.0 network. My workstation is a Compaq laptop with 48MB RAM and a 2GB hard disk.

I open both the Pass-Through query and the (empty) local table in VBA, and the query executes fine, returning between 600 and 800 records per fetch. For each returned query record I loop through the local table, painting each cell with either returned or calculated data. (I know I could do this with a make-table or append querydef, but this technique affords more control, and I would likely have the same problem with the alternate technique.)
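For clarity, here is a minimal sketch of the loop described above. The object and field names (qryPass, tblLocal, the POSTED field) are assumptions for illustration, not the actual code; only ACCTNBR and the Now() call come from the description.

```vba
' Sketch, DAO 3.5 style: copy a Pass-Through result into a local table.
Dim db As Database
Dim rsQry As Recordset   ' the SQL Pass-Through query result
Dim rsTbl As Recordset   ' the empty local table

Set db = CurrentDb()
Set rsQry = db.OpenRecordset("qryPass", dbOpenSnapshot)
Set rsTbl = db.OpenRecordset("tblLocal", dbOpenTable)

Do Until rsQry.EOF
    rsTbl.AddNew
    rsTbl!ACCTNBR = rsQry!ACCTNBR   ' returned value (Double, table key)
    rsTbl!POSTED = Now()            ' calculated value (field name assumed)
    ' ... the eight Currency fields would be painted here ...
    rsTbl.Update
    rsQry.MoveNext
Loop

rsQry.Close
rsTbl.Close
```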

There are 8,100 records in the full query result set. The local table contains 10 fields: 1 defined as a Double, 8 defined as Currency, and 1 as Date/Time. The table is keyed on the Double (ACCTNBR) field. ACCTNBR is defined in Oracle as N,22,22,0, and the currency values are defined in Oracle as N,18,18,5. The Date/Time value is generated by Now().

The process runs fine through several fetches; then, precisely at record 4112, it appears to 'hang,' and eventually I get an ODBC error that I believe is an ODBC timeout. (I monitored the behavior several times through the Debug window.)

Extending the ODBCTimeout property simply lengthens the 'hang' behavior, but it still times out. I also tried switching to ODBCDirect and increasing the querydef's CacheSize property to 8292 rows. The query obviously takes a lot longer to execute initially, but it still returns a dataset that it "paints" very quickly into the table -- up to record 4112, that is. I then get the same error message.
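The two property changes tried above would look roughly like this in DAO (the querydef name qryPass is an assumption; the 8292-row figure is the one given above):

```vba
' Sketch: lengthen the ODBC timeout and enlarge the row cache on the
' Pass-Through querydef before executing it.
Dim db As Database
Dim qdf As QueryDef

Set db = CurrentDb()
Set qdf = db.QueryDefs("qryPass")

qdf.ODBCTimeout = 600    ' seconds; 0 means no timeout at all
qdf.CacheSize = 8292     ' rows cached locally, as tried above
```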

I can think of two possible causes. The SQL Pass-Through relies on several stored function calls in the Oracle database (which I create earlier in the process, before I fire the query). It's possible that I'm violating some overhead barrier on the server side, but my Oracle techies say they are "without sin." (They ALWAYS are!)

I suspect that I'm hitting some internal barrier in Jet/Access, like a buffer that's too small, so the excess results are simply being dumped. When my process loop tries to go past the last record in the buffer, there's "nobody home." If there is such a "buffer" constraint, can it be configured, and if so, can it be done "on the fly" in code? How?

The solution I'm building must work with a group of identical databases ranging from 1,800 rows to 52,000 rows (and probably larger soon).

FYI, I routinely pull back larger datasets (10,000+ rows) from this database using the above strategy; however, those likely have smaller per-row definitions.

Any suggestions will be appreciated.

Please respond here or to jmcanally_at_opensolutions.com. Received on Tue Jun 16 1998 - 19:38:44 CDT
