Re: Optimising Oracle 9i for million-record JDBC read-only access

From: Jim Kennedy <kennedy-downwithspammersfamily_at_attbi.net>
Date: Wed, 14 Jul 2004 00:40:13 GMT
Message-ID: <Nh%Ic.72781$IQ4.51906_at_attbi_s02>


"Howie Goodell" <howie.goodell_at_gmail.com> wrote in message news:20bf32b7.0407131149.4e164369_at_posting.google.com...
> Hello --
>
> I am trying to optimise a JDBC connection and an Oracle 9i database
> for reading millions of records at a time to a 1 Gig PC running Win2K
> or XP. Two questions:
>
> 1. Does anyone have suggestions for optimising an Oracle 9i server
> (Enterprise Edition, Release 2) for networked read-only JDBC access
> with large return sets? With default settings MySQL reads 1M records
> 2.5 times faster than Oracle even on its first, non-cached execution.
>
> 2. Unrelated question: is there a way to read a text field via JDBC
> without creating a String object? The memory and GC overheads of
> creating millions of objects are becoming a problem, too.
>
> Thanks much!
> Howie Goodell
Try increasing the array fetch (batch fetch) size to 100 or so. Also use prepared statements and re-execute them instead of re-parsing each time. Using the OCI driver instead of the thin driver should also be faster. A rough sketch is below.
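
Something along these lines (just a sketch -- the connect string, user, table and column names are made up, adjust for your setup):

    import java.sql.*;

    public class FetchSizeExample {
        public static void main(String[] args) throws Exception {
            // Register the Oracle JDBC driver (needed on pre-JDBC-4 JVMs)
            Class.forName("oracle.jdbc.OracleDriver");

            // Thin driver URL; swap "thin" for "oci" if the OCI client is installed
            String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
            Connection conn = DriverManager.getConnection(url, "scott", "tiger");

            // Prepare once, re-execute as many times as needed
            PreparedStatement ps =
                conn.prepareStatement("SELECT id, txt FROM big_table WHERE id > ?");

            // Raise the array fetch size so rows come back in batches of 100
            ps.setFetchSize(100);

            ps.setLong(1, 0L);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                long id = rs.getLong(1);
                String txt = rs.getString(2);
                // ... process the row ...
            }
            rs.close();
            ps.close();
            conn.close();
        }
    }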
Jim
