Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 

Newsgroup: comp.databases.theory

Re: Large dataset performance

From: Daniel <danielaparker_at_gmail.com>
Date: 21 Mar 2007 05:42:14 -0700
Message-ID: <1174480934.778316.188530@n59g2000hsh.googlegroups.com>


On Mar 20, 1:32 pm, "Cimode" <cim..._at_hotmail.com> wrote:
> On 20 mar, 16:39, "jma" <junkmailav..._at_yahoo.com> wrote:
>> I would like your opinion and experience on the performance of
>> writing
>> large datasets. I am writing in one loop about 3.5 million rows where
>> each row is an integer and 3 doubles to an Interbase db. All in all
>> it's about 100MB. My time measurement was 210 seconds. Is this normal?
>> To me it appears as a veeerryyy long time....

> The principal reason I see is the *looping* algorithmics, which is not
> what a db does best. I suggest you learn the power of set
> operations through a better mastery of good ol' SQL... Hope this helps...
>

The OP's problem seems to be loading 3.5 million rows into a table. You might want to explain to the OP how he could accomplish this more efficiently with a set operation, as opposed to performing inserts inside a loop or using a bulk load utility. The OP is unlikely to be able to find this information in a book.
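For what it's worth, 3.5 million rows in 210 seconds works out to roughly 17,000 rows per second, which suggests per-row overhead (statement parsing, commits) rather than raw I/O. A minimal sketch of the usual fix, batching the inserts inside a single transaction with a prepared statement, is below. It uses SQLite as a stand-in since the Interbase client API isn't available here; the table name and column layout are assumptions matching the OP's description (an integer and 3 doubles):

```python
# Sketch: per-row inserts vs. one batched transaction.
# SQLite stands in for Interbase here (an assumption; the client API
# differs, but the transaction-batching principle is the same).
import sqlite3

def load_rows(rows):
    conn = sqlite3.connect(":memory:")
    # Hypothetical table matching the OP's row shape: 1 integer + 3 doubles.
    conn.execute("CREATE TABLE samples (id INTEGER, a REAL, b REAL, c REAL)")
    # Slow pattern (what a naive loop often does): one INSERT, and
    # frequently one commit, per row:
    #   for r in rows:
    #       conn.execute("INSERT INTO samples VALUES (?, ?, ?, ?)", r)
    #       conn.commit()
    # Faster pattern: one prepared statement, all rows, one commit.
    # The "with conn:" block wraps the whole load in a single transaction.
    with conn:
        conn.executemany("INSERT INTO samples VALUES (?, ?, ?, ?)", rows)
    return conn

rows = [(i, i * 0.1, i * 0.2, i * 0.3) for i in range(10_000)]
conn = load_rows(rows)
count = conn.execute("SELECT COUNT(*) FROM samples").fetchone()[0]
print(count)  # 10000
```

On most engines the same idea applies: prepare the statement once, bind parameters per row, and commit once (or every N thousand rows) rather than per insert. A server-side `INSERT ... SELECT` or a dedicated bulk-load utility avoids the client round trips entirely.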

Regards,
Daniel Parker

Received on Wed Mar 21 2007 - 07:42:14 CDT

