Re: Large dataset performance

From: Cimode <>
Date: 21 Mar 2007 02:31:24 -0700
Message-ID: <>

On Mar 20, 10:53 pm, "jma" <> wrote:
> Hello Cimode,
> first let me say that you're right I am far away from being literate
> in SQL.
Then you know what to do. Get a better understanding of set operations in SQL, and read books about the relational model; that will help you.

An Introduction to Database Systems - C. J. Date

> However, consider that I am simulating the way a client
> application would be writing data to the database and this cannot
> change... I would really love to read directly from a binary stream
> into a db table, so that the performance would be adequate. Such a
> solution would fit me very well, since my data have a persistent
> pattern, for example int,double,double,double. Anyway, thanks for your
> remark
Any practice or implementation should first be guided by a good grasp of the underlying principles. The chances of building a decent system without that knowledge are close to nil (one *may* get lucky sometimes).

Hope this helps...
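To make the set-operations point concrete: a minimal sketch of the difference between looping one INSERT per row and applying a single prepared statement to the whole set in one transaction. This uses Python's `sqlite3` as a stand-in for the Interbase database in the thread (the actual client language and driver are not stated); the table layout follows jma's pattern of one int and three doubles, and the table name `samples` is hypothetical.

```python
import random
import sqlite3
import time

# In-memory SQLite stands in for the Interbase table discussed above;
# columns follow jma's pattern: int, double, double, double.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER, x REAL, y REAL, z REAL)")

rows = [(i, random.random(), random.random(), random.random())
        for i in range(100_000)]

# Row-by-row: one statement execution per row, as in jma's loop.
t0 = time.perf_counter()
for row in rows:
    conn.execute("INSERT INTO samples VALUES (?, ?, ?, ?)", row)
conn.commit()
slow = time.perf_counter() - t0

conn.execute("DELETE FROM samples")
conn.commit()

# Set-oriented: one prepared statement applied to the whole set,
# inside a single transaction.
t0 = time.perf_counter()
with conn:
    conn.executemany("INSERT INTO samples VALUES (?, ?, ?, ?)", rows)
fast = time.perf_counter() - t0

print(f"row-by-row: {slow:.2f}s  batched: {fast:.2f}s")
```

With a real client/server database the gap is usually larger than this local sketch suggests, since each looped statement also pays a network round trip; most drivers offer a batched or array-insert API for exactly this reason.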

> BR
> jma
> / Cimode :
> > On 20 mar, 16:39, "jma" <> wrote:
> > <<I would like your opinion and experience on the performance of
> > writing large datasets. I am writing in one loop about 3.5 million
> > rows where each row is an integer and 3 doubles to an Interbase db.
> > All in all it's about 100MB. My time measurement was 210 seconds. Is
> > this normal? To me it appears as a veeerryyy long time.... >>
> > The principal reason I see is the *looping* approach, which is not
> > what a db does best. I suggest you learn the power of set
> > operations through a better mastery of good ol' SQL... Hope this helps...
Received on Wed Mar 21 2007 - 10:31:24 CET