Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: Need help : degrading performance in record updates

From: <romeo.olympia_at_gmail.com>
Date: 14 May 2007 02:07:33 -0700
Message-ID: <1179133653.094155.319670@p77g2000hsh.googlegroups.com>


On May 14, 1:49 pm, Akshay..._at_gmail.com wrote:
> On May 13, 5:10 pm, Frank van Bortel <frank.van.bor..._at_gmail.com>
> wrote:
>
>
>
>
> > EscVector schreef:
>
> > > Well, you did ask... :) Sorry!
>
> > Just being grumpy - sorry.
>
> > > My question would be do they mean 100k*25.
>
> > That would be my understanding, now that I know they meant
> > 2,500,000 records. Which is nothing, really, and loads
> > on my 5 year old PC in about 5 minutes.
>
> > --
> > Regards,
> > Frank van Bortel
>
> > Top-posting is one way to shut me up...
> > -----BEGIN PGP SIGNATURE-----
> > Version: GnuPG v1.4.1 (MingW32)
>
> > iD8DBQFGRwAiLw8L4IAs830RAv/SAJ9HxBh4C3rG6CgMrfsozxtwCY2ERACdGedl
> > bruZtHlDhNatQAF6TH8/LiI=
> > =PRGb
> > -----END PGP SIGNATURE-----
>
> Hello All,
>
> First of all, my apologies for the "Lakh" thing.
>
> I need to use the batch program because I have to process the text
> file data before uploading it to the database.
> Frank, I am not an Oracle person but a Java programmer.
> On the program side I have tried a couple of things within my
> understanding: checked thoroughly for possible memory leaks and
> tightened the file-reading code.
>
> As for "Error02 : Commit slow"... I did not get your point. Are you
> saying that commits should be more frequent?
>
> jg, yes, I am committing in a loop.
>
> yes, 100k*25 is the records per file.
>
> I am connecting from a Dual Core 2.99GHz 1GB machine to another Dual
> Core 2.99GHz 1GB machine, both running WinXP Professional, with
> Oracle9i as the database. I just don't understand what the
> bottleneck could be :-(.
>
> Thank you in advance for any help/directions.
>
> :-)

Some comments/suggestions/ideas:

  1. Use SQL*Loader (sqlldr) if you can. When it comes to bulk loads, this should be your first option. Even if the input file is not very structured, you can in most cases still do some logical record conversion. Of course, you'll need to check the documentation or ask a colleague who knows the tool.
  2. If you really can't use SQL*Loader (why not?) and need to stick with the Java program, then make sure you're using bind variables. I hope you're not dynamically building your INSERT statement (actual values included) for each and every record. That is a sure way to kill your performance.
  3. If #2 above is not the cause, then do send us more info: Statspack, tkprof/SQL trace, the works.
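To give a feel for point 1, a SQL*Loader run is driven by a small control file. The file, table, and column names below are placeholders, not anything from the original poster's schema; treat this as a sketch of the shape, not a ready-to-run job:

```
-- hypothetical control file, load.ctl (names are placeholders)
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name, amount)
```

Invoked roughly as `sqlldr userid=scott/tiger control=load.ctl`; adding `direct=true` switches to a direct-path load, which bypasses much of the conventional INSERT overhead.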
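And to illustrate point 2, here is a minimal Java sketch of the bind-variable pattern combined with JDBC batching and a commit per batch, rather than one commit per row. The table name `t`, its columns, and the batch size of 500 are assumptions for illustration, not the poster's actual code; `main` only prints the reusable statement and the resulting commit count, since no database is available here:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BulkInsert {
    // One parsed statement, reused for every row via bind variables (?).
    static final String SQL = "INSERT INTO t (id, name) VALUES (?, ?)";
    static final int BATCH_SIZE = 500; // assumed batch size for illustration

    // Sketch of the loader: bind values, batch, commit once per batch.
    static void load(Connection conn, java.util.List<String[]> rows) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setLong(1, Long.parseLong(row[0]));
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();
                    conn.commit();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();
                conn.commit();
            }
        }
    }

    public static void main(String[] args) {
        // No database here; just show the reusable statement and batch math.
        System.out.println(SQL);
        System.out.println("commits for 2500000 rows: "
                + (2500000 + BATCH_SIZE - 1) / BATCH_SIZE);
    }
}
```

With 2,500,000 rows and a batch of 500, that is 5,000 commits instead of 2,500,000, and the statement is parsed once instead of once per row.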

HTH. Romeo

Received on Mon May 14 2007 - 04:07:33 CDT

