Oracle FAQ | Your Portal to the Oracle Knowledge Grid |
c.d.o.misc -> Re: Need help: degrading performance in record updates
AkshayAOI_at_gmail.com wrote:
> We are facing a problem in our batch application.
> This Java-based application reads a file with 25 lakh records and
> updates them in an Oracle DB.
Error 1: not using the tools provided - use SQL*Loader!
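To make the SQL*Loader suggestion concrete: since SQL*Loader only inserts, a common pattern for bulk *updates* is to direct-path load the file into a staging table and then apply one set-based MERGE. A minimal sketch — all table, column, and file names here are hypothetical, and the layout assumes a two-column comma-separated file:

```
-- records.ctl (hypothetical control file)
LOAD DATA
INFILE 'records.dat'
APPEND
INTO TABLE staging_records
FIELDS TERMINATED BY ','
(record_id, record_value)
```

Invoked with something like `sqlldr userid=... control=records.ctl direct=true`; direct-path loading bypasses most of the conventional SQL engine and redo overhead. The staging table is then folded into the target in a single statement:

```
MERGE INTO target_table t
USING staging_records s
ON (t.record_id = s.record_id)
WHEN MATCHED THEN
  UPDATE SET t.record_value = s.record_value;
```

One MERGE over 2.5 million staged rows will normally beat 2.5 million single-row UPDATEs by a wide margin, because the work is done set-wise inside the database.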
What is a "lakh" record, anyway? (A lakh is 100,000, so 25 lakh is 2.5 million rows.)
> The batch is committed after every 300 records.
Error 2: committing that frequently is slow!
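The arithmetic behind this complaint: committing every 300 rows means roughly 8,300 commits for 2.5 million rows, and every commit forces a `log file sync` wait. If the JDBC path must be kept, batching the statements and committing far less often already helps. A sketch only — table name, column names, and the batch/commit sizes below are illustrative, not tuned values:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch: batches UPDATE statements and commits every 50,000 rows
// instead of every 300. Assumes a caller-supplied open Connection.
public final class BatchUpdater {
    private static final int BATCH_SIZE = 1000;     // rows per executeBatch()
    private static final int COMMIT_EVERY = 50_000; // rows per commit

    public static void update(Connection conn, Iterable<String[]> rows)
            throws SQLException {
        conn.setAutoCommit(false);
        String sql = "UPDATE target_table SET record_value = ? WHERE record_id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0, sinceCommit = 0;
            for (String[] row : rows) {
                ps.setString(1, row[1]);  // new value
                ps.setString(2, row[0]);  // key
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();    // one round trip per 1,000 rows
                    pending = 0;
                }
                if (++sinceCommit == COMMIT_EVERY) {
                    conn.commit();        // ~50 commits total, not ~8,300
                    sinceCommit = 0;
                }
            }
            if (pending > 0) ps.executeBatch();
            conn.commit();
        }
    }
}
```

`addBatch()`/`executeBatch()` cut the per-row network round trips, and the larger commit interval cuts the log-sync waits; both are standard JDBC, nothing Oracle-specific.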
> We are taking a timestamp after every 10,000 records.
So what?
>
> What we have observed is that initially it was taking some 3-4 seconds to
> update 10,000 records.
Say what? What platform do you run on - a Pentium I class machine
with 640 kB of memory?
> The time required is increasing progressively.
> After 12 lakh records, it is now taking almost 10+ minutes to update
> 10,000 records.
>
> I have failed to pin down the cause of such continuously degrading
> performance.
You have done nothing but trial and error, and then posted this.
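Instead of trial and error, the wait-event statistics usually name the culprit directly. A sketch of the kind of query meant here (requires SELECT privilege on the V$ views; run it while the batch is executing):

```
-- Progressive slowdown typically surfaces as 'log file sync'
-- (too-frequent commits) or 'db file scattered read'
-- (full table scans from a missing index on the update key).
SELECT event, total_waits, time_waited
FROM   v$system_event
WHERE  event NOT LIKE 'SQL*Net%'
ORDER  BY time_waited DESC;
```

Checking the execution plan of the UPDATE statement itself is the other obvious step: if the WHERE-clause column is unindexed, every one of the 2.5 million updates scans the table, and the scans get slower as the table grows.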
> This is important for us because we have 16 such files to upload, and
> this upload is just data preparation for the analysis that we need to
> perform on a bug observed in production.
>
> I think the redo log size of Oracle should not be a problem, since we
> are committing every 300 records.
>
--
Regards,
Frank van Bortel

Top-posting is one way to shut me up...

Received on Fri May 11 2007 - 12:42:27 CDT