
Re: performance on delete the rows from a huge table

From: DA Morgan <damorgan_at_x.washington.edu>
Date: Fri, 18 Mar 2005 20:58:40 -0800
Message-ID: <1111208121.401828@yasure>


linda_shi10_at_yahoo.com wrote:

> I am new to Oracle. Please advise! Thanks.
> I have a table which contains about 20-30 million rows. I need to
> delete some 'duplicate' rows by certain rules, so I created a procedure
> to try to do the job.
>
> my query example:
>
> delete from A m
> where m.x5 not in
>       (select max(n.x5) from A n
>        where m.x1 = n.x1 and
>              m.x2 = n.x2 and
>              m.x3 = n.x3 and
>              m.x4 = n.x4)
> ;
> I need to commit after every 2000 rows are processed, say with rownum <= 2000.
>
> The problem is that the table A contains too many rows. It seems that
> it will take forever to do the job. How can I improve the query
> performance?
>
> Thanks,
> Linda

Far too complex.

  1. Design a constraint that will fail to enable when duplicates cause a constraint violation.
  2. Go to http://www.psoug.org, click on Constraints, and scroll down to the Enable Constraints demo and its use of the EXCEPTIONS INTO clause.

Then delete by ROWID. On a well-configured system there should be no need for incremental commits when deleting by ROWID.
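A minimal sketch of that approach, assuming the exceptions table layout created by Oracle's utlexcpt.sql script, a made-up constraint name A_UK, and Linda's rule of keeping the max(x5) row in each (x1, x2, x3, x4) group:

  -- Exceptions table as created by $ORACLE_HOME/rdbms/admin/utlexcpt.sql
  CREATE TABLE exceptions (
    row_id      ROWID,
    owner       VARCHAR2(30),
    table_name  VARCHAR2(30),
    constraint  VARCHAR2(30)
  );

  -- Attempt to add a unique constraint on the de-duplication key.
  -- The ALTER fails because of the duplicates, but the ROWID of every
  -- offending row is written to the EXCEPTIONS table.
  ALTER TABLE a
    ADD CONSTRAINT a_uk UNIQUE (x1, x2, x3, x4)
    EXCEPTIONS INTO exceptions;

  -- Delete by ROWID, keeping the max(x5) row in each duplicate group.
  DELETE FROM a
   WHERE rowid IN (SELECT row_id FROM exceptions)
     AND x5 <> (SELECT MAX(n.x5)
                  FROM a n
                 WHERE n.x1 = a.x1
                   AND n.x2 = a.x2
                   AND n.x3 = a.x3
                   AND n.x4 = a.x4);
  COMMIT;

  -- Once the duplicates are gone, the same ALTER TABLE ... ADD CONSTRAINT
  -- can be re-run and will succeed.

Because the DELETE is driven by ROWIDs already collected in EXCEPTIONS, it only visits the duplicate rows, which is why incremental commits should not be needed.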

-- 
Daniel A. Morgan
University of Washington
damorgan_at_x.washington.edu
(replace 'x' with 'u' to respond)
Received on Fri Mar 18 2005 - 22:58:40 CST
