how to skip duplicate records in a procedure from processing [message #214736] Wed, 17 January 2007 18:40
amankhan
Messages: 69
Registered: December 2005
Location: Texas
Member
Hi All,

I have duplicate records in my table. In my procedure, I have to process only one row from each set of duplicates and leave the remaining rows unprocessed.

For example:

EMPNO  ENAME  DEPTNO  DNAME  SAL
1128   SMITH  140     MARKT  200
1128   SMITH  140     MARKT  400
1128   SMITH  140     MARKT  600
1160   DARON  150     BUSSN  300
1160   DARON  150     BUSSN  500
1196   DAYLE  180     SPORT  800

In my procedure, I need to process only

1128   SMITH  140     MARKT  200
1160   DARON  150     BUSSN  300
1196   DAYLE  180     SPORT  800

and skip the remaining rows. Instead of deleting the duplicate rows and then accessing the table, how can I handle the duplication inside the procedure? I will write a cursor, open it, and process the first row. When the cursor fetches the second row, since it is a duplicate in terms of EMPNO, I don't want to process it; I want to skip to the 4th record, process it, skip the 5th as it is also a duplicate, and then process the 6th.

I will also write some validations to check whether EMPNO and DEPTNO exist in the database and whether the two are linked to each other, and then process the row.

Any help on how to handle this in a procedure is appreciated.

Thanks


Re: how to skip duplicate records in a procedure from processing [message #214741 is a reply to message #214736] Wed, 17 January 2007 19:33
rleishman
Messages: 3724
Registered: October 2005
Location: Melbourne, Australia
Senior Member
If you never actually get any duplicates, but you just want to make sure because uniqueness is not enforced, you could sort the cursor by the key, compare the key on each row to the previous row, and bypass processing if the keys are the same.
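A minimal sketch of that first approach, assuming the data sits in a table called EMP_STAGE with the columns from the example above (the table name and the processing call are placeholders):

DECLARE
  CURSOR c_emp IS
    SELECT empno, ename, deptno, dname, sal
    FROM   emp_stage              -- placeholder table name
    ORDER  BY empno, sal;         -- sort by the key so duplicates arrive together
  l_prev_empno emp_stage.empno%TYPE;
BEGIN
  FOR r IN c_emp LOOP
    -- process the row only when its key differs from the previous row's key
    IF l_prev_empno IS NULL OR r.empno != l_prev_empno THEN
      NULL;  -- your validations and real processing go here
    END IF;
    l_prev_empno := r.empno;
  END LOOP;
END;
/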

Alternatively, you could do it in the cursor, which would be more efficient if there are a large number of duplicates, or if you want to use BULK COLLECT / FORALL.
SELECT *
FROM tab
WHERE ROWID IN (
  SELECT min(ROWID) KEEP (DENSE_RANK FIRST ORDER BY keycol)
  FROM   tab
  GROUP BY keycol
)
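And a rough BULK COLLECT / FORALL sketch built on that query, adapted to the column names in the original post (EMPNO as the key, ordering by SAL so the lowest-salary row is the one kept, as in the desired output; EMP_STAGE and EMP_PROCESSED are made-up table names):

DECLARE
  TYPE t_emp_tab IS TABLE OF emp_stage%ROWTYPE;
  l_rows t_emp_tab;
BEGIN
  SELECT *
  BULK COLLECT INTO l_rows
  FROM   emp_stage
  WHERE  ROWID IN (
           SELECT MIN(ROWID) KEEP (DENSE_RANK FIRST ORDER BY sal)
           FROM   emp_stage
           GROUP  BY empno
         );

  -- FORALL drives a single bulk DML statement, e.g. loading the de-duplicated rows
  FORALL i IN 1 .. l_rows.COUNT
    INSERT INTO emp_processed VALUES l_rows(i);
END;
/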


Ross Leishman
Re: how to skip duplicate records in a procedure from processing [message #214765 is a reply to message #214741] Thu, 18 January 2007 00:26
fortunethiyagu
Messages: 94
Registered: December 2006
Member

You can also use the DISTINCT option to filter out duplicates.

select 
 distinct
   EMPNO,
   ENAME,
   DEPTNO,
   DNAME,
   SAL
 from employee;


- thiyagarajan palanisamy


Re: how to skip duplicate records in a procedure from processing [message #214843 is a reply to message #214765] Thu, 18 January 2007 04:36
JRowbottom
Messages: 5933
Registered: June 2006
Location: Sunny North Yorkshire, ho...
Senior Member
Unfortunately, as you can see from the OP, the records are duplicates on all but one field, and so a DISTINCT will bring back the entire set of records.
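For the OP's data, where the rows differ in SAL, one option is an analytic function that numbers the rows within each EMPNO and keeps only the first (a sketch along the same lines as the KEEP query above, assuming a table called EMP_STAGE and taking "first" to mean the lowest SAL):

SELECT empno, ename, deptno, dname, sal
FROM  (SELECT e.*,
              ROW_NUMBER() OVER (PARTITION BY empno ORDER BY sal) AS rn
       FROM   emp_stage e)
WHERE rn = 1;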
Re: how to skip duplicate records in a procedure from processing [message #214878 is a reply to message #214736] Thu, 18 January 2007 05:48
bhupinderbs
Messages: 67
Registered: July 2006
Location: Bangalore
Member
Hi, friend!

Here are a couple of other options to answer your query.


CREATE TABLE MASTER1
(
MS1_COL1 NUMBER(5),
MS1_COL2 VARCHAR2(50)
)
/
INSERT INTO MASTER1 VALUES(10,'Master 1 values')
/
INSERT INTO MASTER1 VALUES(10,'Master 1 values')
/
INSERT INTO MASTER1 VALUES(20,'Master 1 values two')
/
INSERT INTO MASTER1 VALUES(20,'Master 1 values two')
/
COMMIT;
/* Option 1 */
select * from master1 a
where a.rowid IN (select max(b.rowid) from master1 b
where a.MS1_COL1 = b.MS1_COL1)
/
/* Option 2 */
select distinct MS1_COL1,MS1_COL2 from master1
/

Bye!!
Bhupinder Singh