Need help with Sql query [message #301643] Thu, 21 February 2008 04:00
leelaraj
Messages: 3
Registered: February 2008
Junior Member
I have a table like this
==
create table test_101 (
item_id number,
group_id number);
==
and data like this
==
1001,1
1002,1
1001,2
1002,2
1001,null
1002,null
1002,3
1004,3
1005,4
1006,4
1003,null
1004,null
1007,null
1008,5
1009,5
==
I have a program that spawns multiple sub-processes, and each sub-process needs to work on a set of items where
- the items belonging to one group are not processed by different sub-processes
- all records belonging to the same item are processed by the same sub-process

So, I need sets like this

Item,Set
==========
1001,1
1002,1
1004,1
1005,2
1006,2
1008,3
1009,3
1003,4
1007,5

Is it possible to achieve this using a single SQL query (with or without analytical functions)?

Appreciate your help.
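
As an aside, one way to make the first constraint concrete is as a check query. This is only a sketch: test_101_sets(item_id, set_no) is a hypothetical table holding a candidate item-to-set assignment, not something from the thread.

-- Hypothetical check: a candidate assignment violates the requirement
-- if any group's items are spread over more than one set.
SELECT t.group_id
  FROM test_101 t
  JOIN test_101_sets s ON s.item_id = t.item_id
 WHERE t.group_id IS NOT NULL
 GROUP BY t.group_id
HAVING COUNT(DISTINCT s.set_no) > 1;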
Re: Need help with Sql query [message #301647 is a reply to message #301643] Thu, 21 February 2008 04:06
Michel Cadot
Messages: 64102
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Quote:
Is it possible to achieve this using a single SQL query

What do you mean?
Do you want a query that sets a process number for each item?
You can use something like MOD(DENSE_RANK(item_id)...,<nb_processes>).

Regards
Michel
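
A minimal sketch of the kind of expression Michel suggests might look like this (the bind variable :nb_processes is an assumption standing in for the number of sub-processes; note that this only spreads distinct items evenly and does not keep a group's items together):

-- Sketch only: assigns each item a process number 0 .. :nb_processes - 1.
SELECT item_id,
       MOD(DENSE_RANK() OVER (ORDER BY item_id), :nb_processes) AS process_no
  FROM test_101;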
Re: Need help with Sql query [message #301655 is a reply to message #301647] Thu, 21 February 2008 04:39
leelaraj
Messages: 3
Registered: February 2008
Junior Member
Hi Michel,

No, my requirement is not to set a process number for each item.

Instead, I want to assign a unique number to each set of items such that
- a set contains all items belonging to one group (see the table definition of test_101 and the sample data)
- an item does not belong to more than one set.

For example, if the data is
==
Item,group
1001,2
1002,2
1002,3
1004,3
1001,null
==
then the items 1001, 1002 and 1004 have to be processed by the same sub-process, and hence I want to give the same set number to 1001, 1002 and 1004.

Hope I am clear.
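
To make the overlap concrete, here is a small sketch against the test_101 table from the first post: any two groups that share an item have to end up in the same set.

-- Sketch: list pairs of groups that share at least one item;
-- such groups must be merged into a single set.
SELECT DISTINCT a.group_id AS grp, b.group_id AS overlapping_grp
  FROM test_101 a
  JOIN test_101 b
    ON b.item_id  = a.item_id
   AND b.group_id > a.group_id   -- report each pair once
 WHERE a.group_id IS NOT NULL;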
Re: Need help with Sql query [message #301789 is a reply to message #301643] Thu, 21 February 2008 14:07
coleing
Messages: 213
Registered: February 2008
Senior Member
So in your latest example, item 1002 is in 2 groups.

Are you saying that in order to process item 1001, you get all items in its group (1002), and then, because 1002 is also in group 3, you need all items in group 3 as well (1004)?
Re: Need help with Sql query [message #301790 is a reply to message #301789] Thu, 21 February 2008 14:12
Michel Cadot
Messages: 64102
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
Yes, I think this is the case, and this is why I moved it from the newbie to the expert forum.
I remember this "recursive grouping" has been asked somewhere (AskTom or Usenet) months (or more likely a couple of years) ago. I remember the answer was complex, but unfortunately I don't remember the solution.
All I can tell is that a solution exists and it is out there on the web.

Regards
Michel
Re: Need help with Sql query [message #301826 is a reply to message #301789] Thu, 21 February 2008 22:08
leelaraj
Messages: 3
Registered: February 2008
Junior Member
Hi Coleing,

Yes. Your understanding is correct.
Re: Need help with Sql query [message #301867 is a reply to message #301643] Fri, 22 February 2008 01:39
coleing
Messages: 213
Registered: February 2008
Senior Member
Ok,

This needs quite a complex solution.

First, find all the groups whose items appear only in that one group, and give each of those groups a set number.

Then see how many are left to decide on your next action. If there are only a few, could you consider just putting them all in the same processing group?

If the volume is large, I won't write the code (as I'm off to work now), but could you do this in principle?

For each item, record its first group number, the count of groups it is in, and a set number in a temp table (or even just an inline view).

Then join the original set to this new set by item number. If the item you are looking at has a different group number from the new temp one (the first group for the item), put all items in the current group into that original set number.
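
A rough sketch of the first step coleing describes, using the test_101 table from the first post (this is an interpretation, not tested code from the thread): groups none of whose items also belong to another group can be numbered straight away.

-- Step 1 sketch: groups whose items appear in no other group get a set
-- number immediately; the remaining groups need the merging step above.
SELECT DISTINCT t.item_id,
       DENSE_RANK() OVER (ORDER BY t.group_id) AS set_no
  FROM test_101 t
 WHERE t.group_id IS NOT NULL
   AND NOT EXISTS (
         SELECT 1
           FROM test_101 a          -- any row of the same group
           JOIN test_101 b          -- the same item in a different group
             ON b.item_id = a.item_id
          WHERE a.group_id = t.group_id
            AND b.group_id IS NOT NULL
            AND b.group_id <> t.group_id);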
Re: Need help with Sql query [message #302439 is a reply to message #301643] Mon, 25 February 2008 12:08
Frank_Zhou
Messages: 5
Registered: February 2008
Location: Braintree , MA
Junior Member
I think this might be a SQL solution for this question.
It is based on one of my SQL queries on the oraqa website.
http://oraqa.com/2007/09/28/how-to-simulate-this-special-recursive-update-and-merge-of-data-in-sql/

Since no insert statements were provided, I will just reuse my old test data.

create table  T (C1 varchar2(10), C2 varchar2(10));
insert into  T values('P501',      'C101');
insert into  T values('P501',      'C102');
insert into  T values('P502',      'C105');
insert into  T values('P503',      'C102');
insert into  T values('P503',      'C103');
insert into  T values('P503',      'C104');
insert into  T values('P504',      'C105');
insert into  T values('P504',      'C106');
insert into  T values('P505',      'C106');
insert into  T values('P505',      'C107');
insert into  T values('P506',      'C108');

-------------------SQL Solution---------------------------- 
 
SELECT c2 AS item, DENSE_RANK() OVER (ORDER BY c1) AS sets
FROM (
  SELECT DISTINCT
         MIN(substr(str, 1,
                    CASE WHEN instr(str, ',') > 0
                         THEN instr(str, ',') - 1
                         ELSE length(str)
                    END)) OVER (PARTITION BY c1) c1,
         c2
    FROM
    (SELECT c1, c2, path, ltrim(sys_connect_by_path(c1, ','), ',') str
       FROM
       (SELECT c1, c2, rn, ltrim(sys_connect_by_path(c1, ','), ',') path
          FROM (SELECT c1, c2, row_number() OVER (PARTITION BY c2 ORDER BY c1) rn
                  FROM t)
         START WITH rn = 1
         CONNECT BY c2 = PRIOR c2
                AND PRIOR rn = rn - 1
       )
      CONNECT BY PRIOR substr(path, instr(path, ',', -1) + 1) =
                       substr(path, 0, instr(path, ',', 1) - 1)
             AND c1 > PRIOR c1
    )
);

ITEM             SETS                                                           
---------- ----------                                                           
C101                1                                                           
C102                1                                                           
C103                1                                                           
C104                1                                                           
C105                2                                                           
C106                2                                                           
C107                2                                                           
C108                3       
Regards,

Frank

Re: Need help with Sql query [message #320454 is a reply to message #302439] Thu, 15 May 2008 03:01
mvudugul
Messages: 1
Registered: May 2008
Junior Member
I have added two more rows to table T.
insert into T values('P506', 'C104');
insert into T values('P504', 'C108');

When I run the SQL statement, it returns the following output:

ITEM SETS
C101 1
C108 1
C104 1
C102 1
C103 1
C108 2
C107 2
C106 2
C105 2

C108 is appearing twice, with the following values:
C108 1
C108 2

Our requirement is that it should appear only once, with the following value:
C108 1

and the final output should be as follows:
ITEM SETS
C101 1
C108 1
C104 1
C102 1
C103 1
C107 1
C106 1
C105 1

Please advise on the changes required to the original SQL statement.

-Thanks, Mahesh.