Re: Analytical query help

From: Charles Hooper <hooperc2000_at_yahoo.com>
Date: Tue, 8 Dec 2009 04:54:35 -0800 (PST)
Message-ID: <cbd42a79-fa3c-471c-81fc-3651216b1ac1_at_d10g2000yqh.googlegroups.com>



On Dec 7, 10:32 pm, Charles Hooper <hooperc2..._at_yahoo.com> wrote:
> On Dec 7, 4:01 pm, RA <ang..._at_gmail.com> wrote:
> > Hi gurus,
>
> > Could anyone show me how to write an analytical query for my
> > requirement.
>
> > Query
> > ============
> > SELECT distinct bug_when,
> >        login_name,
> >        SUM(incoming_count - outgoing_count)
> >         OVER (PARTITION BY login_name ORDER BY bug_when) AS OPEN
> > FROM RM_COMP_INCOM_OUTGO_BUGIDS_GT G,
> >      bz_components c,
> >      bz_profiles p
> > where  G.component_id= c.ID and
> > c.manager = p.userid
> > order by 2,1 desc;
>
> > Query returns data like this
> > ===================
> > 12/04/2009 ssmit 52
> > 12/02/2009 ssmit 48
> > 11/30/2009 ssmit 45
> > 11/29/2009 ssmit 42
>
> > I want the data to be like this
> > ======================
> > 12/04/2009 ssmit 52
> > 12/03/2009 ssmit 48 -- fill the gap with previous value
> > 12/02/2009 ssmit 48
> > 12/01/2009 ssmit 45
> > 11/30/2009 ssmit 45
> > 11/30/2009 ssmit 45
> > 11/29/2009 ssmit 42
>
> > Table Desc
> > ==========
> > CREATE GLOBAL TEMPORARY TABLE RM_COMP_INCOM_OUTGO_BUGIDS_GT
> > (
> >   BUG_WHEN        DATE,
> >   COMPONENT_ID    NUMBER,
> >   INCOMING_COUNT  NUMBER,
> >   OUTGOING_COUNT  NUMBER
> > );
>
> > Regards
> > RA
>
> You might want to take a look at this link, which shows how to supply
> the missing dates: http://hoopercharles.wordpress.com/2009/12/07/sql-filling-in-gaps-in-...
>
> The LAG or LEAD analytic functions may be used to "peek" at the
> previous/next row if the SUM() column produces a NULL value: http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/functi...
>
> You might also want to look through this thread if the LAG analytic
> function does not work - I think that I had a couple interesting posts
> in that thread that involved peeking at previous rows: http://groups.google.com/group/comp.databases.oracle.server/browse_th...

Just a follow-up with a test case, in case my previous post was not clear.

CREATE TABLE T1
(
  BUG_WHEN DATE,
  COMPONENT_ID VARCHAR2(10),
  INCOMING_COUNT NUMBER,
  OUTGOING_COUNT NUMBER
);

INSERT INTO T1 VALUES (TO_DATE('12/07/2009','MM/DD/YYYY'),'ssmit', 60,60);
INSERT INTO T1 VALUES (TO_DATE('12/04/2009','MM/DD/YYYY'),'ssmit', 52,52);
INSERT INTO T1 VALUES (TO_DATE('12/02/2009','MM/DD/YYYY'),'ssmit', 48,48);
INSERT INTO T1 VALUES (TO_DATE('11/30/2009','MM/DD/YYYY'),'ssmit', 45,45);
INSERT INTO T1 VALUES (TO_DATE('11/29/2009','MM/DD/YYYY'),'ssmit', 42,42);
INSERT INTO T1 VALUES (TO_DATE('01/29/2009','MM/DD/YYYY'),'test', 42,42);

COMMIT;

SELECT
  *
FROM
  T1;

BUG_WHEN COMPONENT_ INCOMING_COUNT OUTGOING_COUNT
--------- ---------- -------------- --------------
07-DEC-09 ssmit                  60             60
04-DEC-09 ssmit                  52             52
02-DEC-09 ssmit                  48             48
30-NOV-09 ssmit                  45             45
29-NOV-09 ssmit                  42             42
29-JAN-09 test                   42             42

Let's start by using the LEAD analytic function to peek at the next row in the T1 table:

SELECT
  COMPONENT_ID,
  BUG_WHEN,
  LEAD(BUG_WHEN,1) OVER (PARTITION BY COMPONENT_ID ORDER BY BUG_WHEN) NEXT_BUG_WHEN,
  INCOMING_COUNT
FROM
  T1;

COMPONENT_ BUG_WHEN NEXT_BUG_ INCOMING_COUNT
---------- --------- --------- --------------
ssmit      29-NOV-09 30-NOV-09             42
ssmit      30-NOV-09 02-DEC-09             45
ssmit      02-DEC-09 04-DEC-09             48
ssmit      04-DEC-09 07-DEC-09             52
ssmit      07-DEC-09                       60
test       29-JAN-09                       42

The query above needs to be modified to produce date ranges by subtracting 1 from the next date. We also need to handle the NULL values that appear on the last row of each COMPONENT_ID grouping.

SELECT
  COMPONENT_ID,
  BUG_WHEN,
  NVL(LEAD(BUG_WHEN,1) OVER (PARTITION BY COMPONENT_ID ORDER BY BUG_WHEN)-1,BUG_WHEN) THROUGH_BUG_WHEN,
  NVL(LEAD(BUG_WHEN,1) OVER (PARTITION BY COMPONENT_ID ORDER BY BUG_WHEN)-1,BUG_WHEN)-BUG_WHEN RANGE_DAYS,
  INCOMING_COUNT
FROM
  T1;

COMPONENT_ BUG_WHEN THROUGH_B RANGE_DAYS INCOMING_COUNT
---------- --------- --------- ---------- --------------
ssmit      29-NOV-09 29-NOV-09          0             42
ssmit      30-NOV-09 01-DEC-09          1             45
ssmit      02-DEC-09 03-DEC-09          1             48
ssmit      04-DEC-09 06-DEC-09          2             52
ssmit      07-DEC-09 07-DEC-09          0             60
test       29-JAN-09 29-JAN-09          0             42

Now, all we need to do is generate the missing dates between the BUG_WHEN date and the THROUGH_BUG_WHEN date. If we slide the above query into an inline view and join it to an inline view that acts as a counter generator, we end up with the following:

SELECT
  T1.COMPONENT_ID,
  T1.BUG_WHEN+V1.C BUG_WHEN,
  T1.INCOMING_COUNT
FROM
  (SELECT
    COMPONENT_ID,
    BUG_WHEN,
    NVL(LEAD(BUG_WHEN,1) OVER (PARTITION BY COMPONENT_ID ORDER BY BUG_WHEN)-1,BUG_WHEN) THROUGH_BUG_WHEN,
    NVL(LEAD(BUG_WHEN,1) OVER (PARTITION BY COMPONENT_ID ORDER BY BUG_WHEN)-1,BUG_WHEN)-BUG_WHEN RANGE_DAYS,
    INCOMING_COUNT
  FROM
    T1) T1,
  (SELECT
    ROWNUM-1 C
  FROM
    DUAL
  CONNECT BY
    LEVEL<=100) V1
WHERE
  V1.C<=T1.RANGE_DAYS
ORDER BY
  T1.COMPONENT_ID,
  T1.BUG_WHEN+V1.C DESC;

COMPONENT_ BUG_WHEN INCOMING_COUNT
---------- --------- --------------
ssmit      07-DEC-09             60
ssmit      06-DEC-09             52
ssmit      05-DEC-09             52
ssmit      04-DEC-09             52
ssmit      03-DEC-09             48
ssmit      02-DEC-09             48
ssmit      01-DEC-09             45
ssmit      30-NOV-09             45
ssmit      29-NOV-09             42
test       29-JAN-09             42
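
For anyone without an Oracle instance handy, the same technique can be sketched in another dialect. Below is a minimal Python sketch using SQLite (3.25+ for window functions, which is an assumption on my part - this is not the original Oracle code): a recursive CTE stands in for Oracle's CONNECT BY LEVEL counter generator, and julianday()/date() arithmetic stands in for Oracle's native DATE subtraction. Table and column names mirror the T1 test case above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE T1 (
  BUG_WHEN       TEXT,   -- ISO dates compare correctly as text
  COMPONENT_ID   TEXT,
  INCOMING_COUNT INTEGER,
  OUTGOING_COUNT INTEGER
);
INSERT INTO T1 VALUES
  ('2009-12-07','ssmit',60,60),
  ('2009-12-04','ssmit',52,52),
  ('2009-12-02','ssmit',48,48),
  ('2009-11-30','ssmit',45,45),
  ('2009-11-29','ssmit',42,42),
  ('2009-01-29','test', 42,42);
""")

rows = conn.execute("""
WITH RECURSIVE
V1(C) AS (      -- counter generator, like CONNECT BY LEVEL <= 100
  SELECT 0
  UNION ALL
  SELECT C+1 FROM V1 WHERE C < 99
),
T AS (          -- inline view: LEAD-derived range length per row
  SELECT
    COMPONENT_ID,
    BUG_WHEN,
    julianday(COALESCE(
      date(LEAD(BUG_WHEN) OVER (PARTITION BY COMPONENT_ID
                                ORDER BY BUG_WHEN), '-1 day'),
      BUG_WHEN)) - julianday(BUG_WHEN) AS RANGE_DAYS,
    INCOMING_COUNT
  FROM T1
)
SELECT
  T.COMPONENT_ID,
  date(T.BUG_WHEN, '+' || V1.C || ' days') AS BUG_WHEN,
  T.INCOMING_COUNT
FROM T, V1
WHERE V1.C <= T.RANGE_DAYS
ORDER BY T.COMPONENT_ID, 2 DESC
""").fetchall()

for r in rows:
    print(r)
```

Running this produces the same 10 rows as the Oracle query above: the gap dates (Dec 5, Dec 6, Dec 1, Dec 3) are filled in with the INCOMING_COUNT of the preceding real row.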

Charles Hooper
Co-author of "Expert Oracle Practices: Oracle Database Administration from the Oak Table"
http://hoopercharles.wordpress.com/
IT Manager/Oracle DBA
K&M Machine-Fabricating, Inc.

Received on Tue Dec 08 2009 - 06:54:35 CST