Re: AWR - DB Time by User Trend Report?

From: Kerry Osborne <>
Date: Fri, 11 Sep 2009 17:12:42 -0500
Message-Id: <>


   Here's the basic script, pulled from statspack and then modified to use the equivalent AWR tables instead (it produces the same numbers as the standard AWR report). I modified it to sort by DB time, though, so I could see which periods were busiest in those terms. It uses DBA_HIST_SYS_TIME_MODEL (which is loaded from V$SYS_TIME_MODEL), as mentioned by Lei. That view doesn't have user or session info, though, and neither do the underlying X$ structures (x$kewssmap and x$kewssysv). V$SESS_TIME_MODEL does have session info, so maybe you could pull something from there if you really need finer-grained info (i.e. by user). Anyway, here is the script. (There is a little more info on my blog about this script if you're interested.)

set lines 155
col dbtime for 999,999.99
col begin_timestamp for a40
select * from (
select begin_snap, end_snap, timestamp begin_timestamp, inst, a/1000000/60 DBtime
from (
  select e.snap_id end_snap,
  lag(e.snap_id) over (order by e.snap_id) begin_snap,
  lag(s.end_interval_time) over (order by e.snap_id) timestamp,
  s.instance_number inst,
  nvl(value-lag(value) over (order by e.snap_id),0) a
  from dba_hist_sys_time_model e, DBA_HIST_SNAPSHOT s
  where s.snap_id = e.snap_id
  and e.instance_number = s.instance_number
  and to_char(e.instance_number) like nvl('&instance_number',to_char(e.instance_number))
  and stat_name = 'DB time'
)
where begin_snap between nvl('&begin_snap_id',0) and nvl('&end_snap_id',99999999)
and begin_snap = end_snap - 1
order by dbtime desc
)
where rownum < 31
/
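The pointer to V$SESS_TIME_MODEL for finer-grained (by-user) data could be sketched along these lines. This is a hypothetical illustration, not part of the original script: it joins V$SESS_TIME_MODEL to V$SESSION on SID and sums 'DB time' per username. Note it only reflects sessions connected right now, since this view is not persisted into the AWR history tables.

```sql
-- Hypothetical sketch: current DB time by user (not historical).
-- V$SESS_TIME_MODEL holds per-session time-model stats in microseconds;
-- V$SESSION supplies the username. Background sessions are filtered out.
col username for a30
col dbtime_min for 999,999.99
select n.username,
       sum(t.value)/1000000/60 dbtime_min
from   v$sess_time_model t,
       v$session n
where  t.sid = n.sid
and    t.stat_name = 'DB time'
and    n.username is not null
group  by n.username
order  by dbtime_min desc;
```

Because the values reset when a session disconnects, a by-user *trend* would require sampling this query periodically and storing the deltas yourself.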
Kerry Osborne

On Sep 9, 2009, at 8:13 PM, Jeffery Thomas wrote:

> Hello all -
> I'm composing a few historical reports from the AWR, using this link
> as a guide:
> One report I'm trying to develop is a "DB Time by User" trend report -
> if at all practical - but I'm having
> difficulty determining which DBA_HIST* tables are relevant with
> respect to aggregating by the user.
> Would anyone have a similar report, or at least know what tables may
> contain the necessary information?
> Thanks -
> Jeff
> --

Received on Fri Sep 11 2009 - 17:12:42 CDT
