Enforcing a processing time limit (9i)
Enforcing a processing time limit [message #275999] Tue, 23 October 2007 10:49
mf_prog
Messages: 3
Registered: October 2007
Junior Member
I've been asked to automate a report that users request regularly. Due to the nature of the data, some reports take a couple of seconds, others take a few minutes, and a couple take an eternity.

The report will be requested from a web page and the contents have to be returned into a SYS_REFCURSOR.

I've been told I shouldn't have the query hammering the database, so I thought I'd introduce a FOR LOOP and CURSOR on the main table in my query. As the report builds, I'd check how long it had been running and, if more than say 30 seconds, I'd exit.

The web page would be sent info to tell it that the query had not completed, and it would then offer the user a new "get more data" button to press. If selected, the next time in I'd run a slightly different query on my main table and continue processing where I had left off.

The problem with this approach is that, for each row in my main table, I do the OPEN FOR on the SYS_REFCURSOR, so I end up with only the details associated with the last row I read from the main table.
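
Roughly what I have at the moment, as a minimal sketch; the tables (clients, client_details) and column names are made-up stand-ins for the real ones:

CREATE OR REPLACE PROCEDURE get_report (
    p_manager_client IN  VARCHAR2,
    p_resume_key     IN  NUMBER,          -- last client processed on the previous call
    p_timed_out      OUT NUMBER,          -- 1 = stopped early, web page offers "get more data"
    p_last_key       OUT NUMBER,          -- passed back in as p_resume_key next time
    p_results        OUT SYS_REFCURSOR
) AS
    l_start  PLS_INTEGER := DBMS_UTILITY.GET_TIME;   -- hundredths of a second
BEGIN
    p_timed_out := 0;
    FOR r IN (SELECT client_id
                FROM clients
               WHERE manager_client = p_manager_client
                 AND client_id > p_resume_key
               ORDER BY client_id)
    LOOP
        -- this is where it goes wrong: each OPEN FOR discards the previous
        -- result set, so the web page only ever sees the last client's details
        OPEN p_results FOR
            SELECT *
              FROM client_details
             WHERE client_id = r.client_id;

        p_last_key := r.client_id;

        IF (DBMS_UTILITY.GET_TIME - l_start) / 100 > 30 THEN
            p_timed_out := 1;
            EXIT;
        END IF;
    END LOOP;
END;
/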

My solution is a typical CICS COBOL-style way of getting the user to view all the data.

First, can anyone suggest a different approach so that these reports can be requested and viewed online, i.e. no overnight batch work?

If not, can you suggest a way of building the report and ensuring that the web page gets sent all the data found?
Re: Enforcing a processing time limit [message #276021 is a reply to message #275999] Tue, 23 October 2007 12:29
MarcL
Messages: 455
Registered: November 2006
Location: Connecticut, USA
Senior Member
Well, one workaround I can come up with is to insert your ref cursor data into a table, along these lines.
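
Just a sketch; report_stage, request_id and the columns are all invented for the example:

CREATE TABLE report_stage (
    request_id  NUMBER,
    client_id   NUMBER,
    detail_txt  VARCHAR2(4000)
);

CREATE OR REPLACE PROCEDURE get_report_staged (
    p_request_id     IN  NUMBER,
    p_manager_client IN  VARCHAR2,
    p_results        OUT SYS_REFCURSOR
) AS
BEGIN
    -- build the whole report into the staging table first...
    INSERT INTO report_stage (request_id, client_id, detail_txt)
        SELECT p_request_id, d.client_id, d.detail_txt
          FROM clients c, client_details d
         WHERE d.client_id = c.client_id
           AND c.manager_client = p_manager_client;

    -- ...then hand the lot back to the web page through a single ref cursor
    OPEN p_results FOR
        SELECT client_id, detail_txt
          FROM report_stage
         WHERE request_id = p_request_id
         ORDER BY client_id;
END;
/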
Re: Enforcing a processing time limit [message #276023 is a reply to message #276021] Tue, 23 October 2007 12:32
mf_prog
Messages: 3
Registered: October 2007
Junior Member
Do you want to give the DBA a coronary?
Re: Enforcing a processing time limit [message #276026 is a reply to message #276023] Tue, 23 October 2007 12:36
MarcL
Messages: 455
Registered: November 2006
Location: Connecticut, USA
Senior Member
mf_prog wrote on Tue, 23 October 2007 12:32

Do you want to give the DBA a coronary?



It won't be the first time in my career, and I'm sure it won't be the last Laughing
Re: Enforcing a processing time limit [message #276038 is a reply to message #276026] Tue, 23 October 2007 13:20
andrew again
Messages: 2577
Registered: March 2000
Senior Member
I guess it goes without saying that tuning your queries to the max is the best approach, because your problem may go away by itself. Limiting resources needs to be done with great care to avoid making the overall load worse (due to a user simply re-submitting the same report until it finally succeeds). Make sure that the user can't enter search criteria that are too "wild".

http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:556622931374


Using dbms_application_info, you can also limit the number of concurrent reports or whatever without even using resource profiles.
--pseudo code, fleshed out into a runnable anonymous block...
declare
  v_dummy pls_integer;
begin
  -- how many sessions are already flagged as running the big report?
  select count(*) into v_dummy from v$session where client_info = 'my big report';

  -- allow max of 3 copies of "my big report"
  if v_dummy >= 3 then
    -- msg to user "please try later"
    dbms_output.put_line('please try later');
  else
    --dbms_application_info.set_module( 'my-module', 'my-action' );
    dbms_application_info.set_client_info( 'my big report' );

    -- run query now...

    -- done
    dbms_application_info.set_client_info( 'done' );
  end if;
exception
  when others then
    dbms_application_info.set_client_info( substr(sqlerrm, 1, 30) );
end;
/

Re: Enforcing a processing time limit [message #276174 is a reply to message #276038] Wed, 24 October 2007 04:52
mf_prog
Messages: 3
Registered: October 2007
Junior Member
Thanks for that. It looks like something I might have a use for in the future, but in this case I'm not actually overly concerned about there being more than one concurrent request for this particular report, as the user said they will not be requesting more than one report at a time. And I believe him, as I trust users completely Laughing

The main input criteria for the report are the manager client and a date range that I've limited to 12 months.

The 'main table' contains a list of clients and the identity of their manager client. For the vast majority, the client is its own manager client, or the manager client has only a small number of clients. There are, however, a few manager clients that have many hundreds of clients (one with 5000+, a few with 1500, 1000 and a few hundred), and these are the ones that could potentially cause a problem.

Having said that, some of these large manager clients have a lot of clients who aren't generating much data, so the report for them is fairly quick. However, some clients are incredibly busy and can take a long time to query. Put four or five of these under a single manager client and you can see how the query could take too long, even for a 'small' manager client.

I'm exploring the possibility of using table functions as described in Feuerstein's Oracle PL/SQL Programming book, chapter 17, along the lines of the sketch below.
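
This is the rough shape of what I'm thinking of; only a sketch, with the types, names and the query all invented for the example:

CREATE OR REPLACE TYPE report_row_t AS OBJECT (
    client_id   NUMBER,
    detail_txt  VARCHAR2(4000)
);
/
CREATE OR REPLACE TYPE report_tab_t AS TABLE OF report_row_t;
/

CREATE OR REPLACE FUNCTION report_rows (
    p_manager_client IN VARCHAR2
) RETURN report_tab_t PIPELINED
AS
BEGIN
    FOR r IN (SELECT d.client_id, d.detail_txt
                FROM clients c, client_details d
               WHERE d.client_id = c.client_id
                 AND c.manager_client = p_manager_client)
    LOOP
        -- each row is handed back as soon as it is built, so the caller
        -- can start fetching before the whole report has been produced
        PIPE ROW (report_row_t(r.client_id, r.detail_txt));
    END LOOP;
    RETURN;
END;
/

-- the ref cursor can then be opened straight over the function:
-- OPEN p_results FOR
--     SELECT * FROM TABLE(report_rows(:manager_client));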

p.s. Part of my plan was to ask the web developer to prevent the user from asking for 'more data' more than three times. He'll give them a message to contact us and we'll run the job against the database clone. Yes, we have a day-old copy of the live data, but the web application is not currently allowed to talk to it. These things are sent to try us!