Designing Database with reporting tables [message #432222]
Mon, 23 November 2009 01:27
virmahi
Hi,
I have to design a database for an online application with many concurrent users. There will be many reports fetching data from the transaction tables. In my previous experience, I found that when transaction tables with lots of data and heavy INSERT/UPDATE activity are also accessed for heavy reporting, the application becomes very slow.
Before designing, I want to sort this issue out first.
I am thinking of creating separate tables for reporting. Flushing data into the reporting tables would cost some performance, but reporting would be faster. The reports can be run at any time, and up-to-date data is expected at any point in time.
The options as I understand them are:
1) Create triggers that insert/update data in the reporting tables whenever the transaction tables are inserted into or updated (a rough sketch of what I mean is below).
2) Create materialized views that refresh the data in the reporting tables as soon as the transaction tables are updated.
Please suggest other, better options, or tell me which of the above would perform better.
Please advise.
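For option 1, this is roughly what I have in mind (just a sketch; ORDERS, ORDER_RPT and the columns are made-up names for illustration):

-- Sketch of option 1: a trigger that keeps a reporting table in step
-- with a transaction table. ORDERS and ORDER_RPT are illustrative names.
CREATE OR REPLACE TRIGGER trg_orders_to_rpt
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  MERGE INTO order_rpt r
  USING (SELECT :NEW.order_id AS order_id,
                :NEW.cust_id  AS cust_id,
                :NEW.amount   AS amount
         FROM   dual) s
  ON (r.order_id = s.order_id)
  WHEN MATCHED THEN
    UPDATE SET r.cust_id = s.cust_id,
               r.amount  = s.amount
  WHEN NOT MATCHED THEN
    INSERT (order_id, cust_id, amount)
    VALUES (s.order_id, s.cust_id, s.amount);
END;
/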
Regards,
Mahi
Re: Designing Database with reporting tables [message #432230 is a reply to message #432226]
Mon, 23 November 2009 02:29
virmahi
Hi Michel,
Thanks for your suggestion. I have not worked with materialized views, only read about them, so I have one question:
Will (refreshing the materialized views + selecting data from the materialized views) be faster than selecting data from the transaction tables directly? Won't the reports be slower, since refreshing the MVs will take some time? I am not very familiar with MVs, which is why I ask.
Regards,
Mahi
Re: Designing Database with reporting tables [message #432233 is a reply to message #432231]
Mon, 23 November 2009 02:49
virmahi
Michel,
Your answers are correct, but I'm now confused about which would be the best approach when designing the database.
Please let me know if you have any other suggestions, as performance is of prime concern in this application since it will be online. Both DML and reporting need to stay fast, hand in hand.
Re: Designing Database with reporting tables [message #432259 is a reply to message #432233]
Mon, 23 November 2009 05:09
JRowbottom
One solution that I've seen used to great effect is this.
Create a standby database based on your live database. Assuming you run your reports overnight, then every evening you recover the standby database to match the live system.
You then open the standby database as read only, run the reports, and when you've finished, return the database to being a standby.
This lets the live database run its overnight batch jobs without the I/O load caused by the reports.
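On a physical standby, the nightly switch might look roughly like this (a sketch only; the exact steps depend on your Oracle version and Data Guard configuration):

-- On the standby, once it has applied the logs from the live system
-- (sketch; exact commands vary by version and Data Guard setup)
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
ALTER DATABASE OPEN READ ONLY;

-- ... run the reports against the read-only standby ...

-- return the database to its standby role
SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;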
Re: Designing Database with reporting tables [message #432260 is a reply to message #432222]
Mon, 23 November 2009 05:14
JRowbottom
Quote: I found that when transaction tables with lots of data and heavy INSERT/UPDATE activity are also accessed for heavy reporting, the application becomes very slow.
In some databases, updates and inserts can block selects, or lead to inaccurate results from selects.
This is not the case with Oracle - Oracle guarantees that the results that you get from a query will show all of the committed changes present at the time that the query was run, and none of the changes committed afterwards, and your query will not wait on any other changes being committed.
That being said, if you are trying to do a lot of inserts/updates to tables, and are also trying to read a lot of data from the same tables at the same time, you may well hit the I/O limits of your system, and that will cause things to slow down.
There's not a great deal you can do about that.
Materialized views can help, as you are then reading data from one single pre-built source rather than from many tables (see the sketch below).
Query tuning can help reduce the I/O caused by your queries.
But at the end of the day, running lots of reports at the same time as your overnight batch jobs will cause things to run more slowly.
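For example, a report that joins and aggregates several transaction tables could instead read one pre-built summary (a sketch; the table names and the refresh schedule are illustrative, not from this thread):

-- Sketch: pre-join and pre-aggregate so reports read a single source.
-- ORDERS and CUSTOMERS are illustrative names.
CREATE MATERIALIZED VIEW sales_summary_mv
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT c.region,
       TRUNC(o.order_date) AS order_day,
       COUNT(*)            AS order_cnt,
       SUM(o.amount)       AS total_amount
FROM   orders o, customers c
WHERE  c.cust_id = o.cust_id
GROUP  BY c.region, TRUNC(o.order_date);

-- refresh it between batch runs, or on a schedule
EXEC DBMS_MVIEW.REFRESH('SALES_SUMMARY_MV');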
Re: Designing Database with reporting tables [message #432287 is a reply to message #432275]
Mon, 23 November 2009 07:51
JRowbottom
It depends a lot on your system.
If your reports use simple queries, then it should be possible to create ON COMMIT refresh materialized views that would provide a real-time set of data to query against; your reports would run faster, but everything else would run slightly slower.
Otherwise, I think you're stuck with the performance hit of running the queries.
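A simple ON COMMIT setup might look roughly like this (a sketch only; ORDERS is an illustrative table, and fast refresh on commit has restrictions that are documented in the Data Warehousing Guide):

-- Sketch: an aggregate MV that is fast-refreshed on commit.
-- The MV log and the extra COUNT columns are needed for fast refresh.
CREATE MATERIALIZED VIEW LOG ON orders
  WITH ROWID, SEQUENCE (cust_id, amount)
  INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW orders_by_cust_mv
BUILD IMMEDIATE
REFRESH FAST ON COMMIT
AS
SELECT cust_id,
       COUNT(*)      AS order_cnt,
       COUNT(amount) AS amount_cnt,
       SUM(amount)   AS total_amount
FROM   orders
GROUP  BY cust_id;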