Date: Tue, 11 Mar 2008 07:47:16 -0700 (PDT)
I am creating a crosstab view that presents row-level information in a columnar format. The view uses dynamic SQL to generate the CASE statements for each of these columns. The input table can have 120M rows, which need to be pivoted into 200K rows of 600 columns each. My question is about the runtime performance of such a view.
This is a data warehouse application, and the resulting view will be used in queries and reports. Will evaluating 600 CASE statements on the fly every time a report runs be a performance hog?
The other option I have is to create a table with 600 columns, load it once with the pivoted values, and then use that result table in the reports.
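For readers unfamiliar with the two approaches being compared, here is a minimal sketch of the pattern, scaled down to 3 attributes instead of 600, using SQLite via Python. The table and column names (`facts`, `id`, `attr`, `value`) are hypothetical stand-ins, not the poster's actual schema:

```python
import sqlite3

# Miniature of the problem: a tall table of (id, attr, value) rows
# pivoted into one wide row per id with one column per attribute.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE facts (id INTEGER, attr TEXT, value INTEGER);
    INSERT INTO facts VALUES
        (1, 'a', 10), (1, 'b', 20), (1, 'c', 30),
        (2, 'a', 40), (2, 'b', 50), (2, 'c', 60);
""")

# Option 1: pivot on the fly with dynamically generated CASE expressions,
# which is what the proposed view would do on every query.
attrs = ["a", "b", "c"]  # in the real case this list has 600 entries
cases = ",\n    ".join(
    f"MAX(CASE WHEN attr = '{a}' THEN value END) AS col_{a}" for a in attrs
)
pivot_sql = f"SELECT id,\n    {cases}\nFROM facts GROUP BY id ORDER BY id"
rows = conn.execute(pivot_sql).fetchall()
print(rows)  # one wide row per id

# Option 2: materialize the same result into a wide table once,
# so reports read precomputed columns instead of re-running the CASEs.
conn.execute(f"CREATE TABLE facts_wide AS {pivot_sql}")
wide = conn.execute("SELECT * FROM facts_wide ORDER BY id").fetchall()
print(wide)
```

With option 1 the 600 CASE evaluations and the GROUP BY over the tall table are paid on every report run; with option 2 they are paid once at load time, at the cost of storing and refreshing the wide table.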
I appreciate your response.
Arvind.