SQL Performance Problem
Hi,
I have the following problem: our Oracle database contains a table with more than 200 columns. The table holds about 1.5 million records, so overall there are roughly 250 million fields in this table.
Now we want to convert the data from this table into a very simple
XML file. Each record becomes one item in the XML file, and the
column names become the element names. The resulting file is about
350 MB, and generating it takes about 1.5 hours. In my opinion the
creation of this file has to be faster; writing a 350 MB file simply
shouldn't take 90 minutes.
Currently we generate the XML file with a simple Perl script that
executes a full table scan on this table (select * from table) and
iterates over every field of every record. The key point is that in
each record most of the fields are empty, so nothing is written to
the XML file for them, which is why the final file is fairly small.
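For illustration, here is a stripped-down sketch of what our script does; the connection details, the table name big_table, and the plain per-row output are placeholders, not our real code:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder DSN and credentials -- substitute your own.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'scott', 'tiger',
                       { RaiseError => 1 });

my $sth = $dbh->prepare('SELECT * FROM big_table');
$sth->execute;

# Column names (lower-cased), in the same order as the select list.
my $cols = $sth->{NAME_lc};

print "<items>\n";
while (my $row = $sth->fetchrow_arrayref) {
    print "  <item>\n";
    for my $i (0 .. $#$row) {
        my $val = $row->[$i];
        # Skip NULL/empty fields entirely -- this is what keeps the file small.
        next unless defined $val && $val ne '';
        # Minimal XML escaping for element content.
        $val =~ s/&/&amp;/g;
        $val =~ s/</&lt;/g;
        $val =~ s/>/&gt;/g;
        print "    <$cols->[$i]>$val</$cols->[$i]>\n";
    }
    print "  </item>\n";
}
print "</items>\n";
$dbh->disconnect;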
Is there any possibility to speed up this conversion to XML? Is
there perhaps a select statement that excludes the NULL columns?
For example:
col1, col2, col3, col4
  22,   99,     ,    2
   2,     ,     ,
If I do a full table scan in this scenario, I have to iterate over
all 8 fields, even though only 4 of them contain data. Is there a
way to select only the fields that are not empty?
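One idea I have come across is Oracle's SQL/XML functions: as far as I understand, XMLFOREST creates an element only for arguments that are not NULL, so empty columns would never even reach the client. A hedged sketch, again with placeholder names (mydb, big_table, col1..col4 standing in for the real 200+ column list, which could be generated from user_tab_columns):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# LongReadLen must be large enough for the biggest per-row XML fragment (CLOB).
my $dbh = DBI->connect('dbi:Oracle:mydb', 'scott', 'tiger',
                       { RaiseError => 1, LongReadLen => 1024 * 1024 });

# XMLFOREST omits NULL arguments, so only populated columns become elements.
my $sql = q{
    SELECT XMLELEMENT("item",
               XMLFOREST(col1, col2, col3, col4)
           ).getClobVal()
    FROM   big_table
};
my $sth = $dbh->prepare($sql);
$sth->execute;

print "<items>\n";
while (my ($xml) = $sth->fetchrow_array) {
    print $xml, "\n";   # one <item>...</item> fragment per row
}
print "</items>\n";
$dbh->disconnect;

Whether this is actually faster than the Perl loop I don't know; it moves the NULL filtering and XML generation into the database, but the 350 MB still has to come across the wire.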
Thanks for your help,
Anton
Received on Mon Mar 27 2006 - 07:24:48 CST