Re: faster way to create XML export

From: Jeremy Schneider <jeremy.schneider_at_ardentperf.com>
Date: Tue, 2 Aug 2011 19:43:36 -0500
Message-ID: <CA+fnDAatW3TUskz8FXY-AUnJdCwOa1EaWJm0FZNG9Dzr-c3iig_at_mail.gmail.com>



We just finished a test run with DBMS_XMLGEN on the same dataset -- it finished in 2 sec. We still used dbms_xslprocessor.clob2file for the file writing. It looks like this will do most everything I mentioned; we're just working on the last step now, getting the XSLT headers added. (There seems to be a function for this, but it's not well documented and it didn't work on our first try...)
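In case it helps anyone searching the archives later, here's roughly the shape of the PL/SQL we ended up with. This is just a sketch: the query, the EXPORT_DIR directory object, the file name, and the stylesheet URL are all made up for illustration, and since we haven't gotten a built-in stylesheet-header call working, the xml-stylesheet processing instruction is just spliced in by hand:

DECLARE
  ctx     DBMS_XMLGEN.ctxHandle;
  out_xml CLOB;
  pi_pos  INTEGER;
BEGIN
  -- Build the context from the export query. Dates are formatted with TO_CHAR
  -- right in the query, since DBMS_XMLGEN doesn't appear to have a
  -- setDateFormat equivalent.
  ctx := DBMS_XMLGEN.newContext(
           'SELECT empno, ename,
                   TO_CHAR(hiredate, ''YYYY-MM-DD"T"HH24:MI:SS'') AS hiredate
            FROM   emp');

  DBMS_XMLGEN.setRowTag(ctx, 'EMP_ROW');              -- per-row element name
  DBMS_XMLGEN.setRowSetTag(ctx, 'EMP_ROW_MYSUFFIX');  -- enclosing rowset element name
  DBMS_XMLGEN.setMaxRows(ctx, 10000);                 -- rows per file; call getXML again for the next chunk

  out_xml := DBMS_XMLGEN.getXML(ctx);                 -- one chunk as a CLOB, prolog included

  -- No working stylesheet-header call yet, so splice the PI in by hand,
  -- right after the <?xml ...?> declaration that getXML already emits.
  pi_pos  := INSTR(out_xml, '?>') + 2;
  out_xml := SUBSTR(out_xml, 1, pi_pos - 1)
          || CHR(10) || '<?xml-stylesheet type="text/xsl" href="my_stylesheet.xsl"?>'
          || SUBSTR(out_xml, pi_pos);

  -- EXPORT_DIR is a hypothetical directory object
  DBMS_XSLPROCESSOR.clob2file(out_xml, 'EXPORT_DIR', 'export_0001.xml');

  DBMS_XMLGEN.closeContext(ctx);
END;
/

The one piece we haven't had to map is keepCursorState(true): with setMaxRows, the DBMS_XMLGEN context seems to keep the query state between getXML calls until you close it, which appears to cover the multi-file split.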

Also, I noticed something in the Oracle docs here: http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14252/adx_j_xsu.htm#sthref546

"Note: For increased performance, consider using DBMS_XMLGen and DBMS_XMLStore as alternatives to DBMS_XMLQuery and DBMS_XMLSave. The former packages are written in C and are built in to the database kernel. You can also use SQL/XML functions such as XML_Element for XML access in the database."

No kidding - way faster! Wish I'd seen this before! :)
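For completeness, the SQL/XML route the note mentions looks something like this; the table, column, and element names here are only for illustration:

SELECT XMLELEMENT("employee",
         XMLATTRIBUTES(e.empno AS "id"),
         XMLFOREST(e.ename AS "name",
                   TO_CHAR(e.hiredate, 'YYYY-MM-DD"T"HH24:MI:SS') AS "hired"))
FROM   emp e;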

-Jeremy

On Tue, Aug 2, 2011 at 6:34 PM, Jeremy Schneider <jeremy.schneider_at_ardentperf.com> wrote:

> Oh yeah, we also tested with the DBMS_XMLQuery package - it seemed to take
> about as long as the Oracle Serializer java code.
>
>
> On Tue, Aug 2, 2011 at 6:19 PM, Jeremy Schneider <
> jeremy.schneider_at_ardentperf.com> wrote:
>
>> Well, we're not making any 1GB files yet, but I can't really imagine what
>> anyone does with a 400MB file either. Anyway - we were doing some testing
>> with a much smaller "test" dataset (8MB file), and we tried Marco's idea.
>> Here are the results:
>>
>> Original Code: 96 sec
>> Oracle Serializer: 40 sec
>> Marco's Code: 3 sec
>>
>> :)
>>
>> Sweet! However, it seems that there's an implicit conversion from XMLType
>> to CLOB and it's not immediately obvious how to control the XML that's
>> generated. I guess it's time for me to learn some more about XMLDB. :-/
>> Here's what we're trying to figure out how to do with XMLType:
>>
>> - xmlQuery.keepCursorState(true);
>> - xmlQuery.setMaxRows(pRowLimit_variable); // Note: we continue with
>> cursor and split output to multiple files
>> - xmlQuery.setRowTag(pRowHeader_variable);
>> - xmlQuery.setRowsetTag(pRowHeader_variable+"_MYSUFFIX");
>> - xmlQuery.setDateFormat("yyyy-MM-dd'T'HH:mm:ss");
>> - xmlQuery.setStylesheetHeader(pXSLtURI_variable);
>>
>> Any tips would be appreciated...
>>
>> -J
>>
>>
>> PS - this blog post by Marco is excellent:
>> http://www.liberidu.com/blog/?p=369
>>
>>
>> On Tue, Aug 2, 2011 at 4:41 PM, Marco Gralike <Marco.Gralike_at_amis.nl> wrote:
>>
>>> ONE 1+ GB XML document, are you serious?
>>> Wouldn't be surprised though. I see it more and more often. ;-)
>>>
>>> No, the stuff shown below is in C and/or even part of the C kernel. It is
>>> possible to do in SQL, and yes, because it is one XML document you will
>>> get into trouble regarding those mentioned sizes per XML document. I am
>>> guessing, although I never really checked, that if you switched it again
>>> to a C-based SAX serializer it would be even faster, or does that reside
>>> in the JVM database kernel?
>>>
>>> I wonder which part picks up those big XML documents and tries to do
>>> something useful with them. The work to be done with those documents
>>> afterwards must be very resource intensive ;-)
>>>
>>
>
>
> --
> http://www.ardentperf.com
> +1 312-725-9249
>
> Jeremy Schneider
> Chicago
>

-- 
http://www.ardentperf.com
+1 312-725-9249

Jeremy Schneider
Chicago
