performance issue on fetching values from an xml file stored into a CLOB column in DB (merged 2 threads) [message #198526] Tue, 17 October 2006 09:04
keshavpradeep
Messages: 3
Registered: October 2006
Junior Member
Hi,

I have loaded an XML file into a CLOB column in the database.
One of the important elements in this file is the following ARTICLE tag:
<ARTICLE mode="new">
<SUPPLIER_AID>1234</SUPPLIER_AID>
<ARTICLE_DETAILS>
<DESCRIPTION_SHORT>short</DESCRIPTION_SHORT>
<DESCRIPTION_LONG>long desc</DESCRIPTION_LONG>
<EAN>ad123</EAN>
<SUPPLIER_ALT_AID>454AD</SUPPLIER_ALT_AID>
....
....
....
</ARTICLE_DETAILS>
.....
.....
</ARTICLE>

This ARTICLE tag repeats throughout the XML file.
Under each ARTICLE tag there are roughly 125 child tags, and the values of those 125 child tags together make up the data for one item.

My requirement is to fetch the data from these 125-odd child tags and store them in 125-odd columns of a target table.

In other words, all the values under one ARTICLE tag go into the target table (which has 125-odd columns) as one record;
so if there are 50 ARTICLE tags, 50 records should be inserted into the target table.

I am able to select the data (for insertion into the target table) from the CLOB column as follows:
select extractValue(value(i), '/ARTICLE/SUPPLIER_AID/text()') supid,
       extractValue(value(i), '/ARTICLE/ARTICLE_DETAILS/DESCRIPTION_SHORT/text()') shortdesc,
       extractValue(value(i), '/ARTICLE/ARTICLE_DETAILS/DESCRIPTION_LONG/text()') longdesc,
       .
       .
from   xxx,
       table(xmlsequence(extract(xmltype(xxx.column), '/TYPEA/CATALOG/ARTICLE'))) i

This selection and insertion into the target table takes around 10 minutes for an XML file with 750 ARTICLE tags (items).
But now I have an XML file with around 200,000 ARTICLE tags (i.e. data for 200,000 items), and it takes a huge amount of time.

Is there any way to tune the SQL above for this volume of data?
Also, is there any other approach that would let me do this in a better fashion?
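
For reference, one alternative I have been looking at is XMLTABLE (available from Oracle 10gR2), which projects all the columns in a single pass over each ARTICLE node instead of one extractValue call per column. This is only a sketch: the source table xxx and the target column names and datatypes are placeholders that would have to match the real tables.

```sql
-- Sketch only: single-pass extraction with XMLTABLE (Oracle 10gR2+).
-- Table names, column names and datatypes are placeholders.
INSERT INTO target_table (supid, shortdesc, longdesc /* ... ~125 columns */)
SELECT x.supid, x.shortdesc, x.longdesc
FROM   xxx,
       XMLTABLE('/TYPEA/CATALOG/ARTICLE'
                PASSING xmltype(xxx.column)
                COLUMNS
                  supid     VARCHAR2(30)   PATH 'SUPPLIER_AID',
                  shortdesc VARCHAR2(200)  PATH 'ARTICLE_DETAILS/DESCRIPTION_SHORT',
                  longdesc  VARCHAR2(2000) PATH 'ARTICLE_DETAILS/DESCRIPTION_LONG'
                  -- ... remaining child tags mapped the same way
               ) x;
```

Storing the document in an XMLType column instead of a plain CLOB might also help, since xmltype(xxx.column) currently has to re-parse the entire CLOB into a DOM before any extraction can start.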

Any help or suggestions in this regard is greatly appreciated.

performance issue on fetching values from an xml file stored into a CLOB column in DB [message #199131 is a reply to message #198526] Fri, 20 October 2006 06:16
keshavpradeep
Messages: 3
Registered: October 2006
Junior Member