Re: Updating Relational Data with XML
LineVoltageHalogen wrote:
> Daniel, first of all, thanks for taking the time to answer my question.
> We are developing a new software application in-house. This software
> generates revenue schedules, and for some of our larger clients that
> could mean up to 100,000 records per operation, several times a day.
> Our software is built completely around .NET and SOA, and the database
> server (we support SQL Server and Oracle on the back end) will most
> likely not live on the same box as our application. Our
> goal is to avoid having to make N database calls for N revenue
> schedules, and we have done a POC that clearly shows that when this
> many records go into the database (inserts only, across multiple
> tables), the overhead of opening and re-opening the connection to call
> a procedure or package (in the case of an Oracle back end) is large.
> That, coupled with network latency, could push the process outside
> the defined performance requirements. Passing one large CLOB or text
> object to a package or procedure would remove all that latency (network
> and connection), and the overhead of parsing the XML is far less, which
> would save much time. For example, our app did a simple insert
> (three tables involved: Rev Header, Rev Detail, Account Detail) of
> 20,000 schedules, and it took about 12 minutes to complete. I ran a
> trace on the database server and found that the average server time per
> schedule operation was only 3 ms, which accounts for only about 1 minute
> of the 12 minutes of total processing time measured from the application
> end.
>
> Since I posted yesterday, I have figured out how to do the inserts with
> Oracle (I already understand how to do it with MSSQL) using XPath when
> the insert involves multiple tables.
>
> If you have first-hand knowledge of this type of problem, I would
> appreciate any words of wisdom that you may impart.
>
> TFD
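
As a rough sketch of the XPath approach described above (not the
poster's actual code): one procedure can accept the whole batch as a
single CLOB and shred it into the three tables in one call. The table
names, element names, and document shape here are assumptions made for
the sketch.

CREATE OR REPLACE PROCEDURE load_schedules (p_xml IN CLOB) IS
  v_doc XMLTYPE := XMLTYPE(p_xml);
BEGIN
  -- One INSERT ... SELECT per target table; the whole batch is a single
  -- round trip and a single transaction.
  INSERT INTO rev_header (schedule_id, client_id)
  SELECT EXTRACTVALUE(VALUE(s), '/schedule/@id'),
         EXTRACTVALUE(VALUE(s), '/schedule/client')
  FROM   TABLE(XMLSEQUENCE(EXTRACT(v_doc, '/schedules/schedule'))) s;

  -- Assumes each <line> element repeats its parent schedule id as an
  -- attribute so detail rows can be tied back to the header.
  INSERT INTO rev_detail (schedule_id, period_no, amount)
  SELECT EXTRACTVALUE(VALUE(l), '/line/@schedule_id'),
         EXTRACTVALUE(VALUE(l), '/line/period'),
         EXTRACTVALUE(VALUE(l), '/line/amount')
  FROM   TABLE(XMLSEQUENCE(EXTRACT(v_doc, '/schedules/schedule/line'))) l;
END load_schedules;
/
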
You've already solved the problem, but here is the solution I helped
implement at one company with what I think was a similar issue:
SqlStr :=
  'BEGIN
     INSERT INTO ... ;
     INSERT INTO ... ;
     UPDATE ... ;
     INSERT INTO ... ;
     DELETE ... ;
     UPDATE ... ;
     INSERT INTO ... ;
   END;';
I then passed SqlStr to the back end and processed it using DBMS_SQL,
which has no problem handling a 4 GB transaction.
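
For illustration only, the receiving side of that approach might look
roughly like the sketch below. The procedure name and calling convention
are assumptions: the client is assumed to split the generated block into
a DBMS_SQL.VARCHAR2A collection, because the multi-piece overload of
DBMS_SQL.PARSE lets the statement grow far beyond a single 32K VARCHAR2
(on older releases the VARCHAR2S type plays the same role).

CREATE OR REPLACE PROCEDURE run_batch (
  p_sql    IN DBMS_SQL.VARCHAR2A,  -- generated block, split into pieces
  p_pieces IN PLS_INTEGER)         -- number of pieces in the collection
IS
  c  INTEGER;
  rc INTEGER;
BEGIN
  c := DBMS_SQL.OPEN_CURSOR;
  -- The multi-piece overload of PARSE accepts the statement as a table
  -- of strings, so the anonymous block is not limited to 32K.
  DBMS_SQL.PARSE(c, p_sql, 1, p_pieces, FALSE, DBMS_SQL.NATIVE);
  rc := DBMS_SQL.EXECUTE(c);
  DBMS_SQL.CLOSE_CURSOR(c);
END run_batch;
/

Either way, one generated block or one XML document, the batch costs one
network round trip instead of one call per schedule.
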
--
Daniel A. Morgan
University of Washington
damorgan_at_x.washington.edu
(replace 'x' with 'u' to respond)

Received on Wed Apr 27 2005 - 13:22:03 CDT