Re: Allocating memory for Bulk Inserts

From: Jan Gelbrich <j_gelbrich_at_westfalen-blatt.de>
Date: Mon, 5 Jan 2004 10:03:37 +0100
Message-ID: <btb9en$51mc1$1_at_ID-152732.news.uni-berlin.de>


"oidba" <oidba_at_yahoo.com> wrote in message news:e61db2dc.0312312141.7a72e1d6_at_posting.google.com...
> Is there an alternative to pro*c arrays for bulk inserts. ....

> SQL*LOADER is not an option

I am sorry: why not?

> since there is a lot of data cleaning/manipulation that goes on
> prior to inserting in addition to the transaction control required for
> each file received.

Even so, I would load all the data into staging tables (à la FLAT_myTable1 and so on)
and then do all manipulations
inside the DB in a stored procedure (PL/SQL). That way it is also easier *and* safer to do test runs before writing into the real tables - any garbage could be in the raw data. All you need to know is some PL/SQL
and the data model of your tables.
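A minimal sketch of that staging-table approach (all table and column names here - FLAT_MYTABLE1, MYTABLE1, CUST_NO etc. - are hypothetical, just to illustrate the idea):

```sql
-- Hypothetical staging table, loaded 1:1 from the flat file.
-- All columns are VARCHAR2 so that dirty raw data can never
-- break the load itself; cleaning happens afterwards in PL/SQL.
CREATE TABLE flat_mytable1 (
  cust_no   VARCHAR2(20),
  amount    VARCHAR2(30),
  load_date VARCHAR2(30)
);

-- Cleaning/moving procedure: all manipulation happens inside the DB,
-- so you can test it against the staging data before touching the
-- real table, and commit once per loaded file.
CREATE OR REPLACE PROCEDURE move_mytable1 IS
BEGIN
  INSERT INTO mytable1 (cust_no, amount, load_date)
  SELECT TRIM(cust_no),
         TO_NUMBER(TRIM(amount)),
         TO_DATE(TRIM(load_date), 'YYYY-MM-DD')
  FROM   flat_mytable1
  WHERE  cust_no IS NOT NULL;   -- reject obviously broken rows
  COMMIT;                       -- transaction control per file
END;
/
```

If a run goes wrong you simply fix the procedure and rerun it against the staging table - the raw data is still there, untouched.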

>
> Basically, I am looking for a feature that allows me to allocate
> memory as I go, something like a linked list.

I don't see any need for that anymore.
Loading into staging tables plus some PL/SQL should do everything you seem to want.

hth, Jan
