
Re: What is the fastest way to merge two tables?

From: Billy Verreynne <vslabs@onwe.co.za>
Date: Thu, 5 Nov 1998 09:30:08 +0200
Message-ID: <71rkac$5hc$1@hermes.is.co.za>


Minh Giang wrote in message <3640FE4A.C2857457@fast.net>...

>I don't know how often you insert new data to this huge table and how much
>historical data you keep.
>If it's a monthly update, you can create new table every month and sqlload
>directly into this table. Then create or update a view that points to
>these tables.

Partitioned tables are also not a bad idea, if implemented and used correctly.
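
As a rough sketch of that monthly-table-plus-view idea (all table, view and login names below are made up for illustration, not taken from the post), the view simply UNION ALLs the monthly tables, and each new month is loaded into its own table before the view is recreated:

# sketch only: monthly tables glued together by a view; names are placeholders
sqlplus -s scott/tiger <<'EOF'
-- each month gets its own table, loaded directly with SQL*Loader,
-- then the view is recreated to include the new table
CREATE OR REPLACE VIEW sales_all AS
  SELECT * FROM sales_1998_10
  UNION ALL
  SELECT * FROM sales_1998_11;
EOF

Queries then go against SALES_ALL while each monthly load only ever touches its own table.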

There's also another option that I did not mention: using SQL*Loader. It's a bit complex the way we used to do it, and it does not use standard SQL. Basically, you spool the data from the source table to a flat file and then append the data in that file to the destination table using SQL*Loader.
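
As a minimal sketch of that flat file approach (the control file, table names and scott/tiger login are placeholders, not the ones we actually used), it boils down to a spool script plus an APPEND load:

# spool SOURCE_TAB to a flat file...
sqlplus -s scott/tiger <<'EOF'
set pagesize 0 feedback off heading off trimspool on linesize 200
spool /tmp/source_tab.dat
select col1 || ',' || col2 from source_tab;
spool off
EOF

# ...then append the file to DEST_TAB with SQL*Loader
# loader.ctl contains something like:
#   load data
#   infile '/tmp/source_tab.dat'
#   append into table dest_tab
#   fields terminated by ','
#   (col1, col2)
sqlldr scott/tiger control=loader.ctl log=loader.log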

Firstly, simply spooling the data out is problematic if you have limited file system space and the source table is large. Secondly, there's the time factor. A spool followed by a SQL*Loader run can be fairly slow.

We finally solved both these problems as follows.

Drop the indexes and disable the constraints on the destination table. Create a FIFO pipe. Fire up SQL*Loader with a direct and parallel load, with the pipe as the input file. Now fire up SQL*Plus and spool the data into the pipe. When SQL*Loader has completed, re-enable the constraints and add the indexes back to the destination table.
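
In shell terms the whole thing looks roughly like this (again, names, credentials and the control file are placeholders; the point is the ordering: start the reader first, then the writer):

# create the pipe; the .dat extension stops SQL*Plus adding .lst to the spool name
mkfifo /tmp/load_pipe.dat

# start SQL*Loader in the background, reading the pipe with a direct, parallel load
sqlldr scott/tiger control=loader.ctl data=/tmp/load_pipe.dat \
       direct=true parallel=true log=loader.log &

# now spool the source table into the pipe
sqlplus -s scott/tiger <<'EOF'
set pagesize 0 feedback off heading off trimspool on linesize 200
spool /tmp/load_pipe.dat
select col1 || ',' || col2 from source_tab;
spool off
EOF

wait    # for the background SQL*Loader to finish
# re-enable constraints and recreate the indexes on DEST_TAB here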

With this method, you can move fairly large amounts of data in Oracle pretty quickly. As soon as a byte is written by the spool process into the pipe, it's loaded by SQL*Loader.

Digressing from the original question, but this method also works fine for moving large quantities of data across the network. Let's say you have a legacy system that produces a couple of gigs every so often for loading into an Oracle data warehouse. You simply use a pipe and FTP the data from the legacy system into the pipe. On the other side of the pipe you can have an EBCDIC-to-ASCII conversion running, reading from the FTP pipe and putting the converted data into an ASCII pipe. On the other side of the ASCII pipe you've got SQL*Loader running, reading the pipe and loading the data into Oracle using direct and parallel loading.
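
A hedged sketch of that chain (host, file names, login details and the dd-based conversion step are all just assumptions for the example; any EBCDIC-to-ASCII converter that reads one pipe and writes the other will do):

# two pipes: one fed by FTP, one fed by the converter
mkfifo /tmp/ftp_pipe /tmp/ascii_pipe

# 1. EBCDIC-to-ASCII conversion reads the FTP pipe and writes the ASCII pipe
dd if=/tmp/ftp_pipe of=/tmp/ascii_pipe conv=ascii &

# 2. SQL*Loader reads the ASCII pipe and direct loads into the warehouse table
sqlldr scott/tiger control=loader.ctl data=/tmp/ascii_pipe \
       direct=true log=loader.log &

# 3. FTP the legacy extract straight into the FTP pipe - no staging file needed
ftp -n legacyhost <<'EOF'
user loaduser loadpass
binary
get /extracts/monthly.dat /tmp/ftp_pipe
quit
EOF

wait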

The nice thing about this method is that it works, it is fast, and you do not have to fork out money for expensive data-pump-type products to get the data into Oracle.

Now try and do this with NT... <deep sigh>

regards,
Billy
