
Re: INSERT into HUGE table

From: <hermannkall_at_my-deja.com>
Date: 2000/08/10
Message-ID: <8muga8$mus$1@nnrp1.deja.com>#1/1

Hi Roger,

use SQL*Loader in conventional path mode: any record that violates the primary-key constraint is rejected into the bad file while the rest load normally.
Make sure that BINDSIZE and ROWS are set appropriately, otherwise the conventional path commits after every 64 rows by default.
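To make that concrete, a minimal sketch only; the file, table and column names (new_rows.dat, load_big_table.ctl, big_table, id/col1/col2) and the scott/tiger login are placeholders, and the parameter values are just starting points to tune for your system:

  -- load_big_table.ctl  (hypothetical control file)
  -- APPEND adds the new rows to the existing table data
  LOAD DATA
  INFILE 'new_rows.dat'
  APPEND
  INTO TABLE big_table
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  (id, col1, col2)

  sqlldr userid=scott/tiger control=load_big_table.ctl log=load_big_table.log bad=load_big_table.bad rows=1000 bindsize=10000000

With ROWS and BINDSIZE raised like this the conventional path commits roughly every 1000 rows instead of every 64, and each row that collides with an existing primary key goes to the .bad file instead of failing the whole load.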

cheers
Hermann

In article <39927C92.9F93ADE1_at_alewijnse.nl>,   Roger Palmen <r.palmen.ani_at_alewijnse.nl> wrote:
> Hi all,
>
> I'm trying to find the most efficient way to insert large numbers of
> records (let's say 100K a day) into a single huge table (100M records).
> However, the problem is that the set of records to insert contains
> records already in the target table. These duplicate records can either
> be overwritten or ignored.
>
> A simple but effective INSERT statement fails if even one record
> violates the PK constraint. What is the most efficient way to do this?
>
> - Export / SQL-loader?
> - Delete rows from source?
> - Use PL/SQL and cursor to insert one record at a time?
> - Outer join with target?
> - etc?
>
> Using Oracle 8.1.6 on NT.
>
> Thanks for advice.
> Please reply to newsgroup, or pe1pvy_at_amsat.org NOT the sender.
>
>
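For completeness, the "outer join with target" idea from the list above can also be done as a set-based anti-join, which works on 8.1.6 (MERGE did not exist yet in that release). This is only a sketch: new_rows is an assumed staging table with the same columns as the target, and id stands in for whatever the real primary key is.

  -- insert only the rows whose key is not already in the target
  INSERT INTO big_table
  SELECT s.*
  FROM new_rows s
  WHERE NOT EXISTS (
    SELECT 1
    FROM big_table t
    WHERE t.id = s.id
  );
  COMMIT;

Whether this beats the SQL*Loader route depends mostly on how cheaply the NOT EXISTS probe can use the primary-key index on the 100M-row table.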

Received on Thu Aug 10 2000 - 00:00:00 CDT
