Re: SQL-Loader question, pls

From: Matteo R. diTommaso <ditommaso_at_ebi.ac.uk>
Date: 1995/12/14
Message-ID: <NEWTNews.818961870.12922.tommaso_at_juliet.ebi.ac.uk>#1/1


In article <4al62d$4k2_at_news.texas.net>, <gcherer_at_millenium.texas.net> writes:
> Path: ebi.ac.uk!hgmp.mrc.ac.uk!sunsite.doc.ic.ac.uk!demon!peer-news.britain.eu.net!EUnet!howland.reston.ans.net!newsfeed.internetmci.com!news.texas.net!millenium.texas.net!gcherer
> From: gcherer_at_millenium.texas.net (GT Cherer)
> Newsgroups: comp.databases.oracle
> Subject: SQL-Loader question, pls
> Date: 13 Dec 1995 00:17:17 GMT
> Organization: Texas Networking, Inc.
> Lines: 17
> Message-ID: <4al62d$4k2_at_news.texas.net>
> NNTP-Posting-Host: millenium.texas.net
> X-Newsreader: TIN [version 1.2 PL2]
>
> dude(tte)s;
> Situation is that every now and then we'll get a request to load an
> extraordinary amount of data into some tables. All kinds of gotchas occur
> that stop the load in its tracks - comm errors, dirty data, space, etc.
> The standard routine is to truncate the tables, turn off foreign keys,
> load the data, turn the keys on. Any suggestions on how a bunch of real
> rookies can change this procedure to perhaps break the input files into
> smaller chunks and not deviate too much from this routine? It's the
> truncating that is the first hill to climb.
>
> Thanxalot for info. I've been into this stuff for two weeks and can use
> any kind of help you can offer. preeeeeeeeeeeeeeciate it.
>
> --
> G.T. Jeff Cherer gcherer_at_texas.net
> Voice: 210-532-7524 SnailMail: 1132 Vanderbilt St. 78210
> "Comfort the afflicted: Afflict the comfortable."

If I understand the problem - you want to be able to load the data without having to restart from scratch when an error causes the job to fail.

Use two tables: load into a temporary (staging) table, and once a chunk of data has loaded successfully, move it to the permanent table. Then truncate the temporary table and restart the load at the point of failure. That way the permanent table is never left half-loaded, and you only ever re-run the chunk that failed.
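As a sketch (table, file, and user names here are made up for illustration), the two-table routine might look like this:

```sql
-- Hypothetical staging table with the same shape as the permanent
-- table EMP, created empty:
CREATE TABLE emp_stage AS SELECT * FROM emp WHERE 1 = 0;

-- Load each chunk into the staging table with SQL*Loader, e.g.:
--   sqlldr userid=scott/tiger control=emp_stage.ctl log=emp_stage.log
-- Dirty rows go to the .bad file instead of killing the whole job,
-- and if a load dies partway, you can rerun it with SKIP=n (n = rows
-- already loaded, reported in the log) rather than starting over.

-- Once a chunk has loaded cleanly, move it into the permanent table:
INSERT INTO emp SELECT * FROM emp_stage;
COMMIT;

-- Empty the staging table for the next chunk. A failed load never
-- touches EMP, so its foreign keys can stay enabled and EMP itself
-- never needs to be truncated.
TRUNCATE TABLE emp_stage;
```

Since only the staging table is truncated between chunks, this keeps close to the original routine while removing the "truncate the real tables" step that was the first hill to climb.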

ditommaso_at_ebi.ac.uk Received on Thu Dec 14 1995 - 00:00:00 CET
