The situation is that every now and then we'll get a request to load an
extraordinary amount of data into some tables. All kinds of gotchas stop
the load in its tracks - comm errors, dirty data, running out of space, etc.
The standard routine is to truncate the tables, turn off the foreign keys,
load the data, and turn the keys back on. Any suggestions on how a bunch of
real rookies could change this procedure - perhaps breaking the input files
into smaller chunks - without deviating too much from the routine? It's the
truncating that is the first hill to climb.
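To give an idea of the kind of chunking we're picturing, here's a rough sketch (not what we actually run - the file name, chunk size, and helper names are all made up for illustration) that splits one big flat file into smaller numbered pieces, so a failed load would only cost us one chunk instead of the whole file:

```python
# Hypothetical sketch: split a big input file into smaller chunk files
# so a failure mid-load (comm error, dirty data, space) only loses one
# chunk. All names here (chunk_%04d.dat, lines_per_chunk) are invented.
import os

def split_file(path, lines_per_chunk, out_dir):
    """Write the lines of `path` into numbered files in `out_dir`,
    each holding at most `lines_per_chunk` lines; return their paths."""
    os.makedirs(out_dir, exist_ok=True)
    chunks = []
    buffered = []
    with open(path) as src:
        for line in src:
            buffered.append(line)
            if len(buffered) == lines_per_chunk:
                chunks.append(_write_chunk(out_dir, len(chunks), buffered))
                buffered = []
    if buffered:  # last, possibly short, chunk
        chunks.append(_write_chunk(out_dir, len(chunks), buffered))
    return chunks

def _write_chunk(out_dir, index, lines):
    # Zero-padded names keep the chunks in load order when globbed.
    chunk_path = os.path.join(out_dir, "chunk_%04d.dat" % index)
    with open(chunk_path, "w") as dst:
        dst.writelines(lines)
    return chunk_path
```

The idea would then be to loop over the chunks, loading each one and noting which ones succeeded, so a restart can skip the chunks that are already in.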
Thanks a lot for any info. I've only been into this stuff for two weeks and
can use any kind of help you can offer. 'Preciate it.
--
G.T. Jeff Cherer gcherer_at_texas.net
Voice: 210-532-7524 SnailMail: 1132 Vanderbilt St. 78210
"Comfort the afflicted: Afflict the comfortable."