External Table - BADFILE overwrites every time. [message #236102]
Wed, 09 May 2007 01:03
Brayan
Messages: 315 Registered: June 2002
Senior Member
Hi,
I have a module which loads data from flat (CSV) files. We are using external tables for this purpose.
Following is a sample of the code:
:
:
ORGANIZATION EXTERNAL
(
TYPE ORACLE_LOADER
DEFAULT DIRECTORY EXT_CBS_DIR
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE
BADFILE BOIDW_BAD_DIR:'bad_cbs_SAVINGS_ext'
LOGFILE BOIDW_LOG_DIR:'log_cbs_SAVINGS_ext'
DISCARDFILE BOIDW_DISC_DIR:'disc_cbs_SAVINGS_ext'
FIELDS TERMINATED BY ','
RTRIM
MISSING FIELD VALUES ARE NULL
:
:
And following is a sample of how we load the data. The statements below run in a loop (a sketch of that loop follows them):
ALTER TABLE EXT_TABLE LOCATION('file1.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
ALTER TABLE EXT_TABLE LOCATION('file2.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
ALTER TABLE EXT_TABLE LOCATION('file3.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
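For reference, a minimal sketch of how those statements could run as a single PL/SQL loop (table names taken from the post above; the loop bounds and file names are illustrative):
BEGIN
  FOR i IN 1 .. 3 LOOP
    -- ALTER TABLE is DDL, so it has to go through dynamic SQL inside PL/SQL
    EXECUTE IMMEDIATE
      'ALTER TABLE ext_table LOCATION (''file' || i || '.csv'')';
    INSERT INTO table1 SELECT * FROM ext_table;
  END LOOP;
  COMMIT;
END;
/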
My problem here is that the LOGFILE (log_cbs_SAVINGS_ext) contains the log and error entries for all the files, i.e. file1.csv, file2.csv and file3.csv.
But the BADFILE contains the bad records of the last file only, i.e. file3.csv, whereas I want it to accumulate the bad records of all the files. In other words, every read overwrites the BADFILE.
Can anybody help me?
Brayan.
Re: External Table - BADFILE overwrites every time. [message #236178 is a reply to message #236102]
Wed, 09 May 2007 04:25
Brayan
Messages: 315 Registered: June 2002
Senior Member
Hi,
Quote:
Did you try:
alter table ext_table access parameters (badfile 'new_badfile_each_time');
No, I did not run the above command.
Ross,
According to this document, even if we specify %p it will still overwrite in my case, because in one process I have multiple files to read. So every time I read a file, the bad records in the bad file are overwritten.
Brayan.
Re: External Table - BADFILE overwrites every time. [message #236532 is a reply to message #236178]
Thu, 10 May 2007 03:11
rleishman
Messages: 3728 Registered: October 2005 Location: Melbourne, Australia
Senior Member
Are you going to change the contents of the EOT data file whilst the process is running?
If so, then the same process that replaces the data file should archive the old badfile.
If not, then the BADFILE is just going to contain one copy of the bad rows no matter how many times you read the EOT. Surely this is a good thing.
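A minimal sketch of the "archive the old badfile" idea, assuming the BOIDW_BAD_DIR directory object is writable by the loading schema; the on-disk badfile name may carry a .bad extension, so adjust the names to what the access driver actually writes, and the archive name here is only an example:
BEGIN
  EXECUTE IMMEDIATE 'ALTER TABLE ext_table LOCATION (''file1.csv'')';
  INSERT INTO table1 SELECT * FROM ext_table;
  -- archive this file's rejects before the next load overwrites them
  BEGIN
    UTL_FILE.FRENAME('BOIDW_BAD_DIR', 'bad_cbs_SAVINGS_ext',
                     'BOIDW_BAD_DIR', 'bad_cbs_SAVINGS_ext_file1', TRUE);
  EXCEPTION
    WHEN UTL_FILE.RENAME_FAILED OR UTL_FILE.INVALID_OPERATION THEN
      NULL;  -- no badfile was produced for this load, nothing to archive
  END;
END;
/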
Ross Leishman
Re: External Table - BADFILE overwrites every time. [message #236545 is a reply to message #236102]
Thu, 10 May 2007 04:22
Brayan
Messages: 315 Registered: June 2002
Senior Member
Hi Ross,
Quote:
Are you going to change the contents of the EOT data file whilst the process is running?
No, I'm changing the contents of the EOT (the flat files) while the process is reading data.
Quote:
If so, then the same process that replaces the data file should archive the old badfile.
If not, then the BADFILE is just going to contain one copy of the bad rows no matter how many times you read the EOT. Surely this is a good thing.
That's correct.
But my requirement is to know how many bad records there were in the entire upload process.
E.g.:
ALTER TABLE EXT_TABLE LOCATION('file1.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
ALTER TABLE EXT_TABLE LOCATION('file2.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
ALTER TABLE EXT_TABLE LOCATION('file3.csv');
INSERT INTO TABLE1 SELECT * FROM EXT_TABLE;
Suppose in the above scenario each file (file1.csv, file2.csv, file3.csv) contains 2 bad records. When I run the process I'll get only two records in the BADFILE instead of six.
I understand that this is normal behaviour. Is there a way to do what I want, other than archiving the BADFILE after each CSV file is read?
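For illustration, the per-file BADFILE idea quoted earlier would look roughly like the sketch below. Note that ALTER TABLE ... ACCESS PARAMETERS replaces the whole clause, so everything has to be restated; the names are the ones from my table definition and the per-file badfile name is only an example.
-- give the second file its own badfile before switching LOCATION to it
ALTER TABLE ext_table ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  BADFILE BOIDW_BAD_DIR:'bad_cbs_SAVINGS_file2_ext'
  LOGFILE BOIDW_LOG_DIR:'log_cbs_SAVINGS_ext'
  DISCARDFILE BOIDW_DISC_DIR:'disc_cbs_SAVINGS_ext'
  FIELDS TERMINATED BY ',' RTRIM
  MISSING FIELD VALUES ARE NULL
);
ALTER TABLE ext_table LOCATION ('file2.csv');
INSERT INTO table1 SELECT * FROM ext_table;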
Brayan.