Oracle External File and UTF-8 Query [message #486450]
Wed, 15 December 2010 11:04
mmggmm
Messages: 4  Registered: December 2010
Junior Member
Hi All,
We are looking for ideas as to why we are encountering
an issue with an Oracle external table.
We get the following error when we load a particular
file into this table:
ORA-12801: error signaled in parallel query server P000
ORA-29913: error in executing ODCIEXTTABLEFETCH callout
ORA-30653: reject limit reached
ORA-06512: at "SYS.ORACLE_LOADER", line 52
We have narrowed this down to a text field in the file that
contains the following text (text obfuscated):
XXXXXXXX ł XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
This is nominally 40 characters long, which matches the
maximum field size, but it is being rejected because the
'ł' character appears to make Oracle count the length as
41 instead: in UTF-8, 'ł' (U+0142) is encoded as two
bytes. If I remove an X, the file loads without issue.
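As a quick illustration of the character-versus-byte discrepancy (a sketch only, using the obfuscated value above):

-- 'ł' (U+0142) is one character but two bytes in UTF-8, so the byte
-- length comes out one greater than the character length.
-- Illustrative only; assumes 'ł' is the only multi-byte character.
SELECT LENGTH('XXXXXXXX ł XXXXXXXXXXXXXXXXXXXXXXXXXXXXX')  AS char_count,
       LENGTHB('XXXXXXXX ł XXXXXXXXXXXXXXXXXXXXXXXXXXXXX') AS byte_count
FROM   dual;
-- Expected here: char_count = 40, byte_count = 41.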
We tried this in a new schema we created where we added
the same table and used the same file. There was no
problem at all.
The Oracle database has the following settings:
NLS_CHARACTERSET UTF8
NLS_NCHAR_CHARACTERSET UTF8
NLS_LENGTH_SEMANTICS BYTE
NLS_LANGUAGE AMERICAN
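(For reference, these values can be read with a query like the following against the standard NLS_DATABASE_PARAMETERS view:)

-- Confirm the database character set and length semantics.
SELECT parameter, value
FROM   nls_database_parameters
WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET',
                     'NLS_LENGTH_SEMANTICS', 'NLS_LANGUAGE');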
The table is defined as follows:
CREATE TABLE XXXXXXXXXXXXXXXXXXXXXX.XXXXXXXXX_INTERFACE
(
XXXXXXXXXXXXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXXXXX VARCHAR2(40 CHAR),
XXXXXXXXXXXXXXXXXXXXX NUMBER(13),
XXXXXXXXXXXX VARCHAR2(40 CHAR)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY XXXXXXXXXXXXXXXX_L
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  BADFILE XXXXXXXXXXXXXXXX_L:'customer.bad'
  NOLOGFILE
  FIELDS TERMINATED BY "\t" LRTRIM
  MISSING FIELD VALUES ARE NULL
)
LOCATION (XXXXXXXXXXXXXXXX_L:'external.dat')
)
REJECT LIMIT 0
PARALLEL ( DEGREE 4 INSTANCES 1 )
NOMONITORING;
We tried adding the following attributes (placed in the
record format section of the access parameters, as in the
sketch below), but they did not seem to make any difference:
CHARACTERSET UTF8
STRING SIZES ARE IN CHARACTERS
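That is, roughly like this (a sketch; directory and file names are the obfuscated ones from the DDL above):

ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  CHARACTERSET UTF8                 -- character set of the data file
  STRING SIZES ARE IN CHARACTERS    -- count field lengths in characters
  BADFILE XXXXXXXXXXXXXXXX_L:'customer.bad'
  NOLOGFILE
  FIELDS TERMINATED BY "\t" LRTRIM
  MISSING FIELD VALUES ARE NULL
)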
Any assistance would be much appreciated.
Thanks,
Michael
Re: Oracle External File and UTF-8 Query [message #486453 is a reply to message #486450]
Wed, 15 December 2010 11:22
ThomasG
Messages: 3212  Registered: April 2005  Location: Heilbronn, Germany
Senior Member
Quote:
We tried this in a new schema we created where we added
the same table and used the same file. There was no
problem at all.
Hmmmm... Nothing wrong that I can see at first glance. A few questions to perhaps pin down the problem:
What is the difference between the external table that works and the external table that doesn't? Only the schema?
Is the definition you posted the one that works or the one that doesn't?
You get the error doing what, exactly? A straight SELECT from the external table?
Can you create a new external table in the original schema, changing only the name of the data file? And can you add a LOGFILE to the table definition that doesn't work (along the lines of the sketch below)? Perhaps that will make the exact source of the error clearer.
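Something like this, just a sketch (directory alias and log file name are placeholders based on your posted DDL):

ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
  BADFILE XXXXXXXXXXXXXXXX_L:'customer.bad'
  LOGFILE XXXXXXXXXXXXXXXX_L:'customer.log'  -- instead of NOLOGFILE
  FIELDS TERMINATED BY "\t" LRTRIM
  MISSING FIELD VALUES ARE NULL
)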
Re: Oracle External File and UTF-8 Query [message #486457 is a reply to message #486455]
Wed, 15 December 2010 11:45
LKBrwn_DBA
Messages: 487  Registered: July 2003  Location: WPB, FL
Senior Member
mmggmm wrote on Wed, 15 December 2010 12:38
... etc ...
... The first script created the table using bytes for the field lengths, and the second script altered the table to use char for the field lengths.
... etc ...
Then there is a difference after all... So which one works: the byte or the character semantics? Use that one.
Re: Oracle External File and UTF-8 Query [message #486460 is a reply to message #486458]
Wed, 15 December 2010 12:49
ThomasG
Messages: 3212  Registered: April 2005  Location: Heilbronn, Germany
Senior Member
I'm pretty sure it will work once you re-create the external table directly with CHAR semantics.
Of course, you might also want to discuss with the customer whether there could be a problem in the further tables this data will be processed into. If those are all defined with BYTE semantics, you might run into the same problem again when the data gets processed. A query like the sketch below will show which columns still use BYTE semantics.
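Just a sketch (the schema name is a placeholder); CHAR_USED in ALL_TAB_COLUMNS is 'B' for byte semantics and 'C' for character semantics:

-- Find character columns still defined with BYTE semantics.
SELECT table_name, column_name, data_type, char_length, char_used
FROM   all_tab_columns
WHERE  owner = 'SOME_SCHEMA'              -- placeholder schema name
AND    data_type IN ('CHAR', 'VARCHAR2')
AND    char_used = 'B'
ORDER  BY table_name, column_name;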