
Re: NCLOB fields and CLOB.createTemporary does not work

From: Frank van Bortel <fvanbortel_at_netscape.net>
Date: Wed, 27 Apr 2005 14:55:50 +0200
Message-ID: <d4o1tc$q76$1@news5.zwoll1.ov.home.nl>


John Hendrikx wrote:
> Frank van Bortel wrote:
>

>> John Hendrikx wrote:
>>
>>> I'm using Oracle 9i, using the latest thin JDBC Java driver.  I'm trying
>>> to insert an NCLOB into the database.  The recommended way to do this is
>>> to use CLOB.createTemporary() as you can see in the code below.  This
>>> code works fine for CLOB fields, but fails with an 'ORA-12704: character
>>> set mismatch' when the field is NCLOB.
>>
>>
>>
>> I am sure you will not catch an ugly disease from posting
>> the character sets in question?

>
>
> Java uses its standard UCS2 character set for all strings, and our data
> is in that format.
>
> The databases we've tested on are:
>
> Oracle 9i, with a database using WE8MSWIN1252 charset. We found this
> setup is useless, as all international characters will be discarded
> during the translation process, despite using NVARCHAR2/NCLOB fields.
>

Still, that is not a client setting.

The WE8MSWIN1252 means nothing here; NCLOB/NVARCHAR2 fields follow the national character set, which would be either UTF8, or AL16UTF16 (the default).
Besides, I do not understand why you are using NVARCHAR2/NCLOB fields when the standard counterparts would do, too. The reference to WE8MSWIN1252 makes me believe you do not need to store multibyte characters, like Japanese, Chinese or Korean.
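You can verify both from the Java side with a plain JDBC query against NLS_DATABASE_PARAMETERS. A minimal, untested sketch - the connect string and account are made up, replace them with your own:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ShowCharsets {
    public static void main(String[] args) throws Exception {
        // Load the thin driver (needed with pre-JDBC-4 drivers such as 9i/10g).
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Connect string and account are made up for illustration.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");
        Statement stmt = conn.createStatement();
        // NLS_CHARACTERSET governs CHAR/VARCHAR2/CLOB columns,
        // NLS_NCHAR_CHARACTERSET governs NCHAR/NVARCHAR2/NCLOB columns.
        ResultSet rs = stmt.executeQuery(
            "SELECT parameter, value FROM nls_database_parameters " +
            "WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')");
        while (rs.next()) {
            System.out.println(rs.getString(1) + " = " + rs.getString(2));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}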

> Oracle 9i, with a database using the AL32UTF8 charset. This works,
> international characters are retained in both NVARCHAR2 and NCLOB
> fields. NCLOB fields however only work with small sizes. Above 32K
> we're forced to use the CLOB.createTemporary() construction (instead of
> setString() from PreparedStatement) but that results in the problem
> detailed in my initial post.

Strings in Oracle go up to 32K, and only within PL/SQL. Stored strings (VARCHAR2) are 4000 bytes (!) max, or 4 GB if you use LOBs.
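Just to illustrate that boundary: anything past the straight string bind has to go in through a LOB locator. A rough, untested sketch with the oracle.sql classes - the table and column names are made up, and this shows the plain CLOB case, not NCLOB:

import java.sql.Connection;
import java.sql.PreparedStatement;
import oracle.sql.CLOB;

public class InsertLargeClob {
    // Writes 'text' into a made-up table DOCS (ID NUMBER, BODY CLOB).
    public static void insert(Connection conn, int id, String text) throws Exception {
        // Create a session-duration temporary LOB and fill it; a straight
        // setString() bind runs into the 4000-byte SQL limit for long values
        // (with a multibyte character set, the character count is lower still).
        CLOB clob = CLOB.createTemporary(conn, false, CLOB.DURATION_SESSION);
        clob.putString(1, text);
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO docs (id, body) VALUES (?, ?)");
        ps.setInt(1, id);
        ps.setClob(2, clob);          // bind the LOB locator, not the string
        ps.executeUpdate();
        ps.close();
        CLOB.freeTemporary(clob);     // release the temporary LOB
    }
}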

>
> Oracle 10g, with a database using the AL32UTF8 charset. The code
> example works without a problem.
>
> All setups were tested using the latest Oracle 9i and Oracle 10g thin
> JDBC drivers (this includes using the 10g driver on a 9i database and
> vice versa).
>
> Using normal CLOB fields in Oracle 9i also has no problems whatsoever,
> apart from the problem that international characters are discarded.
>

>> Client side, as well as NCLOB.
>>
>> Just a stab in the dark - you are aware of the fact that Java only
>> supports very few character sets?

>
>
> Yes, but I do not see how it is relevant what Java supports, when the
> Oracle database is the one that is unable to convert the standard Java
> character set to one of its own.

I could just as easily state that you do not seem to grasp the concept of character sets within Oracle, and which character set is active in which case.
As I am not very much into Java, I will not.
>
> This is very likely a driver-related problem, as it is able to carry out
> the character set translation correctly when using setString of
> PreparedStatement. This function is however severely limited in the
> size of the NCLOB you can set (this is an Oracle-specific limit; other
> databases do not impose such a limit in their drivers), and you need to
> switch to using streaming functions to use very large CLOBs/NCLOBs.
>
> A temporary table is used by the driver into which the CLOB data is
> streamed. The CLOBs in this table are then referred to in the INSERT
> statement (this makes sense when you are inserting or updating multiple
> records that modify a CLOB field with one statement).
>
> I suspect that the temporary table used to store CLOBs in Oracle 9i
> does not support NCLOBs at all (or perhaps only with a specific
> database character set). In Oracle 10g there's no problem whatsoever.
>
> I hope this clears it up a bit.
>
> --
> John Hendrikx

Here's what I think happens:
- you install, and create a database, using 9i with the WE8MSWIN1252
  character set.
Without knowing the details, I am guessing from here on.
- the environment setting for the client processes (e.g. sqlplus) is set
  to WE8MSWIN1252. This is done by the installer.
- consequence is: any process connecting will have to be converted TO
  WE8MSWIN1252 - a character set *NOT* supported by Java.
- if the above scenario is correct, that would also explain why the
  experiments with AL32UTF8, in either 9i or 10g, work out OK.

Still, you have to understand that NVARCHAR2, NCLOB and NCHAR fields obey the national character set as defined in the 'create database ... national character set xxxxxx' command. All the other character fields obey the 'character set yyyyy' part of the 'create database ... character set yyyyy national character set xxxxxx' command.
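Client side, the JDBC driver has to be told which of the two character sets a bind is meant for. If I remember correctly, the oracle.jdbc extension for that is setFormOfUse() with FORM_NCHAR. A rough sketch (table and column names made up), assuming that call is available in the driver version you use:

import java.sql.Connection;
import java.sql.PreparedStatement;
import oracle.jdbc.OraclePreparedStatement;

public class InsertNString {
    // Inserts into a made-up table DOCS_N (ID NUMBER, BODY NVARCHAR2(2000)).
    public static void insert(Connection conn, int id, String text) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO docs_n (id, body) VALUES (?, ?)");
        // Tell the driver this bind targets the national character set
        // (NCHAR/NVARCHAR2/NCLOB) instead of the database character set.
        ((OraclePreparedStatement) ps).setFormOfUse(2,
            OraclePreparedStatement.FORM_NCHAR);
        ps.setInt(1, id);
        ps.setString(2, text);
        ps.executeUpdate();
        ps.close();
    }
}

Whether that also cures the ORA-12704 on the temporary LOB route under 9i, I cannot say.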

And I would use the CLOB handling routines when working with CLOBs, and the string handling routines when handling VARCHARs. Maybe an experienced Java/Oracle programmer can comment on that?

-- 
Regards,
Frank van Bortel