
Re: NCLOB fields and CLOB.createTemporary does not work

From: John Hendrikx <hjohn_at_please.remove.this.xs4all.nl>
Date: Wed, 27 Apr 2005 14:00:09 +0200
Message-ID: <426f7ecb$0$159$e4fe514c@news.xs4all.nl>


Frank van Bortel wrote:

> John Hendrikx wrote:
>
>>I'm using Oracle 9i with the latest thin JDBC Java driver. I'm trying
>>to insert an NCLOB into the database. The recommended way to do this is
>>to use CLOB.createTemporary() as you can see in the code below. This
>>code works fine for CLOB fields, but fails with an 'ORA-12704: character
>>set mismatch' when the field is NCLOB.
>
>
> I am sure you will not catch an ugly disease from posting
> the character sets in question?

Java uses its standard UCS-2 character set for all strings, and our data is in that format.

The databases we've tested on are:

Oracle 9i, with a database using the WE8MSWIN1252 character set. We found this setup to be useless, as all international characters are discarded during the translation process, despite using NVARCHAR2/NCLOB fields.

Oracle 9i, with a database using the AL32UTF8 character set. This works: international characters are retained in both NVARCHAR2 and NCLOB fields. NCLOB fields, however, only work for small sizes. Above 32K we're forced to use the CLOB.createTemporary() construction (instead of setString() from PreparedStatement), but that results in the problem detailed in my initial post.

Oracle 10g, with a database using the AL32UTF8 character set. The code example works without a problem.

All setups were tested using the latest Oracle 9i and Oracle 10g thin JDBC drivers (this includes using the 10g driver on a 9i database and vice versa).

Using normal CLOB fields in Oracle 9i also works without any problems, apart from the fact that international characters are discarded.
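For reference, the createTemporary() construction in question looks roughly like this (a minimal sketch; the DOCS table, the ID/BODY columns and the connection details are just placeholders, and I'm using putString() to fill the temporary LOB):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import oracle.sql.CLOB;

public class TempClobInsert {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
            "jdbc:oracle:thin:@host:1521:orcl", "scott", "tiger");
        conn.setAutoCommit(false);

        String text = "imagine several hundred KB of text here...";

        // Create a session-duration temporary CLOB and fill it.
        CLOB clob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
        clob.putString(1L, text);

        // Works when BODY is a CLOB column; on 9i an NCLOB column fails
        // at execute time with ORA-12704: character set mismatch.
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO DOCS (ID, BODY) VALUES (?, ?)");
        ps.setInt(1, 1);
        ps.setClob(2, clob);
        ps.executeUpdate();

        ps.close();
        CLOB.freeTemporary(clob);
        conn.commit();
        conn.close();
    }
}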

> Client side, as well as NCLOB.
>
> Just a stab in the dark - you are aware of the fact that Java only
> supports very few character sets?

Yes, but I do not see how it is relevant what Java supports, when the Oracle database is the one that is unable to convert the standard Java character set to one of its own.

This is very likely a driver-related problem, as the driver is able to carry out the character set translation correctly when using setString() of PreparedStatement. That function, however, is severely limited in the size of the NCLOB you can set (this is an Oracle-specific limit; other databases do not impose such a limit in their drivers), and you need to switch to streaming functions to use very large CLOBs/NCLOBs.
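The setString() path, which does translate correctly but only works for small values, looks roughly like this (again a sketch; the table/column names are placeholders, and setFormOfUse()/FORM_NCHAR is the Oracle driver extension for binding values in the national character set):

import java.sql.Connection;
import java.sql.PreparedStatement;

import oracle.jdbc.OraclePreparedStatement;

public class SmallNclobInsert {
    public static void insert(Connection conn, int id, String text) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO DOCS (ID, BODY) VALUES (?, ?)");

        // Tell the driver the second bind is in the national character set.
        ((OraclePreparedStatement) ps).setFormOfUse(2,
            OraclePreparedStatement.FORM_NCHAR);

        ps.setInt(1, id);
        // Fine for small values; large NCLOBs hit the driver's bind size limit.
        ps.setString(2, text);
        ps.executeUpdate();
        ps.close();
    }
}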

The driver uses a temporary table into which the CLOB data is streamed. The CLOBs in this table are then referred to in the INSERT statement (this makes sense when you are inserting or updating multiple records that modify a CLOB field with one statement).

I suspect that the temporary table used to store CLOBs in Oracle 9i does not support NCLOBs at all (or perhaps only with a specific database character set). In Oracle 10g there's no problem whatsoever.

I hope this clears it up a bit.

--
John Hendrikx
Received on Wed Apr 27 2005 - 07:00:09 CDT
