Oracle FAQ Your Portal to the Oracle Knowledge Grid

Re: 16Bit character set question

From: Carsten Saager <carsten_at_saager.org>
Date: Wed, 28 May 2003 20:32:47 +0200
Message-ID: <bb2vdu$5896a$1@ID-74580.news.dfncis.de>

"porschberg" <thomas.porschberg_at_osp-dd.de> schrieb im Newsbeitrag news:8d9566f5.0305280013.7ee10814_at_posting.google.com...
> char-array/std::string (C++/OTL), character string (perl) and
> String (java) and checked the length of these fetched strings.
> The result is always 2, not 4 as I expected.

Java "char" is UTF-16 (aka a "short"), hence your Katakana/Kanji charaters will consume 1 char each.

>
> Further I checked the numeric values for the single bytes in
> the C/Perl string. The unsigned value is identical (191) for
> both bytes.

Looks like the "replacement character" that is transferred when your client is set to a character set that does not contain the characters received.
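The value 191 is 0xBF, which in ISO-8859-1-style character sets decodes to '¿' (inverted question mark) — the character Oracle commonly substitutes for data it cannot convert to the client character set. A small sketch of the arithmetic:

```java
import java.nio.charset.StandardCharsets;

public class ReplacementByte {
    public static void main(String[] args) {
        byte b = (byte) 191; // the byte value observed in the C/Perl string

        // Interpreted as unsigned, the byte is 191 (0xBF)
        System.out.println(b & 0xFF); // 191

        // Decoded as ISO-8859-1, 0xBF maps to U+00BF, the inverted question mark
        String s = new String(new byte[] { b }, StandardCharsets.ISO_8859_1);
        System.out.println((int) s.charAt(0)); // 191
        System.out.println(s); // ¿ (console encoding permitting)
    }
}
```

Seeing the same 0xBF for both bytes is therefore a strong hint that both original characters were replaced during conversion, not fetched intact.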

> What must I be aware on client side (OCI/OTL,DBD::Oracle,JDBC)
> if the database stores its VARCHARS in multi-byte?

Make sure that the NLS character-set parameters on the client (the character-set component of NLS_LANG) name a character set that is the same as, or a superset of, the one your database runs with.
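For the OCI-based clients (C++/OTL, DBD::Oracle) this is controlled by the NLS_LANG environment variable, whose component after the dot is the client character set. The values below are illustrative, not taken from the original post:

```shell
# The client character set is the part after the dot; AL32UTF8 (Unicode,
# a superset of most character sets) is a common choice -- pick one that
# actually covers the data your database stores.
export NLS_LANG=AMERICAN_AMERICA.AL32UTF8

# Note: the thin JDBC driver does not read NLS_LANG; it converts
# database data to Java's UTF-16 strings internally.
```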

Regards,

Carsten

Received on Wed May 28 2003 - 13:32:47 CDT

