Re: Problem with Unicode Strings

From: joel garry <>
Date: Wed, 16 Apr 2008 11:28:40 -0700 (PDT)
Message-ID: <>

On Apr 15, 8:26 am, "André Hartmann" <> wrote:
> "Carlos" <> schrieb im
> On Apr 15, 3:04 pm, "André Hartmann" <>
> wrote:>I cannot see the point of storing UNICODE (UTF8? UTF16?) in a
> >WE8MSWIN1252 database... (you don't specify the NLSupport codepage for
> >The DB codepage should be unicode  (AL32UTF8).
>   Hi, if that is so (Unicode cannot be stored in databases that do not
> have a Unicode character set), then why can I create tables with Unicode
> columns (NCHAR, NVARCHAR2, NCLOB) in such databases? Wouldn't it be more
> appropriate for Oracle to raise errors when I try? To put it the
> other way round: the sheer fact that it is possible to declare Unicode
> columns in the database implied to me that it is possible to store such
> values. Am I wrong here?

You are missing the point about Oracle being helpful and friendly, and way, way helpful and way, way, way friendly, about converting from one character set to another. Most tools honor that conversion, but some don't. The general way to avoid an unwanted conversion is to have the proper NLS environment, as well as the proper character set. In general, using a character set that does not support what you are putting into it will cause you grief one way or another. Unicode is designed to handle any character set (subject to which Unicode version you are using - there are many version-dependent issues about that).
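To make the "grief" concrete without an Oracle instance handy: WE8MSWIN1252 essentially corresponds to Windows code page 1252, so Python's cp1252 codec can sketch what happens when Unicode data is pushed through a character set that cannot represent it. This is just the codec, not Oracle itself, but the failure modes are the same in kind: a strict conversion errors out, and a lenient one (which is roughly what a mismatched NLS client setup amounts to) silently destroys the data.

```python
# A WE8MSWIN1252-like codepage can hold Western European characters,
# but not arbitrary Unicode. cp1252 stands in for it here.
text = "Grüße € Ω"  # German umlauts and the euro sign fit; Greek omega does not

# The umlauts, sharp s, and euro sign all encode fine...
print("Grüße €".encode("cp1252"))  # b'Gr\xfc\xdfe \x80'

# ...but the omega makes a strict conversion fail outright,
try:
    text.encode("cp1252")
except UnicodeEncodeError as e:
    print("strict conversion failed:", e.reason)

# while a lenient conversion silently replaces it - and once the data
# is stored that way, the original character is gone for good.
mangled = text.encode("cp1252", errors="replace").decode("cp1252")
print("lenient conversion:", mangled)  # 'Grüße € ?'
```

The replacement-character case is the insidious one: nothing errors, and you only discover the corruption when you read the data back.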

It is not an error because it is a feature: Oracle is able to handle different character sets, and that puts it on you to understand the implications. Please read the docs about globalization, as well as the Metalink notes that help you understand NLS. It can get quite involved, though the answer for a particular situation usually winds up being simple.


Received on Wed Apr 16 2008 - 13:28:40 CDT
