
Re: NLS_LENGTH_SEMANTICS=CHAR and column storage size

From: Laurenz Albe <invite_at_spam.to.invalid>
Date: 03 Nov 2005 09:25:12 GMT
Message-ID: <1131009911.422061@proxy.dienste.wien.at>


Frank van Bortel <frank.van.bortel_at_gmail.com> wrote:

>> My company considers using UNICODE databases with
>> NLS_LENGTH_SEMANTICS=CHAR to avoid problems inserting a value like
>> 'Länge' into a CHAR(5) or VARCHAR2(5) column.
>>  
>> Now I am concerned about space requirements of such a database
>> because I don't understand enough about the storage of such columns.

>
> In addition to what Sybrand already explained, if you are really
> stuck on UTF, take UTF16 - it's fixed 2 byte, and closest to
> what MS uses for the recent Windows products (assuming many
> applications will have a Windows front end).
> Meaning every character stored uses 2 bytes of space.
>
> But why bother at all?!? Disks are cheap

Thank you everyone for your replies.

I will consider using a single-byte character set. What made me think of UNICODE is that Oracle seems to recommend it for all purposes.

There were two conflicting opinions: one that 'Länge' will take 20 bytes in a CHAR(5 CHAR) column, and one that it will take 6 bytes. I'll search the documentation again; maybe I can find an answer.
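
Something like this quick test should settle it (just a sketch, assuming an AL32UTF8 database; the table is purely illustrative):

    CREATE TABLE len_test (c CHAR(5 CHAR), v VARCHAR2(5 CHAR));
    INSERT INTO len_test VALUES ('Länge', 'Länge');

    -- LENGTHB gives the length in bytes, VSIZE the stored size of the value
    SELECT LENGTHB(c) AS c_bytes, VSIZE(c) AS c_stored,
           LENGTHB(v) AS v_bytes, VSIZE(v) AS v_stored
      FROM len_test;

If I read the documentation right, both should come out at 6 bytes ('ä' taking 2 bytes in AL32UTF8), and 20 bytes would only be the maximum the column could ever need, not what it actually occupies.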

You suggested UTF16 - but UTF16 is only available for NCHAR columns.
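
If we did want UTF-16 for individual columns, it would have to go through the national character set, along these lines (assuming the national character set is AL16UTF16; the names are only illustrative):

    SELECT value FROM nls_database_parameters
     WHERE parameter = 'NLS_NCHAR_CHARACTERSET';

    CREATE TABLE ntest (n NVARCHAR2(5));
    INSERT INTO ntest VALUES (N'Länge');

    -- with AL16UTF16 every character here takes 2 bytes, so this should report 10
    SELECT LENGTHB(n) FROM ntest;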

We are migrating existing databases and don't want to change the column definitions, so that applications continue to work unchanged.

My concern about size is not because of the cost of disk space (although a SAN is not exactly cheap) but rather because of the duration of backups.

Yours,
Laurenz Albe

Received on Thu Nov 03 2005 - 03:25:12 CST
