Oracle FAQ Your Portal to the Oracle Knowledge Grid
HOME | ASK QUESTION | ADD INFO | SEARCH | E-MAIL US
 


Re: Surrogate numeric keys versus natural character keys.

From: <bob_jenkins_at_burtleburtle.net>
Date: 2000/08/12
Message-ID: <8n26gh$g4k$1@nnrp1.deja.com>#1/1

In article <3991847F.85B3B5DC_at_hotmail.com>, Alan Byrne <albyrne5_at_hotmail.com> wrote: ...
> Let's say we have a table containing customer information as follows
>
> CREATE TABLE cust
> (
>   cust_number VARCHAR2(30),
>   cust_name VARCHAR2(100),
>   CONSTRAINT cust_pk PRIMARY KEY (cust_number)
> )
...
> Oracle uses UNICODE to encode characters,
> so that means 16 bits per character, right?
...

I haven't seen it mentioned elsewhere in the thread: the CHAR and VARCHAR2 datatypes use the database character set, which is required to contain the C compiler character set as a subset. That means if most of your data is alphanumeric, it'll be stored as one byte per character. You need NCHAR/NVARCHAR2 if you want to persuade Oracle to store alphanumerics as two bytes apiece.
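A quick way to see the size difference -- a sketch in Python rather than SQL, since the exact byte counts in Oracle depend on which database and national character sets you picked:

```python
# Illustrative only: byte cost of an alphanumeric string in a
# single-byte character set versus a 16-bit Unicode encoding.
# "CUST-0042" is a made-up customer number, not from the thread.
cust_number = "CUST-0042"

single_byte = cust_number.encode("ascii")      # 1 byte per character
two_byte = cust_number.encode("utf-16-be")     # 2 bytes per character

print(len(single_byte))   # 9
print(len(two_byte))      # 18
```

The same 9-character key costs twice the storage the moment every character is forced to two bytes, which is the whole point of keeping alphanumeric keys in CHAR/VARCHAR2.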

Also, Oracle uses the same comparison function for all datatypes -- it compares one byte at a time. Oracle NUMBERs are stored in a funny format that requires 3 bytes to store 1 and 4 bytes to store 9999, but the format is designed so that this bytewise comparison still sorts numbers correctly.
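To illustrate how a format like that can stay bytewise-comparable, here is a hypothetical Python sketch of an Oracle-style encoding for positive integers: one exponent byte (counting base-100 digit positions), then each base-100 digit plus one. The specific byte values are an assumption; the byte counts in the post presumably also include the length byte that precedes each value on disk.

```python
def encode_number(n):
    """Hypothetical sketch of an Oracle-style NUMBER encoding for
    positive integers.  Bigger exponents and bigger digits map to
    bigger bytes, so plain unsigned bytewise comparison of the
    encodings sorts them in numeric order."""
    assert n > 0
    digits = []                      # base-100 digits, most significant first
    while n > 0:
        digits.append(n % 100)
        n //= 100
    digits.reverse()
    exponent = len(digits)           # power of 100
    while digits and digits[-1] == 0:
        digits.pop()                 # trailing zero digits aren't stored
    return bytes([192 + exponent] + [d + 1 for d in digits])

values = [1, 2, 99, 100, 101, 9999, 10000]
encodings = [encode_number(v) for v in values]
assert encodings == sorted(encodings)   # bytewise order == numeric order
print([e.hex() for e in encodings])
```

Note that 1 encodes to two payload bytes and 9999 to three; add the length byte and you get the 3 and 4 bytes mentioned above.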

Sent via Deja.com http://www.deja.com/
Before you buy.

Received on Sat Aug 12 2000 - 00:00:00 CDT

