Re: Differences in Range Scan Performance between Binary and NLS Indexes

From: Pat <>
Date: Wed, 7 May 2008 12:29:06 -0700 (PDT)
Message-ID: <>

On May 7, 11:29 am, joel garry <> wrote:

> Well, maybe something else is going on, like applying some semantics
> to every row... been too long since I worked with this stuff to do
> anything but guess.
> jg

Thanks for the update; I was wondering the same thing, i.e. whether running in linguistic mode was dramatically increasing the cost of each index comparison. It almost smells like it's running a regex or something instead of some form of bitwise compare (pure speculation on my part).

The main thing that's bugging me is that running in nls mode (NLS_COMP=LINGUISTIC), even with linguistic (nls) indexes in place, seems to utterly torch my performance (an order of magnitude slower). I've seen a number of comments around the web to the effect of "oh, yeah, don't use nls mode, performance sucks".
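For anyone following along, here's roughly the setup I mean. This is a sketch with made-up table and column names (customers/name), and GENERIC_M_CI is just one example linguistic sort; substitute whatever sort your application actually uses:

```sql
-- Hypothetical example: a linguistic index is an NLSSORT-based
-- function index matching the session's NLS_SORT setting.
CREATE INDEX customers_name_ling_ix
  ON customers (NLSSORT(name, 'NLS_SORT=GENERIC_M_CI'));

-- Both session settings have to line up with the index
-- before the optimizer will consider it:
ALTER SESSION SET NLS_COMP = LINGUISTIC;
ALTER SESSION SET NLS_SORT = GENERIC_M_CI;

-- In principle, a range predicate can then use the NLSSORT
-- index rather than falling back to a full scan:
SELECT name FROM customers WHERE name BETWEEN 'M' AND 'N';
```

Even with all of that in place, this is the configuration where I'm seeing the order-of-magnitude slowdown.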

All the Oracle documentation, though, seems to say "use nls mode, it's a wonderful feature that we recommend".

I can't seem to find any kind of official statement from Oracle as to whether I should *expect* bad performance like this in nls mode, or if this is a bug I'm hitting, or if I'm doing something wrong.

At this point, I've tentatively decided that performance in nls mode is just plain bad and there's nothing I can do about it, other than running the session in nls_comp=binary mode and using lower() on my various query terms to get the right results back (in US English, at least). I'd love to be proven wrong here, though.
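Concretely, the fallback I'm describing looks something like this (again with hypothetical table/column names, and a bind variable :search_term standing in for the query input):

```sql
-- Workaround sketch: keep fast binary comparison semantics
-- and fold case explicitly instead of using linguistic mode.
ALTER SESSION SET NLS_COMP = BINARY;

-- A function-based index on LOWER(name) lets the case-folded
-- predicate still drive an index range scan:
CREATE INDEX customers_name_lower_ix ON customers (LOWER(name));

SELECT name
  FROM customers
 WHERE LOWER(name) = LOWER(:search_term);
```

This obviously only buys case-insensitivity, not accent-insensitivity or proper multilingual collation, which is why I'd rather get nls mode performing acceptably.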
