NVL() degrades the performance [message #329114]
Tue, 24 June 2008 03:15
prachi.mgorwadkar
Messages: 75  Registered: May 2007  Location: Pune
Member
Hello all,
I want to know whether the NVL() function degrades performance.
If so, is a NOT NULL constraint always the better choice?
How and why does NVL() degrade performance?
Thanks and regards,
Prachi G.
Re: NVL() degrades the performance [message #329116 is a reply to message #329114]
Tue, 24 June 2008 03:22
pablolee
Messages: 2882  Registered: May 2007  Location: Scotland
Senior Member
NOT NULL and NVL are not comparable. They perform different tasks; their only commonality is that both are designed specifically with nulls in mind.
Quote: "I want to know, whether NVL() function degrades the performance."
Why not test it and see?
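One way to test it yourself is a sketch like the following, assuming a scratch schema where you can create objects (the table and column names here are invented for illustration):

```sql
-- Hypothetical test table: one nullable column, one NOT NULL with a default.
CREATE TABLE nvl_test (
  val_nullable  NUMBER,
  val_not_null  NUMBER DEFAULT 0 NOT NULL
);

-- Populate 100,000 rows, with every tenth nullable value set to NULL.
INSERT INTO nvl_test (val_nullable, val_not_null)
SELECT CASE WHEN MOD(LEVEL, 10) = 0 THEN NULL ELSE LEVEL END,
       LEVEL
FROM   dual
CONNECT BY LEVEL <= 100000;
COMMIT;

-- Compare timings and plans (e.g. SET TIMING ON and AUTOTRACE in SQL*Plus):
SELECT SUM(NVL(val_nullable, 0)) FROM nvl_test;
SELECT SUM(val_not_null)         FROM nvl_test;
```

Run each query several times and compare elapsed time and execution plans; exact numbers will depend on your version, hardware, and statistics.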
Re: NVL() degrades the performance [message #329570 is a reply to message #329114]
Wed, 25 June 2008 21:57
TheSingerman
Messages: 49  Registered: April 2008  Location: Brighton, Michigan
Member
This question reminds me of a system I once worked on. There were a couple of tables with columns that logically should have been defined as NOT NULL with a default value, except that this was too much work for the designer. Instead, he let NULL represent the default value. Since the default value was not supposed to be NULL, this led to a lot of NVL and DECODE calls scattered throughout the queries.
All of which is a terrible practice. Here are a few of the reasons why:
- Whatever function you use, be it NVL, NVL2, COALESCE, DECODE, CASE, or something else, takes longer to run than a simple fetch of a data item.
- The semantics of a predicate on a nullable column differ from those on a NOT NULL column, and the optimizer respects those differences. This means that some efficient execution plans will never be used, even if the column never actually contains a null.
- You will forget and leave out an NVL some day, and get a wrong result.
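The optimizer point can be seen with a sketch like the following (table and index names are invented for illustration; the plans you actually get will vary with version and statistics):

```sql
-- Assume a status column where NULL is used to mean 'NEW' by convention.
CREATE TABLE orders_demo (
  id     NUMBER PRIMARY KEY,
  status VARCHAR2(10)          -- nullable
);
CREATE INDEX orders_demo_status_ix ON orders_demo (status);

-- Wrapping the column in NVL() hides it from the plain index, so a full
-- table scan is likely. (Entirely-null keys are not stored in a single-column
-- B-tree index, so the NULL rows could not be found there anyway.)
SELECT id FROM orders_demo WHERE NVL(status, 'NEW') = 'NEW';

-- With a NOT NULL DEFAULT 'NEW' design, the plain predicate can use the index:
-- status VARCHAR2(10) DEFAULT 'NEW' NOT NULL
SELECT id FROM orders_demo WHERE status = 'NEW';
```

Comparing the EXPLAIN PLAN output for the two queries makes the difference concrete.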