
Newsgroups: comp.databases.theory

Re: So what's null then if it's not nothing?

From: Hugo Kornelis <hugo_at_pe_NO_rFact.in_SPAM_fo>
Date: Mon, 21 Nov 2005 00:44:45 +0100

On 20 Nov 2005 14:49:16 -0800, wrote:

>Hugo Kornelis wrote:

>> (snipped rest of Pick description - we were discussing NULL in SQL,
>> remember?)
>...and how it has been the subject of much debate and the source of
>great confusion for many people. It is a bad standard.

Hi Mike,

Because way too many people started writing half-baked study books about SQL when it became a booming business - people who were just in it for the buck, who didn't take the time to really study the subject matter, and who didn't have the patience to explain it properly.

"NULL means that the value is not known" is so much easier to write down and explain than the actual definition of NULL, and it also makes it so much simpler to explain the concepts and the logic of three-valued logic, that it's almost too tempting.

A lot of "cowboy authors" obviously didn;t resist that temptation. And other authors copied those descriptions (in their own words, of course) for their books. Ever expanding the number of books with incorrect or incomplete explanation of the concept of NULL. As shown by the very un-true statements you appear to have picked up here and there about NULL in SQL - probably from such books.

>I found this in the introduction to a book on samba:
>"A good standard survives because people know how to use it. People
>know how to use a standard when it is so transparent, so obvious, and
>so easy that it become invisible. And a standard becomes invisible only
>when the documentation describing how to deploy it is clear,
>unambiguous, and correct. These three elements must be present for a
>standard to be useful, allowing communication and interaction between
>two separate and distinct entities to occur without obvious effort.
>..... Clarity and unambiguity without correctness provide a technical
>nightmare. Correctness and clarity with ambiguity create maybe bits,
>and correctness and unambiguity without clarity provide a muddle
>through scenario."

So according to this quote, a "good" standard is "transparent, obvious and easy", and has documentation that is "clear, unambiguous and correct".

Well, the ANSI documents are far from clear and easy (perhaps because they strive too hard to be unambiguous?). And prohibitively expensive. I already discussed that many books that are clear and easy are also ambiguous or even incorrect.

As to the standard itself - if reality is not "transparent, obvious and easy", then I'll prefer a standard that allows me to model reality in its full complexity over a "transparent, obvious and easy" standard that can't capture reality with 100% accuracy.

>There has been little or no significant debate about null in Pick

Probably because the market for Pick books has never been large enough to attract the "cowboy authors" mentioned above.

> No need. It's clear.

Is it? In this thread, I've seen you write that Null in Pick is equal to the empty set. But I've also seen you write that it's equal to the empty string.

> There now exists the possibility for
>confusion among Pickies now too though - owing to the erroneous
>definition of null given in the UniVerse reference manual and quoted
>elsewhere in this thread. The source of that confusion stems from an
>attempt to fit in with SQL's standard definition.

Calling someone else's definition "erroneous" is, IMO, highly arrogant. Not something I'd expect from someone who just took someone else to task for showing "arrogance" in his post and for "daring to talk down" to you.

>Look - my final word on this. Null is and should be defined as an empty
>set. Not missing or absent or unknown or anything else.

Even more arrogance.

NULL is used by many people in many contexts. The first character in the ASCII character set is called Null - do you propose that the ASCII character set replace its first character with an empty set? In C (and other low-level languages), a NULL pointer is a pointer to nothing. Do you propose that all C compilers be rewritten to change a NULL pointer to an empty set?

(Yes, these were rhetorical questions)
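[Editor's aside, not part of the original post: the point that "null" names unrelated things in different systems can be sketched in a few lines of Python - the ASCII NUL character is a real one-byte value, while None plays the role of C's null pointer, and neither is an empty set.]

```python
# ASCII code point 0 is the NUL character: an actual character value.
nul = chr(0)
print(len(nul), ord(nul))        # 1 0

# Python's None is the closest analogue to C's NULL pointer:
# a reference to nothing.
ptr = None
print(ptr is None)               # True

# The empty set is yet another distinct object, equal to neither.
print(set() == None, nul == None)   # False False
```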

Why can't you just accept that SQL defines and uses Null in a different way than Pick?

SQL has chosen a definition that enables one to distinguish between NULL and an empty string. Unless I misunderstood your previous message, Pick has chosen a definition that doesn't allow this distinction.

The Pick definition would be very inappropriate for a SQL database. It would be disastrous in situations where an empty string is valid data, but absence of data might occur as well. But since Pick uses a different model to store its data, it doesn't need NULL to represent absence of a value. That gave the Pickies the freedom to give NULL a different definition - which they apparently did.
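[Editor's aside, not part of the original post: the distinction being described - an empty string as valid, present data versus NULL as absence of data - shows up directly in a minimal sqlite3 session.]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.execute("INSERT INTO t VALUES ('')")    # empty string: known, present value
conn.execute("INSERT INTO t VALUES (NULL)")  # NULL: no value at all

# The empty-string row matches an equality test; the NULL row does not.
print(conn.execute("SELECT COUNT(*) FROM t WHERE name = ''").fetchone()[0])     # 1
print(conn.execute("SELECT COUNT(*) FROM t WHERE name IS NULL").fetchone()[0])  # 1
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])                     # 2
```

A model that conflates the two (as the post says Pick's does) could not tell these rows apart.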

Neither SQL's nor Pick's definition is "right" or "wrong". They are different. SQL's definition is (IMO - YMMV) the best possible definition for an SQL db. Pick's definition might well be the best possible definition for a Pick db.

> I'll accept
>that it's not going to change in SQL-land - and that confusion will
>continue to reign. I'm aware of the problem now and can live with it.
>Really - it's not my problem.

And neither is it mine.

Best, Hugo


(Remove _NO_ and _SPAM_ to get my e-mail address)
Received on Sun Nov 20 2005 - 17:44:45 CST
