Re: Examples of SQL anomalies?
Date: Fri, 11 Jul 2008 17:40:45 GMT
David Cressey wrote:
> "Roy Hann" <specially_at_processed.almost.meat> wrote in message
>> A database design that presumes to model reality instead of modeling
>> what we will actually say about reality is not fit for purpose. And if
>> a model of the latter is more complicated, tough. Simple but wrong is
>> not right.
> In my practice, what NULL meant, pretty much all the time, was roughly
> "In an ideal world, this NULL would not be here. Either the cell that
> contains it would be missing, or the cell that contains it would contain a
> value instead of a NULL. This isn't an ideal world. Let the reader deal
> with it."
> Many times, I dealt with schemas that I did not design, and applications I
> did not write. In those situations, NULL might have meant whatever those
> in control wanted it to mean.
I think a db schema that is amenable to the relational algebra must be an idealization. As Dr. Strangelove said, "that is the whole point". It doesn't matter whether the outside world is ideal or not.
Not saying people trying to make a living should decline work that involves db's that contain nulls. There are plenty of people willing to take their place. But they shouldn't pretend it is anything but a bastard as far as the RM proper is concerned. After all, db owners who aren't sure what values should be represented aren't likely to worry if they aren't sure what the answers mean.
In Codd's fundamental model, rows or tuples contain values. Null proponents take the algebra as is, then upset the apple cart by pretending that rows can contain something other than values, without changing their algebra, e.g., join, union and especially project. The argument seems to boil down to "well, my programming machinery allows me to introduce this physical effect, so why not?".
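To see the apple cart actually upset, here is a minimal sketch (using Python's stdlib sqlite3 purely as a convenient SQL engine; the table and column names are invented for illustration). In two-valued logic, every row in a restriction must satisfy either a predicate or its negation; once a row "contains" a null rather than a value, it can satisfy neither:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?)",
                 [("alice", "sales"), ("bob", None)])

# Under the classical algebra, restricting on P or on NOT P and
# taking the union must give back every row. With a NULL dept,
# both comparisons evaluate to UNKNOWN, so bob appears in neither.
rows = conn.execute(
    "SELECT name FROM emp WHERE dept = 'sales' OR dept <> 'sales'"
).fetchall()
print(rows)  # [('alice',)] -- bob vanishes; excluded middle fails
```

The point is not that SQL is internally inconsistent here (its three-valued logic is applied consistently), but that the result no longer follows from the two-valued algebra the operators were defined in.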
I'm not sure whether Codd saw his 'marks' as values or not, but suspect that he was deliberately stepping outside his algebra, adding a programming feature that wasn't part of the basic algebra. Like anybody, he couldn't ignore his influences; perhaps one was the hardware of the 1950s he was helping to design, which was replete with 'field marks', 'word marks' and so forth, all of which were accessible both to hardware and to programs. Personally, I don't find this any harder to reconcile than the presence of programmable exceptions in many current application languages. Strictly speaking, relational closure would mean that logical exceptions aren't possible in the first place, but many programming environments find it 'user-friendly' to short-circuit full and complete algebraic operations. In other words exceptions are themselves exceptions, ha ha.

Received on Fri Jul 11 2008 - 19:40:45 CEST