Re: Normalize until neat, Automate until Complete

From: Kenneth Downs <firstinit.lastname_at_lastnameplusfam.net>
Date: Wed, 24 Nov 2004 22:21:28 -0500
Message-ID: <pmvf72-di5.ln1_at_pluto.downsfam.net>


Jan Hidders wrote:

> Kenneth Downs wrote:

>> 
>> hmmm, I must be one of those practitioners who has only a shallow
>> understanding of the theory.  As far as I can tell, relational algebra
>> and relational calculus are both about querying existing data (though I
>> am willing to be corrected).

>
> It's rather hard to query non-existent data.
>
>> Under what terms might I find the theoretical
>> foundation for the *generation* of information out of other information?

>
> Depends on what you mean by theoretical foundation. Defining derived
> data is in essence no different from querying data. Why do you think
> some special distinct "theoretical foundation" would be required here?
>

I suppose I mean a system of primitive operations that can be used to compose data of arbitrary complexity and that will follow predictable behaviors. I would assume that picking those primitives would take some thought, as opposed to simply saying, "hey, we have a speed problem on the gonkulator report, better materialize the commonly derived columns!" That approach, while realistic, offers no guidance on how to avoid the problem in the future. Worse, the suggestion that these hard-working folk should "normalize till it hurts" tends to guarantee they will get into trouble again and again.
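To make the distinction concrete, here is a minimal sketch (using Python's sqlite3 module; the table and column names are illustrative, not from the thread) of the two approaches being contrasted: deriving a value on demand through a view, versus materializing it as a stored column whose maintenance rule is fixed by triggers rather than by ad-hoc report fixes.

```python
import sqlite3

# Hypothetical order_lines table with a derived value: extended = qty * price.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE order_lines (
    order_id  INTEGER,
    qty       INTEGER,
    price     REAL,
    extended  REAL          -- materialized derived column
);

-- Option 1: derive on demand. Defining derived data is just a query.
CREATE VIEW order_lines_v AS
    SELECT order_id, qty, price, qty * price AS extended_calc
    FROM order_lines;

-- Option 2: materialize it, with triggers enforcing one predictable rule
-- instead of leaving each report to recompute (or forget) the formula.
CREATE TRIGGER ol_ins AFTER INSERT ON order_lines BEGIN
    UPDATE order_lines SET extended = NEW.qty * NEW.price
    WHERE rowid = NEW.rowid;
END;
CREATE TRIGGER ol_upd AFTER UPDATE OF qty, price ON order_lines BEGIN
    UPDATE order_lines SET extended = NEW.qty * NEW.price
    WHERE rowid = NEW.rowid;
END;
""")

conn.execute("INSERT INTO order_lines (order_id, qty, price) VALUES (1, 3, 2.5)")
view_val = conn.execute("SELECT extended_calc FROM order_lines_v").fetchone()[0]
stored_val = conn.execute("SELECT extended FROM order_lines").fetchone()[0]
print(view_val, stored_val)  # both derive 3 * 2.5 = 7.5
```

The point of the sketch is that both columns are produced by the same primitive (a multiplication over the row), which is why one might hope for a theory that treats derivation and querying uniformly rather than as separate disciplines.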

In my search of Google I have found only one sentence on one page that offered advice on this; all the others just said don't do it. Nevertheless, people here have often visited islands of information that are not readily visible in Google's sea of data.

-- 
Kenneth Downs
<?php $sig_block="Variable scope? What's that?";?>
Received on Thu Nov 25 2004 - 04:21:28 CET