Re: Extending my question. Was: The relational model and relational algebra - why did SQL become the industry standard?

From: Bob Badour <bbadour_at_golden.net>
Date: Tue, 11 Mar 2003 15:35:45 -0500
Message-ID: <Jdsba.116$Rj7.6721596_at_mantis.golden.net>


"Lauri Pietarinen" <lauri.pietarinen_at_atbusiness.com> wrote in message news:3E6E29F0.1050601_at_atbusiness.com...
> Bob Badour wrote:
>
> >>The context in which this question arose was the analogy that Lauri made
> >>between GOTO vs. WHILE and tuple bags vs. tuple sets and the main issue
> >>was whether the analogy is a valid one, esp. w.r.t. query optimization.
> >
> >It was a valid analogy of increased abstraction. Perhaps Lauri does not
> >know enough about compiler optimization techniques to realize that the
> >specific analogy does not offer any optimization advantages, which means
> >the analogy may have been a poor one. Then again, I seem to recall that
> >Lauri already admitted as much.
> >
> What I had in mind was some of the "nice" tricks you can do in PL/I, like
>
> DCL I FIXED BINARY;
> DCL L(10) LABEL;
>
> L(1) = THIS;
> L(2) = THAT;
> L(3) = HERE;
> L(4) = THERE;
> ... etc...
>
> I = MYFUNCTION(5);
>
> GOTO L(I);
>
> THIS: PUT LIST('HERE I AM');
> RETURN;
> THAT: PUT LIST('THERE I AM');
>
> GOTO THIS;
>
> etc...
>
> But, I suppose those are just a piece of cake to implement
> (and optimize?) as I have been kindly instructed...

The above PL/I example looks more like a case statement than a while loop. No matter how good the language, though, it cannot prevent a determined and clever programmer from obfuscating code to the detriment of both human and machine understanding. Heck, just look at some of the "nested sets" crap that appears online from time to time.
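For what it's worth, the label-array dispatch is exactly a case statement in disguise. A minimal C rendering of the same trick (the body of myfunction is invented here just to make it run):

    #include <stdio.h>

    /* Stand-in for Lauri's MYFUNCTION; the body is made up. */
    static int myfunction(int x) { return x % 4 + 1; }

    int main(void)
    {
        /* The GOTO L(I) trick, written as the case statement it is. */
        switch (myfunction(5)) {
        case 1: puts("HERE I AM");  break;
        case 2: puts("THERE I AM"); break;
        /* ... etc ... */
        default: break;
        }
        return 0;
    }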

I am not sure where my "dragon" book is. It's been years since I last read it, but I seem to recall the authors remarking that algorithmic replacement is the optimization offering the greatest benefit. Unfortunately, it is also the most difficult to implement for imperative procedural languages. (Read: It ain't ever gonna happen.)

I seem to recall that the example given was the difficulty of identifying an inefficient algorithm, bubble-sort for instance, and of knowing the available replacements, quick-sort for instance. Just think of all the ways--more than a few good and more than a few bad--of expressing bubble-sort as a sequence of steps.
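For illustration, here are two C routines of my own devising that both implement bubble-sort. A compiler attempting algorithmic replacement would first have to prove these are the same algorithm before it could swap in quick-sort:

    #include <stddef.h>

    /* Bubble-sort, textbook form: repeat passes until nothing swaps. */
    void bubble1(int a[], size_t n)
    {
        int swapped = 1;
        while (swapped) {
            swapped = 0;
            for (size_t i = 1; i < n; i++)
                if (a[i - 1] > a[i]) {
                    int t = a[i - 1]; a[i - 1] = a[i]; a[i] = t;
                    swapped = 1;
                }
        }
    }

    /* The same algorithm as two counted loops with a shrinking bound.
       Same behaviour, very different shape on the page. */
    void bubble2(int a[], size_t n)
    {
        for (size_t pass = 0; pass + 1 < n; pass++)
            for (size_t i = 0; i + 1 < n - pass; i++)
                if (a[i] > a[i + 1]) {
                    int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                }
    }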

When the level of abstraction rises above algorithms, algorithmic replacement becomes algorithmic choice. Instead of describing a sequence of steps implementing some sort, if we simply ask for an ordered result, we can automate the choice of algorithm to get the desired result for the least cost. For instance, depending on the cardinality, the compiler can choose different sort algorithms or detect that the result is already sorted without any effort at all. Or the choice of algorithm might depend on whether the penultimate result is partially ordered.
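A crude sketch in C of that kind of choice; the threshold of 32 and the particular algorithms are invented for illustration, where a real optimizer would work from cost estimates:

    #include <stdlib.h>

    static int cmp_int(const void *p, const void *q)
    {
        int a = *(const int *)p, b = *(const int *)q;
        return (a > b) - (a < b);
    }

    /* The caller states intent only: "give me an ordered result".
       The implementation chooses an algorithm from what it observes. */
    void sort_ints(int a[], size_t n)
    {
        size_t i;

        /* Already ordered?  The cheapest algorithm is no algorithm. */
        for (i = 1; i < n && a[i - 1] <= a[i]; i++)
            ;
        if (n < 2 || i >= n)
            return;

        if (n < 32) {
            /* Small cardinality: a simple insertion sort wins. */
            for (size_t j = 1; j < n; j++) {
                int key = a[j];
                size_t k = j;
                while (k > 0 && a[k - 1] > key) {
                    a[k] = a[k - 1];
                    k--;
                }
                a[k] = key;
            }
        } else {
            /* Otherwise defer to the library's general-purpose sort. */
            qsort(a, n, sizeof a[0], cmp_int);
        }
    }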

Increased levels of abstraction improve human understanding regardless of any opportunities for optimization. Increased levels of abstraction allow us to express what we want much closer to the level of intent, which is necessarily clearer for us as humans. We express what we want and let some automated machine figure out how to do it. (I believe this is the inspiration for the title of Date's book _What Not How_.)

Even though the while/goto examples offer no compiler optimization opportunities, the while loop's higher level of abstraction still optimizes something. The while loop lets us express the repetition a step closer to our level of intent, which matters because the repetition is not immediately apparent from Jan's if/goto example and requires some thought and analysis to recognize.

The while loop is also less error-prone. Consider the size of the change and the difference in behaviour if the GOTO 1210 in Jan's example were erroneously entered as GOTO 1200. It completely changes the algorithm but hardly affects the appearance of the code at all.
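I don't have Jan's post in front of me, but a C rendering in the same spirit, with labels standing in for his BASIC-style line numbers (my guess at the shape of his example), shows how little the page changes when the algorithm changes completely:

    #include <stdio.h>

    int main(void)
    {
        int i;

        /* goto form: the loop exists only in the reader's head.
           Retype the goto at the bottom as L1200 and the program
           resets i on every pass and prints 0 forever -- a
           one-character change to the text, a total change to the
           algorithm. */
    L1200:
        i = 0;
    L1210:
        if (i >= 10)
            goto L1220;
        printf("%d\n", i);
        i++;
        goto L1210;
    L1220:

        /* while form: the extent of the repetition is explicit,
           and there is no label to mistype. */
        i = 0;
        while (i < 10) {
            printf("%d\n", i);
            i++;
        }
        return 0;
    }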

The WHILE loop requires less human interpretation and introduces no "off-by-one" risk for missing the proper boundaries of the loop. Even though the WHILE loop offers the compiler no optimization advantage, it does optimize the human resource.

The set-based relational model optimizes both human and machine resources relative to bag-based models.

Jan is just being contrary to suit his emotional needs.
