Re: Bob's 'Self-aggrandizing ignorant' Count: Was: What databases have taught me

From: Bob Badour <bbadour_at_pei.sympatico.ca>
Date: Thu, 29 Jun 2006 08:24:51 GMT
Message-ID: <n7Mog.3572$pu3.87240_at_ursa-nb00s0.nbnet.nb.ca>


Keith H Duggar wrote:

> Bob Badour wrote:
>

>>Keith H Duggar wrote:

>
> First, your response to Adrian Alston was excellent. I'm
> finding your perspective on OO quite insightful. You're
> helping me to better understand some of the annoyances
> that started driving me away from OO in the first place.
>
>>>Ok. At the moment I have to agree that "just an
>>>arbitrary and ad hoc" is thus far unjustified. I don't
>>>know anything about the origins of Simula nor of those
>>>early days of OO.  So maybe Bob is right; but, yes he
>>>hasn't demonstrated the "arbitrary and ad hoc" yet.
>>
>>It is a falsifiable statement. I leave it to the reader to
>>verify for himself or to contradict for himself. If the
>>creation was not arbitrary, what principles drove the
>>design of language features? If the creation was not ad
>>hoc, what drove the need for the computational model?

>
> Were you asking about OO generally or C++ specifically? I
> can't comment on the general origin of OO. If the creation
> was driven by principles and if OO was meeting general
> needs, hopefully someone will explain so here.

I really don't think it was principle-driven, and certainly not theory-driven. Some principles may have shaped things a little, but both OO and C++ strike me as rather arbitrary and ad hoc. If any part of OO had a theoretical basis, it probably came from Tony Hoare's record classes.

Sadly, I have not been able to find an online version of Hoare's paper, an important founding document for an entire genre of programming languages. I only know of it from the references Dijkstra made to it.

Contrast that with Codd, who mapped an entire existing branch of mathematics onto a general problem, as much to theorize about the problem as to create software. The later work on normalization, for instance, and the proof of the equivalence between the set-based relational algebra and the predicate-calculus-based relational calculus demonstrate the success of that approach.

>>>Furthermore, in the case of C++ and its particular OO
>>>concepts, I don't think "arbitrary and ad hoc" is
>>>justified. Stroustrup put a great deal of thought and
>>>design into the C++ version of OO concepts. As did many
>>>other people during its evolution.
>>
>>Can you enumerate the principles used and the impetuses
>>for inclusion of new features? Did a concern for
>>correctness ever drive the addition of a feature? Did any
>>theory or branch of mathematics drive the addition or
>>removal of any feature?

>
> Stroustrup's D&E book covers this quite well. As the title
> suggests he covers the design and evolution of a number of
> C++ features.

I have a copy of it somewhere. I will have to open it sometime and read it more carefully.

> Generally he enumerates the principles (lack
> of overhead, compatibility with C, ease of implementation,
> correctness, etc) and the impetuses (efficiency, type
> safety, expressivity, etc) behind features.

But as I recall, he created C++ mainly for the advantages it offered in writing simulations.

> And yes (if I
> understand you correctly) the concern for correctness did
> drive the addition of some features. For example, the new
> C++ style cast operators were meant to improve type safety
> over C style casts, to discourage casting, and to make their
> use more conspicuous.

I suppose it is arguable whether that is ad hoc. I tend toward the position that it is: "We noticed a problem with A, so let's add B" is different from "Let's design a type-safe language."
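
For anyone following along who has not used the newer casts, here is a small sketch of the difference Keith describes; the types are made up for illustration:

    struct Base { virtual ~Base() {} };
    struct Derived : Base { void extra() {} };

    void f(Base* b) {
        // C-style cast: terse, hard to search for, completely unchecked.
        Derived* d1 = (Derived*)b;

        // C++-style casts: deliberately verbose, easy to grep for, and
        // dynamic_cast actually checks the object's type at run time,
        // yielding a null pointer if b does not point to a Derived.
        Derived* d2 = static_cast<Derived*>(b);
        Derived* d3 = dynamic_cast<Derived*>(b);

        (void)d1; (void)d2; (void)d3; // silence unused-variable warnings
    }

Whether bolting those operators onto C's existing cast counts as design or as another "we noticed a problem with A, so let's add B" is, I suppose, exactly the question.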

> And no it doesn't seem (at least from
> D&E) that any theory or mathematics drove the addition or
> removal of features. But that's just my reading, perhaps
> I missed something?

I doubt you missed anything.

> Did I understand your questions properly? If not perhaps
> you or someone else who has read D&E and/or has more direct
> knowledge can answer with respect to C++ better than I.

It sounds ad hoc to me. Whether that is arbitrary is arguable, I suppose.

>>OO is just an arbitrary and ad hoc collection of
>>features. Those features are useful for creating large
>>unpredictable state machines out of small predictable
>>state machines.

>
> It's the second sentence that I find insightful, even though
> it's more of a simple historical fact as you point out. The
> OO feature set drawn from a genesis in simulation does, as
> you cogently explain in your response to Adrian, seem to
> partly influence thinking in ways antithetical to good
> software design.

At the same time, the observation offers some insight into the risks of OO and suggests at least one criterion for good OO design: avoid excessive or overly complex state.
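
A trivial sketch of what I mean, with names invented for the example: the same computation written first as a little state machine and then with no retained state at all.

    #include <vector>

    // Accumulates hidden, mutable state: its answer depends on the entire
    // history of add() calls, which is exactly what makes large compositions
    // of such objects hard to predict.
    class Averager {
        double sum = 0.0;
        int count = 0;
    public:
        void add(double x) { sum += x; ++count; }
        double average() const { return count ? sum / count : 0.0; }
    };

    // The same computation as a pure function: the same input always
    // produces the same output, and nothing else changes.
    double average(const std::vector<double>& xs) {
        double sum = 0.0;
        for (double x : xs) sum += x;
        return xs.empty() ? 0.0 : sum / xs.size();
    }

The first is a perfectly good small, predictable state machine; the trouble starts when a program is built out of thousands of them all mutating one another.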

>>That they are ad hoc is supported by the origin of OO in
>>Simula, which was not created as a general programming
>>language. It was an ad hoc simulation language that was
>>later used for other things too.

>
> That's very interesting. I know nothing of the origin of
> Simula and should definitely learn more. Do you have any
> good references by chance?

Not really. If you do some googling, you will find some sketchy details. Dijkstra mentions it in his EWDs, in reference to some of Tony Hoare's work on dealing with pointers that inspired the development of Simula. One of the people who worked on Simula published a very broad history of its development.

>>That the features are arbitrary is supported by the
>>prevalence of OO languages omitting one or another feature
>>from the computational model or adding one or another
>>feature to the computational model.

>
> There does seem to be disagreement and confusion in the OO
> community about which features are OO. Witness the various
> unresolved flame wars about which languages are "OO" "truly
> OO" etc. In contrast, it seems quite well resolved and clear
> that SQL for example is not "truly relational". Perhaps an
> "object mentor" can step in and provide the current clearly
> expressed well-defined OO feature set consensus.

The RM has controversial areas too. However, the controversies lie specifically in those areas where we lack any sound and useful theory--most notably missing information. There is widespread agreement that data are represented as values in relations, that integrity constraints are expressible as wffs, and that an RDBMS must use some language at least as expressive as the relational algebra or, equivalently, the relational calculus.
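
To make that concrete with a deliberately trivial example (the relation names are invented here), the constraint "every employee's department must exist" can be written as a wff or, equivalently, as a relational-algebra expression required to yield the empty relation:

    \forall e \in Employee \; \exists d \in Dept \; (e.dept = d.dept)

    \pi_{dept}(Employee) \setminus \pi_{dept}(Dept) = \varnothing

Any language at least as expressive as the algebra or the calculus can express it, which is precisely the kind of agreed core the OO feature set lacks.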

>>That the features are useful for creating large
>>unpredictable state machines is supported by too much
>>evidence to enumerate so I will merely point to the ad hoc
>>need that drove the creation of Simula in the first place.

>
> Small nitpick to check we have the same understanding of "ad
> hoc". Should that have been "the need that drove the ad hoc
> creation of ..."? I'm not seeing how needs can be ad hoc.

http://dictionary.reference.com/browse/ad%20hoc

adj. "Formed for or concerned with one specific purpose"

adv. "for the particular end or case at hand without consideration of wider application"

The needs are not necessarily ad hoc. Whether a feature was added in ad hoc fashion depends largely on the needs driving one to add it. "We tried to do A and were frustrated by our tool so we added B to the tool" sounds ad hoc to me.

Likewise, "We need to make a simulation and it's hard to do in current languages. Let's make a special language for programming our simulation." is similarly ad hoc. Received on Thu Jun 29 2006 - 10:24:51 CEST
