Re: no names allowed, we serve types only

From: Keith H Duggar <>
Date: Sun, 21 Feb 2010 17:07:32 -0800 (PST)
Message-ID: <>

On Feb 21, 2:33 pm, Tegiri Nenashi <> wrote:
> On Feb 21, 3:05 am, Jan Hidders <> wrote:
> > On 20 feb, 03:40, David BL <> wrote:
> > ... E.g. a relation has an attribute
> > > containing circles and you must allow it to be addressed using either
> > > circle or ellipse.
> > Indeed. But the header would contain only Ellipse, and all subtypes,
> > including Circle, would be implied. ...
> Ellipse-Circle example is unconvincing. Both are conic sections and it
> is natural to suggest that the design would greatly benefit from
> introducing a single class instead of many. The only objection is that
> certain methods being constrained to subtypes (such as Circle) might
> greatly benefit in performance. However, this is rarely a concern in
> practice with so called "object-oriented design" methodology, where
> not much thought is put into creating a wealth of new classes.
> The situation is mirrored for physical units. Here in the US the
> debate is still imperial vs. metric, where "more educated" crowd
> points out that metric is certainly superior because scientists are
> using it. Which scientists? It is as early as at physics undergraduate
> level that one learns that SI is not used in physics anymore and a
> system with 3 basic units (cm-gm-sec) is certainly superior. Later on,
> on theoretical physics level, this system is dumped in favor of
> dimensionless units where all fundamental constants are set equal to
> 1.
> Therefore, both types and units are seems to be artifacts of our
> limited perspective. As soon as we get better knowledge we get rid of
> them.

There is just too much vagueness above for me to extract salient, concrete points. For example, what exactly is meant by "greatly benefit", "performance", "certainly superior", "limited perspective", and "better knowledge"?

Out of curiosity, do you have any practical experience programming in strongly typed languages? Can you not think of even a single useful aspect of types in THAT context? You mention performance, yet you fail to mention some of the other big (arguably more important) benefits, such as type safety (entire categories of bugs eliminated from possibility), semantic expressiveness, etc.
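To make the type-safety point concrete, here is a minimal sketch (Python chosen purely for illustration; the Meters/Feet names are my own invention, not anything from this thread) of how distinct types rule out the unit-confusion class of bug that the imperial-vs-metric tangent above gestures at:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Meters:
    value: float
    def __add__(self, other):
        # Only Meters + Meters makes sense; mixing units is rejected outright.
        if not isinstance(other, Meters):
            raise TypeError("cannot add %r to Meters" % (other,))
        return Meters(self.value + other.value)

@dataclass(frozen=True)
class Feet:
    value: float

span = Meters(3.0) + Meters(4.0)   # fine: Meters(7.0)
try:
    Meters(3.0) + Feet(4.0)        # the unit-mixing class of bug, caught
except TypeError as e:
    print("caught:", e)
```

In a statically typed language the mixed-unit expression would not even compile; here it is caught at runtime, but either way the bug cannot silently produce a wrong number.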

> Math is different story. Types were created as a vehicle to avoid
> certain paradoxes. A typical view of somebody working in applied
> sciences is that borrowing a tool designed to avoid some obscure
> theoretical constructions is ridiculous.

Sorry, but the only thing ridiculous here is your assertion that "types were created ...". "Type" is just a word, and the concept of types dates back to antiquity. For example, the notion of "kinds" aka "types" of numbers, such as "integer", "rational", and "real", predates Russell's particular use of the word by thousands of years. Other work more closely related to the concept of type we are discussing here, such as group theory and abstract algebra, also has roots centuries older. Finally, humans have had the concept of "kinds" of things probably from the moment a human first realized their hand had a different purpose than their foot and that beef doesn't taste like chicken.

So the main source of the ranting above seems to be the confusion that when a computer scientist thinks of data types, they have anything at all in mind related to resolving Russell's paradox worries.

That both Russell and computer scientists find the same tool, types, useful for similar purposes, namely constraining language syntax (Russell for the languages of logic, computer scientists for programming languages), is of superficial importance, if any.

> To cite E.T.Jaynes, Appendix B
> Formalities and Mathematical Style who quotes Henri Poincare (1909):
> ""In the old days when people invented a new function they had some
> useful purpose in mind: now they invent them deliberately just to
> invalidate our ancestors' reasoning, and that is all they are ever
> going to get out of them."
> Indeed, this fad of artificially contrived mathematical pathology
> seems nearly to have run its course, and for just the reason that
> Poincare foresaw; nothing useful can be done with it."

Sadly (for the purposes of the rant), types have proven to be very useful to programmers. Though perhaps that is hard to appreciate if you are a mathematician whose only experience with types is Russell's "levels", rather than, say, a working programmer with practical experience of strongly typed programming languages.

> A similar situation happened in mathematical foundation area where a
> wealth of paradoxes were created, and, unlike analysis, these
> pathologies were instrumental for axiomatizing set theory. It is
> remarkable that a construction, which was created in such peculiar

You are confused. The notion of type was not created by Russell, and the notion of type employed in programming languages bears only a superficial similarity to Russell's use of types.

> circumstances, is one of the most profound ideas in CS.

Yes, 'tudes (attitudes) are always amusing, but they are without doubt something with which "nothing useful can be done".

KHD