Re: Some Laws

From: Laconic2 <laconic2_at_comcast.net>
Date: Fri, 24 Sep 2004 15:39:07 -0400
Message-ID: <846dnQ49Utdr6sncRVn-pA_at_comcast.com>


"Aaron Sherman" <AaronJSherman_at_gmail.com> wrote in message news:a2d0070.0409231405.58422759_at_posting.google.com...
> "Laconic2" <laconic2_at_comcast.net> wrote in message
> news:<ubmdncUAr--WSszcRVn-rg_at_comcast.com>...
>
> > Yeh. The guy who tried to convince me that Java introduced the
> > automatic garbage collector. I think he had never heard of Lisp.
>
> There's a flaw in that reasoning.

What's the flaw?

>
> First, however, just to disclose, I'm not a fan of Java. I like its
> object model to a point, and it certainly is a usability improvement
> over C++ (C++ has other advantages as well).

Well, I am sort of a fan of Java, even though I've never written anything in it. In addition to the GC business, it picks up on the concept of a compiler and interpreter working in tandem, the so-called "virtual machine". That idea was used way back in UCSD Pascal, but it's an idea that deserved resurrection, particularly since it makes something else possible: portable object code.

Just as C made portable source code available to the "close to the iron" engineers (like the engineers who wrote Unix in C, and invented C in their spare time), so Java extends portability to where it's useful on the web.
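The compile-to-bytecode design described above isn't specific to the p-System or the JVM; CPython uses the same tandem, which makes it easy to poke at. A minimal sketch (Python, chosen here only because its compiler is exposed as a built-in, not because the post is about Python):

```python
# CPython, like UCSD Pascal's p-System and the JVM, compiles source
# into instructions for a virtual machine rather than for the bare CPU.
code = compile("1 + 2", "<example>", "eval")

print(type(code).__name__)   # the compiler's output is a 'code' object
print(eval(code))            # the VM then interprets those instructions, yielding 3
```

That separation is exactly what buys portable object code: the same bytecode can, in principle, be shipped to and run by any compatible interpreter, regardless of the underlying hardware.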

>
> Ok, on to your comment. Yes, garbage collection has been around for
> decades, and no one who knew more than 3 languages was shocked when
> Java came out and had garbage collection. Perhaps, some of us were
> shocked that it was as aggressive a GC scheme as it was, since we'd
> come to think of all things C-derived as "close to the metal", but
> that was just a mental block, not a valid observation.

Agreed.
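The point about aggressive collection can be made concrete in any garbage-collected language. A sketch in Python (used here purely for brevity; the mechanics in a JVM are analogous): objects are reclaimed once unreachable, even in a reference cycle, with no free() call anywhere.

```python
import gc
import weakref

class Node:
    """A small object-graph node; nothing here is ever freed by hand."""
    def __init__(self, name):
        self.name = name
        self.ref = None

# Build a reference cycle -- the classic case plain reference counting misses.
a = Node("a")
b = Node("b")
a.ref = b
b.ref = a

probe = weakref.ref(a)   # lets us observe collection without keeping 'a' alive
del a, b                 # drop all strong references from this frame
gc.collect()             # the collector reclaims the now-unreachable cycle

print(probe() is None)   # True: the cycle was collected automatically
```

The programmer's only job was to stop referring to the objects; deciding when and how to reclaim them is entirely the runtime's business.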

>
> However, the reason that Java (rightly) gets much of the credit here,
> is not that there wasn't amazingly valuable work done previously, but
> that Java managed to present GC in the context of a language that in
> turn presented all of the right bits that made millions of computer
> programmers want to use it.

OK, as a consolidation of other ideas that together make up a sufficient excuse for "yet another programming language", I have to buy what you say. I'm just objecting to the people who remain totally unaware of the "legacy" of earlier work.
More on "legacy" later.

>
> If you're interested in language design for pure research purposes,
> sometimes (and only sometimes) adoption doesn't matter, but when you
> want your language to be able to address the broad tapestry of
> software engineering and computer science tasks, you really do need to
> tackle the adoption problem.
>

Agreed.

> Java brought GC to the great mass of software engineers, and the only
> language to come within a couple *orders of magnitude* of that
> accomplishment previously was elisp (of EMCAS fame).

Someday the same thing will be said of Java. People will say, "less than one person in a thousand learned to program in Java. Once farfle came along, programming was at last within the reach of the common man."

Of course, farfle is going to have to be awfully simple. I remember all the people who couldn't program their VCR to do anything but blink "12:00" at the user.

BTW, did you mean "EMACS" fame? I'm unfamiliar with "EMCAS".

>
> Now, since this is a databases theory group, let me try to drag us
> back to topic... Java's success can be explained by a combination of
> timing, corporate acceptance of Sun as a backer and a widely needed
> combination of features. The first two are ignorable for my point, but
> the third seems to me to be a problem for which the relational model
> is ideally suited to describe. Has anyone ever tried to model the
> feature-sets of various programming languages (in terms of semantics
> and tools) and the needs of the environments in which they are used,
> and abstract from that the feature set of a hypothetically ideal
> language for each environment?
>

You have to add the web, although maybe that's part of "timing". Java was a natural for the web.

> I wonder what that would look like....

Me too.
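For what it's worth, the core of the question is relational division: find the languages whose feature set covers every need of a given environment. A toy sketch using SQLite from Python (all table names, features, and rows below are invented for illustration, not taken from any real survey):

```python
import sqlite3

# Hypothetical schema: what each language offers, and what each environment needs.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE language_feature (language TEXT, feature TEXT);
    CREATE TABLE environment_need (environment TEXT, feature TEXT);
    INSERT INTO language_feature VALUES
        ('Java', 'garbage collection'),
        ('Java', 'portable object code'),
        ('C',    'manual memory control');
    INSERT INTO environment_need VALUES
        ('web applet', 'garbage collection'),
        ('web applet', 'portable object code'),
        ('kernel',     'manual memory control');
""")

# Relational division: for each environment, which languages supply
# every one of its needed features?
rows = con.execute("""
    SELECT n.environment, f.language
    FROM environment_need AS n
    JOIN language_feature AS f USING (feature)
    GROUP BY n.environment, f.language
    HAVING COUNT(*) = (SELECT COUNT(*) FROM environment_need AS m
                       WHERE m.environment = n.environment)
""").fetchall()

for env, lang in sorted(rows):
    print(env, '->', lang)   # kernel -> C, web applet -> Java
```

The HAVING clause is what implements division: a language qualifies for an environment only when its matched-feature count equals that environment's total need count.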

Back to "legacy". As long as the word "legacy" is used as a pejorative in our field, our field is condemned to remain forever immature. I imagine that "legacy" was originally coined as a euphemism, but like most euphemisms, it eventually became a pejorative itself.

Our legacy includes Shannon and Turing, von Neumann, Nash and Wiener, Dijkstra, Knuth, and Wirth. Vannevar Bush, Eckert and Mauchly. Grace Hopper. Ed Codd, Date and Darwen. And so on, and so on. Why do we continue to let people graduate believing that the "stone age" ended 6 years ago?
