Re: XML: The good, the bad, and the ugly

From: Laconic2 <laconic2_at_comcast.net>
Date: Tue, 19 Oct 2004 11:41:59 -0400
Message-ID: <rLmdnbeok4RJqOjcRVn-jg_at_comcast.com>


"Marshall Spight" <mspight_at_dnai.com> wrote in message news:Kfadd.268949$MQ5.40263_at_attbi_s52...
> "Laconic2" <laconic2_at_comcast.net> wrote in message
news:yuqdnX9nnbV2ZOncRVn-hA_at_comcast.com...
> >
> > "Marshall Spight" <mspight_at_dnai.com> wrote in message
> > news:WR1dd.418094$Fg5.11784_at_attbi_s53...
> >
> > > Agreed! The way LISP people respond in newsgroups has
> > > been an enormous turn-off for me, and has probably led
> > > me to pay less attention to the language than it deserves.
> >
> > Well, I probably paid less attention to Java than it deserves!
>
> Java is fairly cool. It's maybe not the most boffo thing ever
> to come down the pike, but it's cool. One thing about Java:
> we can plot the "uptake curve" of a language as a measure
> of popular success-- and Java has the steepest, highest uptake
> curve of any programming language. It rose from nothing
> to the number two spot for new development in about 5
> years, something no other language has ever done. Java
> deserves attention even if for no other reason than that.
>

Another language with a steep uptake curve was HTML.

A device with a steep uptake curve was the FAX machine. That is, the uptake curve was steep after the
body of existing FAX users reached "critical mass". Prior to that, the uptake curve was abysmally slow.

>
> > I'm not sure just what you mean by "compile time".
>
> It's a fairly standard term; I'm surprised at your unsurety. Let's say
> "ahead of time." A program is statically typed if you can run it
> through an analysis phase (which is typically a fast, guaranteed-to-halt
> process) and *prove* the absence of a certain broad category of
> errors.
>

I'm sure what you mean by "compile time" in languages like COBOL or C++, and dozens in between.

The reason I'm not sure what "compile time" means in the context of Lisp is that Lisp is an interpreted language. I guess some variants of Lisp were made to be compiled, but I'm thinking of classic Lisp, which was interpreted.

Then there's Java which is a little of both.
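
To make that distinction concrete, here's a minimal Java sketch (the class and variable names are mine, purely for illustration). The first assignment would be rejected by javac before the program ever runs; the second compiles cleanly and only fails when that line actually executes:

    // TypeDemo.java -- illustrative only
    public class TypeDemo {
        public static void main(String[] args) {
            // Static check: javac rejects this line outright,
            // before the program ever runs.
            // int n = "forty-two";     // compile-time type error

            // Dynamic check: this compiles fine; the type error
            // surfaces only at run time, as a ClassCastException.
            Object o = "forty-two";
            Integer n = (Integer) o;    // run-time type error
            System.out.println(n);
        }
    }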

>
> > If you mean the
> > conversion of the character based expression of the same language into
> > the internal tree representation of the same expressions, that's not
> > generally referred to as "compile time".
>
> Uh, it certainly "generally" is. The term is used that way in every textbook
> I can think of, and in every programming system I've ever used; on every
> newsgroup and on every discussion board. That may not be universal,
> but it is certainly "general."
>
Again, I meant "generally" with regard to Lisp.
>
> > The LISP people generally called that function the "loader". I'm not sure
> > what the formal name was.
>
> Ah, well.
>
>
> > The loader had to understand types. Literal values in the source code had
> > to be converted to objects of the correct type.
> > As far as "atoms" goes, the only Lisp variant I knew well enough to comment
> > on, didn't do it the way any compiled language would have. The type of an
> > atom was "atom". The type of the contents of the atom went with the
> > contents, not with the atom.
>
> This is not static analysis. This is a description of an untyped language,
> or what is popularly called "dynamically typed"-- which is a way of
> saying type errors are detected at the last possible moment.
> I consider this a defect in the design, although this issue is by no
> means settled; there are vocal advocates of dynamic typing, a
> few of whom I respect.

Since I'm thinking of Lisp as interpreted, you could call that "dynamic code generation" if you like.
And, dang, if you do that, then it's dynamically typed.

> Yeah; I dunno. It never seemed to me like the term "polymorphic" could
> properly be applied to code that had no static type information, because
> it couldn't be said to have a "morph" in the first place.
>

On the contrary. AFAIK, polymorphism in OO systems is implemented by deferring the binding between operator and method until run time. Am I wrong here?
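
For what it's worth, here's how that plays out even in a statically typed language like Java (the shape classes are invented for the example): the declared type of the expression is fixed ahead of time, but which method body actually runs is chosen at run time, per object:

    // Dispatch.java -- illustrative only
    abstract class Shape {
        abstract double area();   // the "operator"; bound to a method later
    }

    class Circle extends Shape {
        double r;
        Circle(double r) { this.r = r; }
        double area() { return Math.PI * r * r; }
    }

    class Square extends Shape {
        double s;
        Square(double s) { this.s = s; }
        double area() { return s * s; }
    }

    public class Dispatch {
        public static void main(String[] args) {
            // Statically, each element is just a Shape; the binding of
            // area() to Circle.area or Square.area happens at run time.
            Shape[] shapes = { new Circle(1.0), new Square(2.0) };
            for (Shape sh : shapes)
                System.out.println(sh.area());
        }
    }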

>
> This isn't a particularly impressive claim; languages that do not do
> at least this corrupt data and continue.
>
I wasn't trying to impress. SQL does type checking at run time; at least, all the implementations of SQL that I've worked with do. If they didn't, a query that copied an integer in the database into a floating-point variable would make a real mess! Are there any implementations that are that bad?!
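
Here's a sketch of what I mean, as seen from Java/JDBC; the table t, its INTEGER column n, and the connection URL are all made up for the example. The driver checks and converts the value at run time, row by row:

    // JdbcDemo.java -- illustrative sketch; table, column, and URL invented
    import java.sql.*;

    public class JdbcDemo {
        public static void main(String[] args) throws SQLException {
            try (Connection con = DriverManager.getConnection("jdbc:...");
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT n FROM t")) {
                while (rs.next()) {
                    // n is an INTEGER in the database, but the driver
                    // checks and converts it at run time, as each row
                    // is fetched -- not when the query text is prepared.
                    double x = rs.getDouble("n");
                    System.out.println(x);
                }
            }
        }
    }
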
> > > I'm not sure if XML makes the grade. It's designed for storing data.
> >
> > I disagree, but I could be wrong.
> >
> > I guess I'm going to have to read up on what the designers intended.
>
> I'm going to backpedal here a bit, and admit that the original intent
> of the designers is not ultimately of tremendous importance; the
> analysis one does in hindsight is generally better.

Well, in this case, I'm going to insist that if the original designers built something for "data in transit" and the devotees used it for "data under management", then it was the devotees who turned a minor success into a major disaster, not the designers.

BTW, one of the criteria for a "true hacker" some 35 years ago was the ability to use a system in a way that the system designer would have specifically disallowed or disavowed. With regard to security systems, the implications are obvious.
But using a programming language to accomplish a programming challenge that the authors of the language would have called impossible falls under a "neat hack"!

In that sense, I would call the programming examples that go with Joe Celko's early article in "Intelligent Enterprise" magazine a "neat SQL hack". Except I think he'd take it as an insult, rather than a compliment.

Received on Tue Oct 19 2004 - 17:41:59 CEST
