Re: The naive test for equality

From: vc <boston103_at_hotmail.com>
Date: 10 Aug 2005 07:11:18 -0700
Message-ID: <1123683078.904609.196930_at_g49g2000cwa.googlegroups.com>


David Cressey wrote:
> "vc" <boston103_at_hotmail.com> wrote in message
> news:1123619409.376468.14900_at_g44g2000cwa.googlegroups.com...
>
> > So now we have, in addition to 'representation', a new word 'symbol'.
> > What is even worse, in your vocabulary, it means two different things.
> > Nice..
>
> Nonsense. The word "symbol" has been used to describe the numbers managed
> by a computer ever since Ada Lovelace commented on the possible uses of
> Babbage's analytical engine. Just because the IT world has forgotten its
> own history doesn't mean this stuff is new.
>
> And, I'm using "symbol" with a consistent meaning, as far as I can see.

You wrote previously:

" The word "symbols" refers not only to the symbols used to exchange data
 between people and computers, but also to each of the data items inside the
 computer
"

To rephrase, you defined the word "symbol" as:

  1. something used for human consumption, presumably a string of characters on paper used to name the thing humans work with;
  2. the internal implementation of what are apparently the same things the computer works with.

Now you are giving yet another, even vaguer, definition of "symbol", namely "numbers managed by a computer". So which of the three is it to be?

What is interesting is that, whilst talking about symbols, representations and such, you have forgotten about the real thing, the value, which is of primary interest for modelling; the value's name and its implementation are important but secondary considerations. (I'd speculate that, in your vocabulary, a name corresponds to a representation and an implementation to an internal representation. I am not sure where the word "symbol" fits in, though.)
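
To make the distinction concrete, here is a minimal sketch (in Python, which is not part of this thread and is used purely for illustration): the same value can be written down under several different representations, and the test for equality is a test on the value, not on any of its names.

  # One value, twelve, denoted by three different written representations.
  a = int("12")    # decimal character string
  b = 0xC          # hexadecimal literal
  c = 0b1100       # binary literal

  # All three names denote the same value; the equality test compares
  # values, not the spellings used to write them down.
  assert a == b == c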

>
> >
> > >In other words, what the computer stores is all symbolic, right
> > > down to the most atomic symbols, zero and one.
> >
> > This is not true. What the computer uses to store numbers (and
> > characters) is called bits, not symbols. Besides, the way the
> > computer implements numbers and characters is entirely irrelevant at
> > the logical level.
>
> Bits are symbols.

If you mean that bits are characters on a piece of paper (or on a screen), then yes. Is that what you mean?
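
As a purely illustrative aside (again in Python, which does not appear in the original exchange), the earlier point that the internal implementation is irrelevant at the logical level can be sketched like this: the same value stored under two different bit patterns still compares equal once the values are recovered.

  import struct

  # The same value, 258, stored under two different physical encodings.
  big    = struct.pack(">i", 258)   # big-endian bytes:    b'\x00\x00\x01\x02'
  little = struct.pack("<i", 258)   # little-endian bytes: b'\x02\x01\x00\x00'

  # The stored bit patterns differ...
  assert big != little
  # ...but the value they encode is the same, and that is what the
  # logical test for equality is about.
  assert struct.unpack(">i", big)[0] == struct.unpack("<i", little)[0] == 258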

>
>
> And what makes you think the logical level is the only level?

Since I have never stated that "the logical level is the only level", the above phrase is a non sequitur.

>
> >
> > >
> > > When various "engines" (or "objects" if you prefer) inside a large
> system
> > > exchange data with each other (or "messages" if you prefer), they use
> > > symbols to communicate with each other.
> >
> > This phrase is so ambiguous as to be almost devoid of meaning. What are
> > "engines" and how do they "exchange data"? What precisely do you mean?
> >
> > Hardware components? Abstract structures communicating using some
> > protocol? Or something else?
> >
>
> All of the above.
>
>
> I give up.

Can your response be interpreted as your having no idea as to what you've been talking about in the above passage?

Received on Wed Aug 10 2005 - 16:11:18 CEST
