Re: The word "symbol"

From: dawn <dawnwolthuis_at_gmail.com>
Date: 15 Aug 2005 14:25:12 -0700
Message-ID: <1124141112.236046.31270_at_g44g2000cwa.googlegroups.com>


Marshall Spight wrote:
> dawn wrote:
> > Marshall Spight wrote:
> > > dawn wrote:
> > > >
> > > > The question of whether to model integers used within software as
> > > > subclasses of strings, for example, makes sense when we understand that
> > > > 1234 is not a number, but a symbol for one, just as "David" is not a
> > > > name, but a symbol for one (in response to VC's question about whether
> > > > "symbol" and "name" are synonyms).
> > >
> > > this is not correct, though. 1234 *is* a number.
> >
> > I disagree. Even if there were no symbol for the number 2, one could
> > still have 2 sheep (as in the number 2).

>

> You're mixing the levels again. 2 is a number-- it is not
> a part of the real world; it's a *concept.* Two sheep is not 2.

I agree that two is a concept and it is that concept that I would call a number. I can represent that number several different ways, one of which would be by writing a 2. The 2 that I write is not the number, just as "two" is not a number. These are symbols representing the number. The signifier and the signified are separate. Now, I can imagine that one could define the symbol of 2 to BE the number, in which case numbers would be symbols rather than concepts. That might be helpful in some cases, but in this case I find it helpful to recognize that computer data are symbols -- signifiers.

Just as the data model for a Product is not the product and the data values for Product attributes are not the product, the value of the attribute "quantity" of a product in an Order is not the quantity -- it is a symbol for the quantity. Although it would be a bit perverse, we could decide that the symbol 3, when it appears as a value, means a quantity of 3 million. And I suspect we have all seen values such as 999 serve as signifiers for something other than nine hundred ninety-nine.
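That sentinel idea can be sketched in a few lines (a minimal illustration, assuming Python; the 999-means-unknown convention is invented here for the example, not taken from any real schema):

```python
# Hypothetical legacy convention: in a quantity field, 999 is a sentinel
# meaning "unknown", not the number nine hundred ninety-nine.  The mapping
# from symbol to meaning lives in the program (and the humans who wrote
# it), not in the stored value itself.
UNKNOWN_SENTINEL = 999

def interpret_quantity(raw):
    """Map the stored symbol to the quantity it signifies."""
    return None if raw == UNKNOWN_SENTINEL else raw

assert interpret_quantity(3) == 3        # 3 signifies three
assert interpret_quantity(999) is None   # 999 signifies "unknown"
```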

>

> > It is only when I need to
> > communicate this information that I need a symbol or signifier of some
> > kind. That could be a sound or visual symbol.
>
> Sure. The symbol and the thing it symbolizes are not connected.

There is a mapping from one to the other -- they are connected by such a function, but that function is not in the computer -- it is in the human(s).

> In the case of numbers, they could not even be said to be on the
> same plane of existence. Numbers do not exist in *any* plane of
> existence; they are exclusively conceptual in nature.

They exist as concepts -- does that work?

> When you have
> two sheep, you have some animals; you don't then physically possess
> any numeric concepts.

If I have a quantity of 10 (symbol) items, then there are (hopefully) ten (conceptual) items. The concepts for numbers are part of the real world being modeled by the data model.

>

> > > '1234' is a symbol
> > > for a number. you have to be clear about the distinction.
> >
> > yes, indeed. There is the number 1234 and then the signifier of 1234
> > for this number and then a signifier for a string with the characters
> > '1234' which does not signify the number.
> >
> > > '1234'
> > > is a string, but 1234 is an integer.
> >
> > 1234 is a representation of an integer, a signifier for an integer,
> > although we often speak less formally and say that it IS an integer.
>
> No, that's not correct. 1234 is an integer.

You could define the symbol to BE the concept and, again, it might be helpful to do that. I don't see numeric data as being so drastically different from word data, however. I see both as symbols, representations, referring to something else. A user types a 2 into a data field -- they are thinking of the concept of two. The program can read it in as the string '2' and, if it passes the test as a valid symbol for a number, cast it to a numeric type with a value of 2. The cast does not change it from being a reference to the number/concept the user had in mind into actually being that number. In both cases, it is a symbol.
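That entry-field flow might be sketched like this (a minimal illustration in Python; the function name and validation rule are assumptions for the example, not anything from the thread):

```python
def parse_quantity(field_text: str) -> int:
    """Validate that the entered text is a well-formed integer symbol,
    then convert it.  Note that both the str '2' and the int 2 are
    machine representations (symbols); neither is the concept of two."""
    symbol = field_text.strip()
    # allow an optional leading sign, then require only digits
    sign_free = symbol[1:] if symbol[:1] in ("+", "-") else symbol
    if not sign_free.isdigit():
        raise ValueError(f"{field_text!r} is not a valid integer symbol")
    return int(symbol)

print(parse_quantity("2"))     # prints 2 -- another symbol, not the concept
```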

> When I am talking or writing, all of the words that I use are to
> be interpreted as the symbolized concept, and not as a word.

exactly

> When
> one wants to refer specifically to the un-dereferenced symbol, the
> convention in English is to put the word in quotes. Thus: 1234
> is an integer, but "1234" is a symbol.

1234 is a symbol of a number such as might be useful with a quantity and "1234" is a symbol of a string, such as might be useful with a person's ID.

> If you want to get hinkey, you can talk about '"1234"' as being
> a symbol for a symbol, but there is rarely much useful that can
> be done at such a level.

I agree that is not very useful. I disagree that all symbols are designated with quotation marks.

>

> > When a user puts that signifier for a number into an input field in a
> > form, someone has to make it clear (e.g. with a cast) that this
> > signifies a number.
>

> "Cast" is a poor choice of word, here, because it means something
> very specific in type theory, and very very few people consider
> integer to be a subtype of string.

An integer symbol is one of the types of symbols one can have with a string of characters. Most people have no problem seeing all data entry fields as strings that are converted, cast (whatever) to and from other types. There isn't anything so peculiar about such forms. When putting together a proposition, I'm pouring strings of many different types into one larger string of type "proposition". Each component of this string is also a string, but can be interpreted as other types (sub-types) too.
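The view of entry fields as strings converted to and from other types could be sketched like so (assuming Python; the converter table and helper names are hypothetical, just one way to make the idea concrete):

```python
from datetime import date

def read_field(text: str, target_type):
    """Interpret a field's string symbol as a value of target_type."""
    converters = {
        int: int,
        float: float,
        date: date.fromisoformat,
        str: str,
    }
    return converters[target_type](text)

def write_field(value) -> str:
    """Serialize a typed value back into a string symbol for display."""
    return str(value)

qty = read_field("1234", int)   # the int 1234
back = write_field(qty)         # "1234" again -- same symbol, new encoding
```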

I'll admit I haven't followed this line of thought all the way through to see what it means for the theory of computer language development. I'm just saying that it makes sense, when working with data, to view propositions and their components in this way. Then the UI isn't treated so differently from other places where we read and write modeled data.

> (I actually only know of one
> such person, but there could be others.) "Conversion" would be
> a term more people could agree with,

Yes, I didn't read ahead and used that word above. So, I'm good with that.

> and it also includes the
> idea of subtyping, so even that one person I know ought to be
> okay with it.

It almost sounds like you do know her.

>

> > > Symbols and the things symbolized
> > > do not in general participate in a subtype relationship.
> >
> > '1234' is a string. 1234 is a string that signifies the number
> > represented by 1234.
>

> So '1234' and 1234 are both strings? And 1234 represents a number,
> but is not a number? Is the number it represents also a string?
> How would you write just the number?

1234

You write a number by using a symbol for it, right?

>

> > > (David is
> > > not a subtype of 'David'.)
> >
> > Right. 'David' is a string. David is a person. I cannot put David,
> > the person, into the computer, so I model him with data including his
> > first name of 'David'.
>

> This paragraph gets it right.
>
>

> > I cannot put a number into a computer either,
> > so I model them with string values as well.
>

> How is it that you are not able to put numbers into a computer but
> you *can* put string values into one?

I can put symbols representing numbers into the computer and symbols representing names (for example) into a computer -- so perhaps if I don't call them strings but "symbols" it will make more sense? The symbols happen to be strings of 0's and 1's, or pop up from that to strings of ASCII or Unicode values. We could call all of these strings numbers if you want to define it that way (since they all look like numbers at some level), but they are all the same -- symbols representing something else. It isn't that they are symbols in the case of names but the actual thing in the case of numbers, in my opinion. They are at the same level for both -- symbols that represent parts of a proposition. --dawn

>
> Marshall
Received on Mon Aug 15 2005 - 23:25:12 CEST
