Re: what are keys and surrogates?

From: David BL <davidbl_at_iinet.net.au>
Date: Thu, 10 Jan 2008 18:13:58 -0800 (PST)
Message-ID: <61ea6075-3ef6-4db9-8e6c-0d2f0c7ec013_at_q77g2000hsh.googlegroups.com>


On Jan 10, 4:02 pm, Marshall <marshall.spi..._at_gmail.com> wrote:
> On Jan 9, 9:07 pm, David BL <davi..._at_iinet.net.au> wrote:
>
> > On Jan 10, 11:44 am, Marshall <marshall.spi..._at_gmail.com> wrote:
> > > On Jan 9, 5:26 pm, David BL <davi..._at_iinet.net.au> wrote:
>
> > > > > I'd just say, a la SML:
>
> > > > > product(sum(5, 6), x)
>
> > > > So is the name of a relation part of its value?
>
> > > No. The "product" and "sum" above are what are usually called
> > > "constructors." The above is a value. You can put it in a relation
> > > if you want. (Probably I introduced the confusion by talking about
> > > union types immediately after talking about RVAs.)
>
> > Is "constructor" the same as what C. Date calls a "selector"?
>
> Yes. Date calls it a selector, and the entire rest of the world
> calls it a constructor. :-)

I think selection (i.e. specification) of a value is a different concept from construction of an object (or variable, if you prefer). It seems that Date wants to emphasise the distinction.
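
For what it's worth, here is how I read "constructor" in the quoted example, sketched in Haskell rather than SML (all names are illustrative): Sum and Product build values of an algebraic data type, and pattern matching takes such values apart again.

    data Expr
        = Lit Int            -- an atomic value such as 5 or 6
        | Var String         -- a named unknown such as x
        | Sum Expr Expr      -- corresponds to sum(5, 6)
        | Product Expr Expr  -- corresponds to product(sum(5, 6), x)
        deriving (Show, Eq)

    -- product(sum(5, 6), x) from the quoted example:
    example :: Expr
    example = Product (Sum (Lit 5) (Lit 6)) (Var "x")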

> > > > If "relational" encompasses too wide a spectrum then the word starts
> > > > to become rather meaningless. It's not meant to have any bearing on
> > > > how one solves problems - I'm not suggesting that RVAs are evil in any
> > > > sense!
>
> > > Okay. But let me be explicit: in my mind the "ideal" programming
> > > language encompasses different "pure" computing models into
> > > a single hybrid computing model. That model will necessarily
> > > include relations, (because they are so awesome) and it will
> > > also necessarily include other things, such as first class functions,
> > > because they too are awesome.
>
> > I completely agree.
>
> Cool.
>
> An interesting note, by the way: functions are relations...

Isn't it more precise to say that the graph of a function is a relation?
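
To make that concrete, a small Haskell sketch (graphOf and square are illustrative names) of the graph of a function over a finite domain, represented as a set of (input, output) pairs, i.e. a binary relation:

    import qualified Data.Set as Set

    -- The graph of f over a finite domain: a set of (input, output) pairs,
    -- i.e. a binary relation.
    graphOf :: (Ord a, Ord b) => (a -> b) -> [a] -> Set.Set (a, b)
    graphOf f domain = Set.fromList [ (x, f x) | x <- domain ]

    square :: Int -> Int
    square x = x * x

    -- graphOf square [0..3] == fromList [(0,0),(1,1),(2,4),(3,9)]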

> > > Well, my own thoughts in this area are evolving and may
> > > sound pretty weird at this point. Roughly: there are two "kinds"
> > > of values: relations and atoms. The type of a relation is
> > > exclusively the set of its attribute names. Atoms don't have
> > > types, however we may specify a set of atoms in such a
> > > way as looks very much like what is usually called a type.
> > > (Thus: we have the set of natural numbers, and we have
> > > the set of integers, and N is a subset of Z. And we can
> > > tell of an atom whether it belongs to N, etc.) In addition
> > > we have constraints, and the constraint language is
> > > Turing complete (and hence not decidable.) We can
> > > specify a constraint on a variable, in which case it is
> > > a runtime construct, or we may specify a constraint
> > > on immutable constructs, in which case they should
> > > be checked at compile time. Under this scheme much
> > > of the machinery usually associated with types gets
> > > taken over by constraints.
>
> > How do operators fit into that picture - given that a type is
> > conventionally a set together with the operators available on that
> > set?
>
> The picture includes functions. Functions have in and out
> parameters. There are no methods.

Have you considered the idea that a function maps an input tuple to an output tuple? So

    f x

represents a tuple which is the application of function f to tuple x, and

    f (1,2)

is seen as consistent with that notation where (1,2) is a constructor/selector for a tuple value.

I have wondered whether this, combined with some operators providing direct support for tuple manipulation (such as flattening a tuple nested within a tuple), could form the basis for a hybrid language that encompasses the relational, functional and imperative "computing models". Not my area of expertise, though!
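
Roughly what I have in mind, as a Haskell sketch (addAndDiff and flatten3 are purely illustrative names):

    -- A function from one tuple to one tuple, so addAndDiff (1, 2) reads
    -- exactly like "f (1,2)" above, with (1, 2) constructing the argument
    -- tuple rather than being two separate arguments.
    addAndDiff :: (Int, Int) -> (Int, Int)
    addAndDiff (x, y) = (x + y, x - y)

    -- One example of a direct tuple-manipulation operator: flattening a
    -- tuple nested within a tuple.
    flatten3 :: ((a, b), c) -> (a, b, c)
    flatten3 ((a, b), c) = (a, b, c)

    -- addAndDiff (1, 2)                 == (3, -1)
    -- flatten3 (addAndDiff (1, 2), 10)  == (3, -1, 10)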

> (I also support var parameters and closures, but they
> don't qualify as true functions. I'm still not sure what
> limitations they should have. Clearly we don't want
> back doors to the constraint system via mutable relation
> values, to list just one of the many ways to screw up.)
>
> I have no strong feelings about encapsulated ADTs; what
> Date calls ... uh. Shit. I can't remember what he calls them.
> I don't entirely see the reason for them. Performance I guess?
>
> I don't see a big reason for a close tying of functions to types.
> CLOS seems like an interesting idea. Maybe it's a bit *too*
> configurable, though.
>
> I think it would be neat if you could, for example, have
> declarations like
>
> +:(x:num, y:num -> z:num)
> +:(x:int, y:int -> z:int)

I see that as part and parcel of the important idea of compile-time algorithm selection, made available through overloading on type.
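
A rough Haskell analogue of those two declarations, using a type class so that the compiler picks the implementation from the argument types at compile time (the class name Plus is illustrative only; Haskell's own Num class plays a similar role):

    class Plus a where
        plus :: a -> a -> a

    instance Plus Integer where
        plus = (+)    -- exact, arbitrary-precision addition

    instance Plus Double where
        plus = (+)    -- floating-point addition

    -- plus (2 :: Integer) 3      selects the Integer instance
    -- plus (2.5 :: Double) 3.0   selects the Double instance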

It also reminds me of an interesting point. Consider that int64, int32 and int16 are rings for modulo arithmetic. Would you say that, according to the concept of value-type inheritance espoused by Date, we would want int16 to be a subtype of int32, which in turn is a subtype of int64? This is possible if, for example, int32 inherits the int64 + operator (which can yield an int64 result) as well as defining its own independent modulo-2^32 version of the + operator.

i.e. the following two operators are distinct:

   + : (x:int64, y:int64 -> z:int64)
   + : (x:int32, y:int32 -> z:int32)

A function that accepts int64 arguments and uses the first + operator will work correctly when passed int32 values, so it's reasonable to support implicit conversion of int32 values to int64 values.
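
As a sketch in Haskell (addMod32, addMod64 and widen are illustrative names), the fixed-width types already behave this way: each width wraps around under its own + operator, while widening from int32 to int64 never loses information.

    import Data.Int (Int32, Int64)

    addMod32 :: Int32 -> Int32 -> Int32
    addMod32 = (+)    -- addition modulo 2^32

    addMod64 :: Int64 -> Int64 -> Int64
    addMod64 = (+)    -- addition modulo 2^64

    -- Widening never loses information, which is what makes an implicit
    -- int32 -> int64 conversion reasonable; the reverse direction is not.
    widen :: Int32 -> Int64
    widen = fromIntegral

    -- addMod32 maxBound 1          == minBound     (wraps past 2^31 - 1)
    -- addMod64 (widen maxBound) 1  == 2147483648   (no wrap in int64)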

> and have the system know from this that it can add any
> two numbers, but also that it can (efficiently) add two
> integers, and that when you add two integers you always
> get an integer.

I wonder how modulo arithmetic fits into that picture (where the + operators are logically distinct).

> > > This is just my own particular way of thinking; it doesn't
> > > represent anything more than that.
>
> > > > > > Would there be some concept of
> > > > > > inheritance for relation types (by analogy to an OO implementation of
> > > > > > an AST that uses a Node base class)?
>
> > > > > I'm inclined to say NOOOOOOO!!!!!</vader>
>
> > > > I wonder then how all the relevant constraints would be specified.
>
> > > Given a Turing complete constraint language, any constraint
> > > that can be computed can be expressed. I don't think you
> > > can do any better than that. :-) Of course, this approach brings
> > > its own issues ...
>
> > Like needing Ctrl+C to abort the compiler!
>
> Well, yeah, that's the danger. There are conservative ways
> to handle that sort of thing, though. The compiler can be smart
> enough to know when a constraint isn't within the bounds that
> it can prove it can decide. And it can have resource limits.
>
> It's possible to consider some constraints as proof obligations.
> And the compiler can try to satisfy that obligation and if it
> can't do so within a fixed period of time, that can be an
> error. And there is also the possibility of the (sophisticated)
> programmer supplying the proof, as an adjunct to the
> source code.
>
> The papers I've read suggest that the distinct majority of
> proof obligations can be met automatically.
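
That "resource limits" idea suggests to me something like the following Haskell sketch (all names illustrative): run each constraint check with a fixed step budget, and treat an exhausted budget as an unmet proof obligation rather than as a refuted constraint.

    data Verdict = Holds | Fails | Unknown deriving (Show, Eq)

    -- Check that predicate p holds for every candidate, giving up once the
    -- step budget is exhausted (the candidate list may be infinite).
    boundedForAll :: Int -> (a -> Bool) -> [a] -> Verdict
    boundedForAll budget p = go budget
      where
        go _ []                 = Holds
        go 0 _                  = Unknown
        go n (x:xs) | p x       = go (n - 1) xs
                    | otherwise = Fails

    -- boundedForAll 1000 even [2,4..]   == Unknown   (budget exhausted)
    -- boundedForAll 1000 even [2,4,5]   == Fails
    -- boundedForAll 1000 even [2,4,6]   == Holds
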
Received on Fri Jan 11 2008 - 03:13:58 CET
