Home -> Community -> Usenet -> comp.databases.theory -> Re: What databases have taught me
Bob Badour wrote:
> Keith H Duggar wrote:
> > Bob Badour wrote:
> > > I am speaking of data type as a concept. The integer
> > > data type has an unlimited number of operations
> > > defined on it. In most contexts, only a tiny subset
> > > of them are in scope.
> >
> > What I don't get is why said unlimited number of
> > operations defined /on/ integers are the definition /of/
> > integers.
>
> You are not being clear. You are using integers to mean
> both a set and a data type without specifying which you
> mean.
>
> Without any operations, the set is just a bunch of
> symbols. To manipulate those symbols, one must have
> operations.
I never said there are not "any operations". I said there need not be "an unlimited number of operations" to define a data type and do useful work.
> > You and I can communicate using integers, prove theorems
> > about integers, think about integers, etc by appealing
> > only to a very small subset of those unlimited
> > operations. So of what relevance are the remaining
> > unmentioned or undiscovered operations to our discussion
> > and thinking?
>
> The relevance is they are there for our use any time we
> want them. Without any operations, we cannot do anything
> meaningful with the set of integers. We cannot prove
> theorems about them at all.
Again, I never said there are not "any operations".
> It is only after we formally specify which operations we
> are using that we can do anything useful.
But what I am saying is that we define a data type by defining a /small/ number of operations. From then on, we can use those /defining/ operations to derive an arbitrary number of other operations without changing the definition of the data type.
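For example (a sketch of my own, not from the thread): take NAND as the single defining operation on the booleans. Negation, conjunction, and disjunction are then /derived/ operations, and deriving them changes nothing about the definition of the type.

```python
# The single defining operation. (The Python 'and'/'not' here are just
# the host language realizing the operation; every derived operation
# below uses only nand.)
def nand(a, b):
    return not (a and b)

# Derived operations -- none of them changes the definition of the type.
def neg(a):
    return nand(a, a)

def conj(a, b):
    return nand(nand(a, b), nand(a, b))

def disj(a, b):
    return nand(nand(a, a), nand(b, b))

print(conj(True, False), disj(True, False))  # False True
```

One could keep deriving (xor, implies, ...) indefinitely without ever revisiting the definition.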
> The word 'data' implies representation suitable for
> machine manipulation. Without operations, symbols are
> useless.
Sure. However, symbols are very useful with only a small number of operations. Witness lambda calculus with only two operations.
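To illustrate (my own sketch, reading the "two operations" as abstraction and application): Church numerals encode numbers and their arithmetic using nothing but those two operations.

```python
# Church numerals: a numeral n is a function that applies f n times.
# Everything here is built from lambda abstraction and application only.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Read a numeral back out by applying it to the ordinary successor.
    return n(lambda k: k + 1)(0)

three = add(succ(zero))(succ(succ(zero)))
print(to_int(three))  # 3
```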
> > And suppose we define integers in the usual set
> > theoretic or algebraic ways. Why can we not treat other
> > operations as simply derived or auxiliary?
>
> All operations can be derived or auxiliary. If we removed
> all the derivable operations, we would have nothing left
> from which to derive the rest.
That is a contradiction. "All operations can be derived" implies that for any operation O there exists a set of operations S from which O is derivable. Remove enough operations and S becomes empty, at which point O is no longer derivable. In other words, of course there is always a non-empty set of axioms from which you derive.
> What is important is not which operations we consider
> fundamental and which we consider auxiliary. What is
> important is all of those operations exist and we can
> communicate them to each other.
Not important? Well, that is a large part of mathematics: finding sets of operations (the smaller the better) that we consider fundamental and from which we can derive the others.
> What's more, the set of operations in the algebra is a
> proper subset of the operations for the data type. For
> example, substring is an operation defined for integers
> because it has two integer parameters.
Here is an algebraic definition for an ordinal type:
Ordinal
  There exists a set N
  zero : -> N
  succ : N -> N
  For all x, y elements of N:
    succ(x) != zero
    succ(x) != x
    succ(x) = succ(y) implies x = y
Why aren't the 'zero' and 'succ' operations, together with the above axioms, sufficient to /define/ ordinals? Why do I need to think of "substring" or arbitrarily many other operations as /defining/ ordinals?
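To make the question concrete, here is a sketch of my own (assuming a simple unary encoding) of a type built from exactly those two operations, with addition derived afterwards rather than included in the definition:

```python
# Unary encoding of the ordinal signature above: a value is zero or the
# successor of another value. These two operations ARE the definition.
class Ord:
    def __init__(self, pred=None):
        self.pred = pred  # None encodes zero

zero = Ord()

def succ(x):
    return Ord(x)

# A derived operation, built only from zero and succ by primitive
# recursion. Adding it changes nothing about the definition of the type.
def add(x, y):
    return x if y.pred is None else succ(add(x, y.pred))

def size(x):
    # Count applications of succ, to read a value back as an int.
    n = 0
    while x.pred is not None:
        n += 1
        x = x.pred
    return n

print(size(add(succ(succ(zero)), succ(zero))))  # 3
```

Nothing resembling "substring" is needed to define or use the type; it would simply be one more derivable operation.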