Re: Temporal database - no end date
Date: 22 Jan 2007 18:15:55 -0800
Message-ID: <1169518555.078376.53810_at_l53g2000cwa.googlegroups.com>
Marshall wrote:
> On Jan 22, 9:39 am, "DBMS_Plumber" <paul_geoffrey_br..._at_yahoo.com>
> wrote:
> > Volker Hetzer wrote:
> > > I'm not really sure what's wrong with it. All it shows is that you have
> > > chosen the wrong granularity for the job. Your "pick any" sentence simply
> > > states that for any real world scenario you can choose the wrong granularity
> > > if you insist.
> >
> > As opposed to a model where you don't have to pick a granularity at
> > all, and where there is not even the logical possibility of this
> > situation occurring.
>
> If you've explained how to do that, I've missed it. How can you
> not pick a granularity? The pigeonhole principle would seem to
> indicate that you must. I assume we're still talking about software
> on digital computers here.
By granularity do you mean unit of time? Note that without chronon quantization, and assuming a floating-point representation of times, the choice of time unit has no effect on the level of quantization, because the "relative scale" of the unit is absorbed into the exponent. The quantization depends only on the size of the mantissa.
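To make the point concrete, here is a small sketch (my own illustration, not anything from the thread) using IEEE 754 doubles: the relative gap between adjacent representable times is fixed by the 52-bit mantissa, whatever unit you pick.

```python
import math

# The same instant, expressed in two different units.
one_day_in_seconds = 86400.0   # unit = seconds
one_day_in_days = 1.0          # unit = days

# Relative spacing of adjacent representable values at each magnitude.
rel_gap_seconds = math.ulp(one_day_in_seconds) / one_day_in_seconds
rel_gap_days = math.ulp(one_day_in_days) / one_day_in_days

# Both relative gaps are on the order of 2**-52 (~2.2e-16):
# changing the unit only shifts the exponent, not the precision.
print(rel_gap_seconds, rel_gap_days)
```

Whichever unit is chosen, the relative resolution stays within a factor of two of 2**-52, which is exactly the "size of the mantissa" point above.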
On another note, I think you keep pushing the conversation toward physical concerns, when the subject of the thread is quite clearly the underlying logical model.
What do you think floating-point representations are trying to model? Themselves, as a pattern of bits? How do you explain why 1.0 / 3.0 on a real computer comes out remarkably close to one third, or why repeating the calculation x = (x*x+2)/(2*x) quickly converges to something remarkably close to sqrt(2)?
Received on Tue Jan 23 2007 - 03:15:55 CET
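For what it's worth, the iteration quoted above (Newton's method for sqrt(2)) can be checked directly; this is just a sketch of the calculation as written in the post:

```python
# Newton's iteration x -> (x*x + 2) / (2*x) for the root of x**2 = 2,
# starting from an arbitrary positive guess.
x = 1.0
for _ in range(6):
    x = (x * x + 2) / (2 * x)

print(x)                  # converges to sqrt(2) to double precision
print(abs(x - 2 ** 0.5))  # residual is down at the last-bit level
```

Six iterations from x = 1.0 already agree with sqrt(2) to the full precision of a double, which is the "remarkably close" behavior being described.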