# Re: Computational complexity (was Re: Normalisation)

From: Marshall Spight <marshall.spight_at_gmail.com>
Date: 18 Jul 2005 23:16:53 -0700

Misha Dorman wrote:
> Jon Heggland wrote:
> >>>I guess not. :) I can estimate the memory use of my algorithms, but I
> >>>haven't felt the need for turing tape before now.
> >>
> > Estimating complexity on Turing machines is one thing; doing it for
> > usable programming languages / systems is slightly different. I know
> > theory, but not *all* theory. Sorry, but we can't all be omniscient.
>
> The programming language (and the constant space/time assumptions for
> "int") are an abstraction -- and (like all abstractions) are leaky[1].
>
> Will you need an 8-bit, 16-bit, 32-bit or 64-bit counter (and pointers,
> etc)? They take up different "constant" amounts of space, you know

Well oh my GOD you could have knocked me over with a feather when I read this. And here all along I have been assuming that 8, 16, 32 and 64 bit counters all took the SAME amount of space. I am going to have to go back over a BUNCH of my code and make sure I don't have some hidden bugs as a result of learning this.

I'm going to go with sophistry. Everyone raise your hand who's seen a single string value longer than 2^32 characters. Can we have the house lights up for a moment? [scans audience] Didn't think so.

The *reality* of the situation is that 32 bits is good enough for about 99% of everything, and 64 bits is good enough for the rest.

Let's say you have something going on that you want to count. Let's say it happens quite often: one million times a second. There's not much of anything that happens that fast that you can actually work with on today's hardware, but let's just say. Let's say you use a 64 bit counter to count these events. Now your boss wants to know the space requirements for this counter. Do you tell him:
1) O(1), or
2) O(n), because the counter is going to overflow in HALF A MILLION YEARS. (Brought to you by today's clever use of Google:) http://www.google.com/search?q=2%5E64+microseconds+in+years
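If you want to check that "half a million years" figure without Google, the arithmetic is a one-liner. A minimal sketch (the one-million-events-per-second rate is the post's hypothetical, and it ignores leap years):

```python
# How long until a 64-bit counter overflows at one million
# increments per second? (Rate is the post's hypothetical.)
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # ignoring leap years
events_per_second = 1_000_000

years_to_overflow = 2**64 / events_per_second / SECONDS_PER_YEAR
print(round(years_to_overflow))  # roughly 585,000 years
```

So yes: well over half a million years before the "O(n)" answer becomes anyone's problem.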

I will give a cookie to anyone here who can write a significant application that will run continuously for half a million years. Pretty sure *I* can't, despite having spent the last 25 years doing little else except write applications, and being considered fairly good at it.

Marshall

Received on Tue Jul 19 2005 - 08:16:53 CEST
