Re: hypothesis: always computer bugs!

From: Rick Morris <morriscode_at_telocity.com>
Date: Fri, 18 Jan 2002 14:50:33 -0500
Message-ID: <3c4878f4$1_1_at_nopics.sjc>


Isn't this all a little OT? This is a database theory forum, not a software design forum.

Anyway, I can't agree completely. As another poster asked, "what's the bug in Hello World?"

It's just like any other construct, be it mechanical or logical: the chance of a flaw increases geometrically with the complexity of the construct. But it's still a probability, not an absolute. Saying programs are _never_ error-free is a bit like saying "there's a 50/50 chance of a penny coming up heads or tails when I flip it, so if I flip it 10 times, I will get 5 heads and 5 tails". Since it's only a probability, you might actually flip a penny 10 times and get tails every time. Of course, the more times you flip it, the more likely you are to get an even split, but it's still possible to flip a penny 1000 times and have it come up tails every time.
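For what it's worth, the arithmetic backs this up. Here's a quick Python sketch (purely illustrative, assuming a fair coin and independent flips):

    import random

    # Chance of 1000 tails in a row: (1/2)^1000 -- absurdly small, but not zero.
    print(0.5 ** 1000)   # roughly 9.3e-302

    # Simulate 10 flips a handful of times; an exact 5/5 split only shows up
    # about a quarter of the time (C(10,5)/2^10 is about 24.6%).
    for _ in range(5):
        heads = sum(random.random() < 0.5 for _ in range(10))
        print(heads, "heads out of 10")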

In other words, your probability of being error-free increases as the amount of code goes down, and as the care you put into planning and code review goes up. Those are the odds being capitalized on by the people who develop software for the Space Shuttle, for example.
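To put some very rough numbers on that, here's a toy model in Python. It assumes each of n units of code carries an independent defect probability p (a gross oversimplification, and the figures below are made up for illustration), so the chance the whole program is clean is (1 - p)^n:

    # Toy model: P(program is error-free) = (1 - p)^n
    # where n = units of code, p = independent defect probability per unit.
    def p_error_free(n, p):
        return (1.0 - p) ** n

    print(p_error_free(100, 0.01))      # ~0.37  -- small program, casual review
    print(p_error_free(10000, 0.01))    # ~2e-44 -- big program, casual review
    print(p_error_free(10000, 0.0001))  # ~0.37  -- big program, very careful review

More code drives the odds down fast; more careful planning and review drives p down and pulls the odds back up.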

That being said, we all know that in the real world there has probably never been a complex program that is truly error-free, but with some, the errors are few enough to be negligible. I've also heard that no bug has been discovered in Donald Knuth's TeX typesetting system in years. ( http://www-cs-faculty.stanford.edu/~knuth/ )

Here's a good article to read:
http://www.linuxjournal.com/article.php?sid=5069
See the part about CMM (the Capability Maturity Model).

"ddcone1a" <ddcone_at_worldnet.att.net> wrote in message news:HSQ18.379555$W8.13959038_at_bgtnsc04-news.ops.worldnet.att.net...
> Hypothesis:
>
> Software and computer programs are never
> error free. There are unexpected, minute variations that
> occur over time that can degrade a computer program's expected results.
> Digital logic may appear to predominate, but
> fuzzy logic is the norm.
> Please express your opinion to accept or
> refute this hypothesis.
>
> TIA, DD
>
>
>