Re: Conditional Tables
Date: Tue, 12 Oct 2004 15:41:38 -0400
Message-ID: <hPSdnT0eJrkfrvHcRVn-rQ_at_comcast.com>
Paul <paul_at_test.com> wrote in message
news:<41590f52$0$69731$ed2619ec_at_ptn-nntp-reader01.plus.net>...
> Yes, what I mean is that the external truth is that employee 123 is
> definitely either called John, or is called Bill. It's just we don't
> know which at the moment.
>
> In other words, of the following two statements, one and only one is
> true:
>
> "Employee 123 is called John"
> "Employee 123 is called Bill"
>
> But the fact that we don't know exactly which one doesn't make this a
> useless piece of information.
It gets even more interesting if you have probabilities:

Case 1: with probability 0.50 Employee 123 is called John
        with probability 0.50 Employee 123 is called Bill

Case 2: with probability 0.70 Employee 123 is called John
        with probability 0.30 Employee 123 is called Bill
We actually have more information in case 2 than in case 1. Claude Shannon's pioneering work on information theory lets us put a quantitative measure on the difference in information between the two cases. Basically, it involves computing the entropy of each distribution and taking the difference: in case 1 the entropy is 1 bit (the maximum for two outcomes), while in case 2 it is about 0.88 bits, so case 2 leaves us less uncertain.
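The calculation is easy to sketch; here is a small Python illustration of Shannon entropy, H = -sum(p * log2(p)), applied to the two cases above:

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

h1 = entropy([0.50, 0.50])  # Case 1: two equally likely names
h2 = entropy([0.70, 0.30])  # Case 2: skewed toward John

print(h1)       # 1.0 bit -- maximum uncertainty for two outcomes
print(h2)       # ~0.881 bits
print(h1 - h2)  # ~0.119 bits: case 2 carries that much more information
```

The skewed distribution has lower entropy, which is exactly the sense in which case 2 tells us more than case 1.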
AFAIK information theory still uses Shannon's measure of information.