Re: Pizza Example

From: Laconic2 <laconic2_at_comcast.net>
Date: Fri, 23 Apr 2004 14:34:32 -0400
Message-ID: <EdidnSg1ZfcZ_BTdRVn-vA_at_comcast.com>


> Does entropy apply to information?

Yes, but in a different way than what's being discussed here.

In the 1940s, Claude Shannon developed a theory of information built around the amount of "uncertainty" in a system. The formulas for uncertainty resemble the formulas for thermodynamic entropy so closely that many people have called uncertainty by the name "entropy".
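To make that concrete: for a discrete set of outcomes with probabilities p_i, Shannon's uncertainty is H = -sum(p_i * log2(p_i)). Here is a minimal Python sketch (the function name and the coin examples are mine, not Shannon's):

    import math

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)),
        # skipping zero-probability outcomes (0 * log 0 is taken as 0)
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
    print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bits

A fair coin is maximally uncertain; the more lopsided the odds, the less uncertainty there is to resolve.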

The uncertainty in a system can be lowered by the reception of a message. The difference in uncertainty before and after the message can be used as a measure of the amount of information in the message. With an appropriate choice of units, you can measure the information in "bits". If you know how many bits of data you have, and how many bits of information are conveyed by that data, you have some measure of the efficiency of the encoding.
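Continuing the sketch above (the scenario is invented for illustration): suppose a message narrows four equally likely possibilities down to two.

    # Before the message: four equally likely outcomes -> 2 bits of uncertainty.
    before = entropy([0.25, 0.25, 0.25, 0.25])
    # After the message rules out two of them: 1 bit of uncertainty remains.
    after = entropy([0.5, 0.5])
    # The message itself carried the difference: 1 bit of information.
    print(before - after)   # 1.0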

Of course, uncertainty in a system is subject to context, meaning "uncertainty with regard to what?" If you are uncertain about whether the check you are about to write will bounce, the bank statement probably has more information for you than the weather report.
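That context-dependence has a formal counterpart, conditional entropy. Extending the same hedged sketch (the joint probabilities below are invented; X is "does the check bounce?", Y is "does the statement show a low balance?"):

    def cond_entropy(joint):
        # H(X|Y): remaining uncertainty about X once Y is known.
        # joint maps (x, y) pairs to probabilities summing to 1.
        ys = set(y for (_, y) in joint)
        h = 0.0
        for y in ys:
            p_y = sum(p for (x, yy), p in joint.items() if yy == y)
            h += p_y * entropy([p / p_y for (x, yy), p in joint.items() if yy == y])
        return h

    bank = {("bounce", "low"): 0.4, ("ok", "low"): 0.1,
            ("bounce", "high"): 0.1, ("ok", "high"): 0.4}
    # The statement cuts 1 bit of uncertainty about the check down to
    # about 0.72 bits, i.e. it carries roughly 0.28 bits of information.
    print(entropy([0.5, 0.5]) - cond_entropy(bank))

An irrelevant message, like the weather report, would be statistically independent of the check, leaving the uncertainty unchanged and the information at zero.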

All of this is terribly oversimplified, but Shannon's work was seminal.
