Re: Entropy and Quantity of Information
Date: Sat, 12 Jan 2008 00:38:50 GMT
Message-ID: <u_Thj.2146$421.324@news-server.bigpond.net.au>
David Cressey wrote:
> Instead of hijacking another topic, I'll start this topic.
>
> David BL suggested that a way to quantify information is the number of bits
> needed to encode, allowing for compression.
I think the term you're looking for is Kolmogorov complexity:
http://en.wikipedia.org/wiki/Kolmogorov_complexity
There's been a _lot_ of work done in this area.
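As a rough illustration (my own sketch, not from the thread): Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives an upper bound on it, which is essentially the "number of bits needed to encode, allowing for compression" idea. Something like:

import os
import zlib

def compressed_bits(data: bytes) -> int:
    """Upper bound on the information content of `data`:
    length in bits of its zlib-compressed form. Kolmogorov
    complexity is uncomputable; a compressor only bounds it."""
    return 8 * len(zlib.compress(data, 9))

# A highly regular string compresses far better than a random one,
# so its estimated information content is much lower.
regular = b"ab" * 500          # 1000 bytes, very little information
random_ish = os.urandom(1000)  # 1000 bytes, close to incompressible
print(compressed_bits(regular))     # small
print(compressed_bits(random_ish))  # close to 8000 bits

Different compressors give different bounds, of course; the theoretical quantity is defined relative to a universal machine.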
> I said I preferred entropy as
> the measure of information, and suggested that the two measures might in
> some way be equivalent. Someone else recalled the concept of entropy from
> the study of statistical mechanics and thermodynamics in physics.
It also gets a good workout in machine learning. Many of the classical algorithms (decision trees in their various forms) have information gain, which is defined in terms of entropy, at their heart. (The precise definition is closely related to the number of bits required for representation.) A small sketch of what I mean is below.
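Here's a minimal illustration (my own, not from any particular library) of Shannon entropy and the information gain of a split, the quantity a decision-tree learner maximises when choosing an attribute:

from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H = -sum p_i * log2(p_i) of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy of the whole set minus the weighted entropy after splitting
    it into `groups` (the partition induced by some attribute)."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - remainder

# Splitting on an informative attribute reduces entropy:
labels = ['yes', 'yes', 'no', 'no', 'yes', 'no']
split = [['yes', 'yes', 'yes'], ['no', 'no', 'no']]  # a perfect split
print(entropy(labels))                # 1.0 bit
print(information_gain(labels, split))  # 1.0 bit gained

The gain is exactly the expected reduction in the number of bits needed to describe the class label once you know which side of the split a tuple falls on.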
[... snip interesting perspective ...]
> All of this goes back to the 1960s, and some of it to the 1940s. Is entropy
> still widely used in information science?
Yes.
> Is it relevant to database theory?
Good question.
Cheers,
Joe