15-08-2013, 09:10 PM
Charles Drago Wrote: Albert,
Would I be correct, then, if I rewrote "information content is inversely proportional to probability" thusly:
"Information content is inversely proportional to randomness."
No, actually it is the opposite. The greater the randomness, the greater the entropy, and entropy is precisely Shannon's measure of (average) information content. A perfectly predictable source tells you nothing you did not already know, so it carries no information at all; a maximally random source carries the most.
The problem here is that Shannon's information theory uses the concept of information in a rather counter-intuitive way. Common usage of the term suggests that random = meaningless, but "information" is not the same as "meaning" in the context of this theory. I sometimes fall into the same trap and have to correct myself. Information theory was invented to model systems of communication; it is not a theory of semantics.
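To make the point concrete, here is a minimal Python sketch (my own illustration, not anything from Shannon's paper or the earlier posts) that computes the entropy of a few coin distributions. It shows that the more random the source, the higher the entropy, i.e. the more information per toss, regardless of whether the tosses "mean" anything.

import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally random for two outcomes: 1 bit per toss.
print(entropy([0.5, 0.5]))    # 1.0

# A heavily biased coin is largely predictable, so it carries far less information.
print(entropy([0.99, 0.01]))  # about 0.08 bits

# A certain outcome is not random at all and carries no information.
print(entropy([1.0]))         # 0.0

The biased coin still "means" exactly as much as the fair one to whoever is tossing it; the theory only cares about how unpredictable the outcomes are.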

