15-08-2013, 09:41 PM
Albert Rossi Wrote:
Charles Drago Wrote:
Albert,
Would I be correct, then, if I rewrote "information content is inversely proportional to probability" thusly:
"Information content is inversely proportional to randomness."
No, actually it is the opposite. The greater the randomness, the greater the entropy, which is a measure of information.
The problem here is that Shannon's information theory uses the concept of "information" in a rather counter-intuitive way. Common usage of the term would suggest that random = meaningless, but "information" is not the same thing as "meaning" in the context of this theory. I sometimes fall into the same trap and have to correct myself. Information theory was invented to model systems of communication; it is not a theory of semantics.
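To make that distinction concrete, here is a minimal Python sketch of my own (an illustration, not anything taken from Shannon's papers): the self-information of a symbol is -log2 p(x), so the less probable a symbol, the more information it carries, and entropy is just the average of that over the observed symbol frequencies. A string of random letters scores higher than an ordinary English sentence, even though the sentence is the one that "means" something:

import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    # Bits per symbol: H = -sum(p * log2(p)) over observed symbol frequencies.
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# An ordinary English sentence (skewed letter frequencies, partly predictable)...
structured = "information is not the same thing as meaning " * 10
# ...versus uniformly random lowercase letters and spaces of the same length.
noise = "".join(random.choice(string.ascii_lowercase + " ") for _ in range(len(structured)))

print(shannon_entropy(structured))  # about 3.5 bits/symbol
print(shannon_entropy(noise))       # close to log2(27), about 4.75 bits/symbol

The higher number belongs to the meaningless string, which is exactly the counter-intuitive point.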
Another way of thinking about that difference is the idea of "structure" vs. "information". Semantic content seems to be closely associated with structure, but structure is somewhat different from the notion of information content in Shannon's sense. A really interesting read, if you are inclined to read such things, that deals with this interplay, and with the notion of "complexity" (with a special view to the problem of how complexity evolves from very simple processes), is by the Nobel Prize-winning physicist:
Murray Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex. (see http://www.amazon.com/The-Quark-Jaguar-A...0805072535 -- not a suggestion to buy it there, though).

