essentialsaltes
Stranger in a Strange Land
- Oct 17, 2011
For example, what is Claude Shannon information?
Loosely, Shannon information is how many 'bits' it takes to describe a 'message' or sequence. There is some neat correspondence in the mathematical formulation to the physical idea of entropy.
Another way to look at it: how hard is it to predict the next letter in the sequence? Or, how much 'surprise' is in each new piece of 'information'?
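To make the 'bits' and 'surprise' ideas concrete, here's a minimal sketch in Python. The entropy function is Shannon's standard formula (average bits per symbol), and surprisal is the bits of surprise for a single outcome:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def surprisal(p):
    """Bits of 'surprise' for a single outcome with probability p."""
    return -math.log2(p)

# A fair coin carries 1 bit per flip; a heavily biased coin far less,
# because its flips are easy to predict.
print(entropy([0.5, 0.5]))    # 1.0 bit
print(entropy([0.99, 0.01]))  # ~0.08 bits
```

Notice that the predictable (biased) source has low entropy even though it produces the same number of symbols; the count of symbols and the amount of information are different things.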
If you were making a 'message' of random coin flips, well, if they are really random, there is no way to predict the next letter of the sequence, so the string has maximum information. There is no way to 'condense' the message.
Image files, for instance, are often compressed to make them smaller. An image with a lot of information is hard to compress (at least without losing some of that information).
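You can see both effects with an off-the-shelf compressor. A quick sketch using Python's zlib: a repetitive 'message' squeezes down to almost nothing, while random bytes barely shrink at all.

```python
import os
import zlib

repetitive = b"a" * 10_000          # utterly predictable: almost no information
random_bytes = os.urandom(10_000)   # unpredictable: maximum information

# The predictable message compresses to a few dozen bytes;
# the random one stays roughly its original size.
print(len(zlib.compress(repetitive)))
print(len(zlib.compress(random_bytes)))
```

That's the 'no way to condense' point from the coin-flip example, demonstrated empirically.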
A looser analogy... if we know the message is in English, and we had the string
Bob looked out the window and saw a q
We would be relatively unsurprised by a 'u' showing up next. An 'a' might, on rare occasions, send us into 'qabbalist'. And there's even a slight chance of a qwerty typewriter or a Q*bert console being outside Bob's window. But 99%+ of the time it'll be a 'u'.
Whereas...
Bob looked out the window and saw a m
there are many more viable options for the next letter. There's no way we'd bet the farm on 'u' as we would in the 'q' case. This could be formalized for an entire message, using known letter frequencies in English to determine the overall information content of a string.
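The 'q' versus 'm' contrast can be put in numbers. The probabilities below are made up for illustration (real values would come from measured English letter frequencies), but the shape is right: the surprisal of 'u' after 'q' is tiny, and after 'm' it's substantial.

```python
import math

def surprisal(p):
    """Bits of 'surprise' for an outcome with probability p."""
    return -math.log2(p)

# Hypothetical next-letter distributions, for illustration only:
after_q = {"u": 0.99, "a": 0.005, "w": 0.003, "other": 0.002}
after_m = {"a": 0.25, "o": 0.20, "e": 0.20, "u": 0.15, "i": 0.10, "y": 0.10}

print(surprisal(after_q["u"]))  # ~0.01 bits: barely any surprise
print(surprisal(after_m["u"]))  # ~2.7 bits: a genuine surprise
```

Averaging the surprisal over a whole message, weighted by how often each context occurs, is exactly the formalization mentioned above.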
Creationists typically don't like to talk about Shannon information (despite its utility and strict mathematical definition) because any analogical application to genetics shows that evolution increases Shannon information. Every mutation in a gene pool is a little surprise.
So instead your question tends to get either hand-waving or silence.