You still don't get it. Information is just the uncertainty of a message you haven't yet seen. Let's go over the basics:
1. If you are absolutely sure what an incoming message will say, the information in it is 0. You learn nothing from the message.
2. If you are somewhat uncertain as to what the message will be, the information in it will be a number between 0 and 1.0, not including 0 or 1.0.
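The two cases above can be sketched numerically. A minimal sketch, assuming the message in question is a yes/no message (so its entropy lands between 0 and 1 bit, as described); the probabilities here are made up for illustration:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a yes/no message that says 'yes' with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty at all: the message carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(1.0))  # 0.0 bits -- absolutely sure, you learn nothing
print(binary_entropy(0.9))  # ~0.469 bits -- somewhat uncertain
print(binary_entropy(0.5))  # 1.0 bits -- maximally uncertain
```

Note that the 1.0-bit maximum is actually reached at p = 0.5; for any other uncertain p the value falls strictly between 0 and 1.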
In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes.
H = -Σ p_i log2(p_i), where H is the information, the sum runs over the x alleles in the population, and p_i is the frequency of allele i.
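That formula applies directly to allele frequencies. A minimal sketch; the locus and its frequencies are hypothetical, chosen only to make the arithmetic easy to check:

```python
import math

def allele_entropy(frequencies):
    """Shannon entropy H in bits over the x alleles at a locus.

    frequencies: list of allele frequencies p_i, summing to 1.
    """
    return -sum(p * math.log2(p) for p in frequencies if p > 0)

# Hypothetical locus with x = 3 alleles:
freqs = [0.5, 0.25, 0.25]
print(allele_entropy(freqs))  # 1.5 bits

# One fixed allele (x = 1): no uncertainty, H = 0.
print(allele_entropy([1.0]))
```

A new allele entering the population adds a term to the sum, which is one way to see how mutation raises H.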
Does that mean that random things like scrabble letters tossed on a surface have information? Yep. In fact, they have a great amount of information, more than a proper sentence of the same length would have. You might have to think about that for a bit to realize why. But if you can't see it, then I'll explain. Let me know.
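You can see the scrabble effect by estimating per-letter entropy from letter frequencies. A rough sketch, my illustration only: it compares uniformly random letters against an English sentence, using simple first-order letter counts (real English carries even less entropy per letter once you account for correlations between letters):

```python
import math
import random
from collections import Counter

def estimated_entropy_per_letter(text):
    """Empirical per-letter entropy in bits, from letter frequencies in text."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
tossed = ''.join(random.choice('ABCDEFGHIJKLMNOPQRSTUVWXYZ') for _ in range(10_000))
sentence = 'informationisjusttheuncertaintyofamessageyouhaventyetseen' * 200

print(estimated_entropy_per_letter(tossed))    # close to log2(26) ~= 4.70 bits
print(estimated_entropy_per_letter(sentence))  # noticeably lower
```

The random letters approach the 4.70-bit maximum because every letter is equally likely; English text is skewed toward a few common letters, so each letter is less surprising.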
So a new allele could be the result of a point mutation, and it might do something or it might not. As I said, you have about 100 mutations that were present in neither of your parents. Because proteins are so large, one amino acid switched out usually doesn't do anything measurable. But it's still new information and it increases the information in the population genome.
The origin is in mutations. That was Mendel's discovery.
Right. That was Darwin's great discovery. Random mutation and natural selection. It's been directly observed to work. Would you like to learn about that?
You might think so, but of course, you aren't Jesus. Remember, He created things to work this way. So I'm pretty sure He wouldn't agree with you.