
What Is Entropy? A Measure of Just How Little We Really Know. | Quanta Magazine
Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of...

"Entropy" is a concept in the 2nd law of thermodynamics. Basically, the concept is that
"order" will deteriorate in the direction of "disorder", and "entropy is a measure of how
much disorder there is.
While this concept "works" in some scientific explanations (such as the even
dispersion of gas molecules, in a balloon), when people try to extend the property
to "information", then we are faced with the fact that we may be) ignorant of certain
types of information.
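To make the balloon example concrete, here is a minimal Python sketch (the molecule
count N and the sampled splits are arbitrary choices for illustration). It counts the
microstates W for different ways of dividing N molecules between the two halves of a
container; Boltzmann's entropy, S = k_B ln W, is largest for the even 50/50 split, which
is why the gas drifts toward even dispersion.

from math import comb, log

N = 100  # illustrative number of gas molecules in a container with two halves

# W(k) = number of microstates with k molecules in the left half.
# Boltzmann entropy, in units of k_B: S = ln W.
for k in (0, 10, 25, 50):
    W = comb(N, k)
    print(f"k = {k:3d}   W = {float(W):.3e}   S/k_B = {log(W):.1f}")

# The even 50/50 split has overwhelmingly the most microstates, so the evenly
# dispersed macrostate is the maximum-entropy ("most disordered") one.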
(You will run into this problem of observer ignorance in the writing of the Intelligent
Design authors, who specifically mention that the concept of "specified information" --
ordered things that encode a message that makes sense to a human being -- is VERY
different from the mathematical formulas that describe how an initial ordering of things
can change (probabilistically) over time.)
The Intelligent Design authors (such as Michael Behe) make the point that a cell
creates order inside the cell wall and expels waste products (which are disordered).
In this way, a living cell continually increases the order within the cell wall while
the expelled waste carries the disorder away.
Shannon's theory of information (developed for communication and cryptography) doesn't
really capture the concept of "specified information" that the ID authors use.
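As a rough illustration of that gap (the example strings are just placeholders), Shannon
entropy only measures the statistical spread of symbols, so a meaningful sentence and a
scrambled copy of the same characters score identically; the "message" part is invisible
to the formula.

from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical character distribution."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

meaningful = "to be or not to be that is the question"
scrambled = "".join(sorted(meaningful))  # same characters, no message

# Identical character frequencies, therefore identical Shannon entropy, even
# though only one of the two strings carries "specified" meaning for a reader.
print(shannon_entropy(meaningful))
print(shannon_entropy(scrambled))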
However, Jaynes argued that "entropy" is in the eye of the beholder: it depends on
which distinctions the observer is able to make. A system that looks like one uniform,
disordered thing to a coarse observer may, to an observer who can tell its parts apart,
turn out to contain encoded order.
Gibbs's mixing paradox makes the same point: the entropy you assign to mixing two gases
depends on whether you can tell the gases apart, so "entropy" is a characteristic of the
observer's description, and not a fixed characteristic of some microstate.
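A toy sketch of that observer-dependence (the isotope labels and the 50/50 sample are
invented for illustration): the same collection of molecules gets zero entropy assigned
to its label distribution by an observer who cannot resolve the isotopes, and one bit by
an observer who can. Which number is "the" entropy depends on the distinctions the
observer can make.

from collections import Counter
from math import log2

def entropy_bits(labels) -> float:
    """Shannon entropy, in bits, of the empirical distribution over labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = ["Ne-20", "Ne-22"] * 500    # a 50/50 mix of two neon isotopes

coarse_view = ["Ne"] * len(sample)   # observer A cannot resolve the isotopes
fine_view = sample                   # observer B can

print(entropy_bits(coarse_view))  # 0.0 bits -- to A the sample is "one thing"
print(entropy_bits(fine_view))    # 1.0 bit  -- B assigns a different entropy to the same molecules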
There is a connection between how much information a researcher has about
a system, and how much useful work the researcher can get out of it.
Research has been done on "information engines": experiments that use information
gathered about a system to run physical machines (and do work). Though still
rudimentary, these engines demonstrate that information CAN be used to run physical
machines and do useful work.
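For scale, a standard back-of-the-envelope bound (the Szilard-engine / Landauer limit,
with room temperature assumed): one bit of information about a system lets you extract
at most k_B * T * ln(2) of work from its thermal surroundings.

from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# At most k_B * T * ln(2) of work can be extracted per bit of information.
max_work_per_bit = k_B * T * log(2)
print(f"{max_work_per_bit:.2e} J per bit")  # about 2.9e-21 J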
In the case of subatomic particles, there is a "menagerie of entropies to choose from".
The new work on "entropy" is aimed at finding "useful order" that can be exploited to do
useful work.