shernren
As I described at length in the OP, their understanding of entropy, disorder and the 2nd Law would force them to conclude that crystals, eddies in water flow and boulders on top of rock columns are all forbidden by the Law. But such things (and millions of similar things) do exist, therefore their understanding must be flawed.
Ahh. I went back and refreshed myself with the OP and I see what you're saying. Thing is:
1. Crystals are actually low-entropy objects, relative to their surroundings. If you recall the discussion with metherion and me, entropy is roughly a measure of how much information it would still take to fully describe the system, given what we already know about it. Now if I say "Here's a crystal of salt", I'm actually giving you a lot of information (namely, that the constituent ions are arranged in a regular lattice), so the amount of "unknown information" decreases, and the entropy decreases with it.
2. A boulder on top of a rock column is also actually a low-entropy arrangement. I think I floundered on this when I last interacted with the thread, but having spent a few months on partition functions I'm much better at it now.
Imagine a plain with a sharp rock column; then imagine plotting the probability density of "where a massive rock can be" on this plain. The rock's potential energy depends on its height, so every possible position has an energy associated with it, and that energy feeds into the probability density via Boltzmann factors.
Now if I'm simply looking at the distribution of "where the rock can be", I'll find that it could be anywhere on the plain, with only a very small probability of being up on the column. But if I limit myself to "where on the rock column can the rock be", I have suddenly specified the probability distribution very severely. And that's why a video of a rock jumping up onto a rock column has surely been played backwards! (There's a little numerical sketch of this in Python just below this list.)
Thing is, the Second Law specifies that entropy must increase (given suitable constraints); but it never says how fast entropy must increase. So the rock will eventually fall, but it might take a million years to do so.
3. Eddies, convection cells, and the like are actually non-equilibrium states. (Note: even stationary states can be non-equilibrium states!) For such states the very concept of entropy becomes rather useless.
What happens is that over a non-equilibrium process the system behaves chaotically (in the formal sense of chaos theory), and its phase-space portrait becomes fractalized. Now the (Gibbs) entropy is, up to a factor of Boltzmann's constant, the integral of -p ln(p) over phase space, where p is the probability density. Imagine trying to do that integral over the Mandelbrot set, or over a block of Swiss cheese: no matter how you draw up your boundaries, you hit regions of phase space where the density (after some time) becomes zero - but the log of zero can't be defined sensibly!
So in fact, two of your examples are actually low-entropy systems, and the third doesn't quite have an entropy measure.
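To make points 1 and 2 a little more concrete, here's the toy numerical sketch I promised, in Python. It's entirely my own illustration with made-up numbers (a 100x100 grid, a "dust-grain" rock so the Boltzmann exponents stay manageable, an arbitrary column height), not anything from the thread itself: it builds the Boltzmann distribution for "where the rock can be" and compares the entropy of that distribution with the entropy you get once you insist the rock is somewhere on top of the column.

[CODE]
import numpy as np

# Toy landscape: a 100 x 100 grid of positions. Most of it is flat plain
# (height 0); a small 3 x 3 patch is the top of a tall column (height 10).
heights = np.zeros((100, 100))
heights[48:51, 48:51] = 10.0

# Boltzmann factor: weight ~ exp(-m*g*h / (kB*T)). beta_mg lumps
# m*g/(kB*T) into one arbitrary number so the exponents stay manageable
# -- think "dust grain", not an actual boulder.
beta_mg = 0.5
weights = np.exp(-beta_mg * heights)

def entropy(w):
    """Gibbs entropy S = -sum(p ln p) of the normalized weights, in units of kB."""
    p = w / w.sum()
    p = p[p > 0]                       # convention: p ln p -> 0 as p -> 0
    return -np.sum(p * np.log(p))

# Unconstrained: the rock could be anywhere on the plain.
S_anywhere = entropy(weights)

# Constrained: we are told the rock sits somewhere on top of the column.
S_on_column = entropy(weights[heights == 10.0])

print(f"S(anywhere on the plain) = {S_anywhere:.2f} kB")   # ~9.2 kB
print(f"S(on top of the column)  = {S_on_column:.2f} kB")  # ~2.2 kB
[/CODE]

Saying "the rock is on top of the column" pins the distribution down to a handful of sites, so the entropy drops sharply - which is exactly the sense in which the boulder-on-a-column (or the salt crystal) is a low-entropy arrangement.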
In classical thermodynamics (I'm speaking of nonreacting systems here) one of the assumptions is that the number of particles and the temperature differential are both large enough that that's not a factor. Wouldn't the analogue for reacting systems be that if the difference between the critical energy and the available thermal energy is greater than Boltzmann's constant, the reaction is disfavored?
I don't consider it snarky to assume that the other person is intelligent enough to follow a line of reasoning.
The snarkiness lies in the fact that the Fluctuation Theorems are not yet a part of common scientific parlance (among non-scientists, at least) in the way that the SLoT and evolution are. It's therefore somewhat of an argument from privilege.
I'm not sure what you mean by the "critical energy". The rule of thumb that I know of is that if the activation energy is less than about 10 k_B T, a substantial number of particles will be able to cross the activation barrier. (Whether or not the reaction then gets any further depends, of course, on the energetics of the whole reaction.)
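Just to put a rough number on that rule of thumb, here's a back-of-the-envelope Python sketch of my own - the 10 k_B T figure is the rule of thumb above; the room-temperature value and the unit conversion are my additions.

[CODE]
import math

kB = 1.380649e-23            # Boltzmann constant, J/K
NA = 6.02214076e23           # Avogadro's number, /mol
T = 298.0                    # room temperature, K
kBT = kB * T                 # ~4.1e-21 J per particle, ~2.5 kJ/mol

# Equilibrium (Boltzmann-factor) estimate of the fraction of particles
# energetic enough to get over an activation barrier Ea = n * kBT.
for n in (1, 5, 10, 20, 40):
    Ea = n * kBT
    fraction = math.exp(-Ea / kBT)     # Boltzmann factor, = e^-n here
    print(f"Ea = {n:2d} kBT (~{Ea * NA / 1000:5.1f} kJ/mol): "
          f"fraction above the barrier ~ {fraction:.1e}")
[/CODE]

At 10 k_B T the fraction is about 5 in 100,000 - tiny per particle, but with something like 10^20 molecules colliding billions of times a second, that still means an awful lot of barrier crossings. Push much past 20-40 k_B T and the reaction is effectively frozen out.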
Thing is, though, all of that assumes an equilibrium distribution of the reactants. But in small systems equilibrium is a precious commodity! Furthermore, living organisms are flagrantly not at thermodynamic equilibrium with their surroundings. This is (in my interpretation) principally because life consists of micromachinery which, because it is so small and so far from equilibrium, has a significant chance of transiently decreasing local entropy.
For a good visual example of this sort of fluctuation, see videos of micromachinery such as this: YouTube - Microscopic cog. Occasionally the ratchet turns backwards, signifying a temporary decrease of entropy as the system reverts to a probabilistically unlikely state.
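And if anyone wants to play with this numerically, here's a crude toy in Python - entirely my own construction, not a model of the actual cog in the video, just the flavour of the fluctuation theorems. A one-dimensional ratchet steps forward ("downhill", entropy-producing) more often than backward ("uphill", entropy-consuming), with the backward probability set by a Boltzmann factor; the energy scale and window lengths are arbitrary choices of mine.

[CODE]
import math, random

random.seed(1)

# Toy 1-D ratchet: each tick it steps +1 (forward, entropy-producing) or
# -1 (backward, entropy-consuming). A backward step costs an energy dE,
# so its relative probability carries a Boltzmann factor exp(-dE/kBT).
dE_over_kBT = 1.0
p_back = math.exp(-dE_over_kBT) / (1.0 + math.exp(-dE_over_kBT))

def net_steps(n_ticks):
    """Net displacement of the ratchet after n_ticks random steps."""
    return sum(-1 if random.random() < p_back else +1 for _ in range(n_ticks))

n_trials = 10000
for window in (5, 50, 500):
    backwards = sum(1 for _ in range(n_trials) if net_steps(window) < 0)
    print(f"window of {window:3d} ticks: net-backwards in "
          f"{100.0 * backwards / n_trials:.2f}% of trials")
[/CODE]

With these numbers the 5-tick windows run net-backwards roughly an eighth of the time (that's the cog briefly turning the "wrong" way), while the 500-tick windows essentially never do - the Second Law reasserting itself on average, which is the qualitative content of the fluctuation theorems mentioned above.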