Life's Chaotic: Let's Learn About Entropy
There are
some universal certainties. Death. Taxes. The second law of
thermodynamics. I'm sure you'll find plenty of articles online and off about
the first two things on that list, but this one is about the third. This law,
first identified by Sadi Carnot in 1824 in his musings on how the useful but
mysterious steam engines actually worked, stands tall and proud today, elevated
into an immutable fact of existence. No matter how hard you try, you simply
can't escape the iron grip of its cold, relentless conclusion: In isolated
systems, the entropy cannot decrease.
Count the molecules
If I gave you a box of air and asked you to measure its
properties, your first instinct would be to bust out a ruler and thermometer,
recording important scientific-sounding numbers like volume, temperature or
pressure. There's a good reason that's your first instinct. You know that
something like "air" is really a collection of buzzing and jumbling
microscopic molecules. You might imagine — briefly — that you could record all
the positions and velocities of every single molecule, but you would quickly
abandon such notions as overly cumbersome and also silly.
After all,
numbers like the temperature, pressure and volume give you all the
information you actually care about. Those facts will tell you
everything you need to know about how the box of air will behave if you open
it, squeeze it or blow it up. It doesn't really matter how all those tiny
particles arrange themselves — that's their business, not yours.
And that's exactly the point. There are so many different
ways of arranging the air molecules in your box that lead to the
exact same pressure, temperature and volume. Swap one particle with another —
would you even notice? Turn a few of them around — did you even catch it?
No, you didn't. The pressure, temperature and volume could all remain
unchanged.
And here's where entropy comes in. The concept of entropy
captures the number of different ways you can rearrange the stuff you can't see
(the tiny air particles) to still get the exact same measurements you can see
(like the pressure).
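To make that counting concrete, here's a minimal sketch in Python (a toy model of my own, not from the article): a handful of particles that can each sit in the left or right half of a box. The "measurement you can see" is just how many particles are on each side; the arrangements you can't see are which particular particles they are.

```python
# Toy model (an illustrative assumption, not the article's example):
# N particles, each sitting in either the left or right half of a box.
# The visible "measurement" is how many are on the left; the invisible
# detail is *which* particles those are.
from math import comb

N = 10  # number of particles in the toy box

for n_left in range(N + 1):
    arrangements = comb(N, n_left)  # ways to choose which particles sit on the left
    print(f"{n_left:2d} on the left -> {arrangements:3d} hidden arrangements")
```

For just 10 particles, the even split (5 and 5) already has 252 hidden arrangements, while "all on one side" has exactly 1. Entropy is, roughly speaking, a tally of those hidden arrangements.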
Changing times
That's nice, but why should that number never, ever
decrease? To explore, let's clean your room.
Imagine finally getting it together, clearing your weekend schedule,
waking up early, grabbing that cup of joe, and doing the thing you've been
putting off since the holidays: cleaning your room. Top to bottom, wall to
wall. Clean and organized. A place for everything and everything in its place.
Perfection.
How long do you think that Platonic ideal of a room will
last? Not long, you realize, as the Sisyphean nature of your endeavors — and
perhaps your entire existence — clicks into sharp focus.
But why shouldn't your room stay neat and tidy for years to
come? Because if just one thing — one single thing — changes, it's no longer
clean. A dirty sock on the bed? Messy. A pillowcase ruffled? Messy. A box of
snack crackers on the nightstand? I won't judge, but also messy.
In this example, there are only select arrangements that
lead to the measurement of "clean," but there are millions upon
millions of possible arrangements that lead to the measurement of
"messy." If a tornado were to strike your newly cleaned room, what
are the chances of it remaining in the clean state? It's not zero: The tornado,
by pure random chance, could pick up every single thing in your room and return
it to its original spot. But that one tiny chance is the ultimate Powerball
ticket — and let's face it, you're not going to be the winner. You are much,
much more likely to find a messy room after the tornado strike, simply because
there are so many more ways for a room to be messy.
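A quick back-of-the-envelope check on those Powerball odds (my own illustration, with made-up item counts): if the tornado drops each of n items into a random spot, only one of the n! possible shuffles puts everything back exactly where it was.

```python
# Rough odds that a random shuffle leaves the room "clean" (assumed setup:
# each of n items lands in a random distinct spot, so there are n! equally
# likely outcomes and only one of them is the original, tidy arrangement).
from math import factorial

for n_items in (5, 10, 20):
    print(f"{n_items:2d} items: 1 clean outcome in {factorial(n_items):,} shuffles")
```

Even a modest 20 items gives odds of about 1 in 2.4 quintillion, which is why the post-tornado room is, for all practical purposes, always messy.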
Entropy is a harsh mistress
Similarly, there's absolutely nothing stopping the air
molecules in your room from collectively deciding to head in the same direction
and crowd into the corner, leaving you to asphyxiate in the vacuum. Seriously,
no law of particle or molecular interactions prevents that. But the motion of
air molecules is governed by countless random collisions and movements — a
never-ending molecular tornado of activity. These innumerable motions will
essentially always leave the air in a disorganized, messy state: evenly spread
all throughout the room. And all because there are so, so many more ways for the
air to be spread out than crammed into a corner.
Ultimately, systems never move from disorder to order
(unless there's a way to add energy to them, but that's a story for another
article) because of the sheer overwhelming statistics preventing it. The likelihood
of a disorderly state versus an orderly state isn't something like 10 to 1 or
3,720 to 1, it's more like trillions upon trillions upon trillions (and throw a
few more trillions in there just for good measure) to 1.
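To put a hypothetical number on that, suppose each air molecule independently has a 50/50 chance of being in the left or right half of the room. The chance of catching all of them on one side is (1/2)^N, which collapses absurdly fast as N grows:

```python
# Odds that every one of N molecules happens to sit in the same half of the
# room, assuming each is equally likely to be on either side: (1/2)^N.
from math import log10

for n_molecules in (100, 1_000, 1_000_000):
    exponent = n_molecules * log10(2)  # (1/2)^N written as 10^(-exponent)
    print(f"{n_molecules:>9,} molecules: about 1 chance in 10^{exponent:.0f}")
```

Even 100 molecules give odds of roughly 1 in 10^30, and a real room holds on the order of 10^27 molecules, so the true exponent dwarfs any "trillions upon trillions" you care to write down.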
As in the case of our once-tidy room, there are very few
ways to make a clean room, and an overwhelming number of ways to make a messy
one. And different arrangements of "messy" (like a dirty sock left on
the bed versus on the dresser) will lead to the same measurement of temperature
or pressure. Entropy captures the number of different ways you can rearrange
your messy room to get those same values. And systems, left to their own
chaotic devices, will always seek higher entropy, simply because there are a
lot more ways to make a mess than make a clean: the second law of
thermodynamics.
By Paul Sutter, Astrophysicist, 2019
Reproduced from Space.com
https://www.space.com/43138-life-is-chaotic-entropy.html
