Suppose all of scientific knowledge had suddenly evaporated and you were tasked with passing along a single sentence to the next generation of scientists. What would you say? This scenario was famously considered by Nobel laureate and physicist Richard Feynman. His answer was the atomic hypothesis: everything is made of atoms. The ice you skate on, the water you drink, the steam that powers your home, all atoms.
But wait…you’re telling me that all of these extremely different materials are made of the exact same molecule, H2O? Why yes, of course; we all learn about the states of matter and phase transitions in elementary school, so this is not very shocking. And yet, one never speaks of melting a single H2O molecule or of a one-atom gas. Clearly, we need large collections of atoms to give rise to new, emergent phenomena such as the states of matter. A collection of atoms is more than the sum of its parts!
As we all know, large collections of atoms can undergo phase transitions. This feels familiar and may seem almost trivial, but describing the physics of phase transitions has required major scientific developments. One of these is entropy, introduced by Rudolf Clausius in 1865 and later given its statistical meaning by Ludwig Boltzmann.
The Second Law
Physicists like to say that there is no law more sacred than the second law of thermodynamics: the entropy of an isolated system never decreases. But what does it mean? We’ll work through this slowly, but we have to define some terms first.
If your friend asks you for the time, you’re likely to round to the nearest quarter or five-minute interval instead of replying with the accuracy of an Olympic stopwatch. For instance, your mind may automatically assign the times 12:14, 12:15, and 12:16 to the phrase “quarter past twelve”. This is because, in most cases, a general idea of the current time is simply more useful than knowledge of the exact microsecond. This process of blurring out superfluous details is called coarse-graining. Physicists call the coarse-grained time bins you decide to use (e.g. 12:15, 12:30) macrostates and the exact times that go into each bin microstates (e.g. 12:01:24 and 12:02:10 both go into the 12 o’clock bin).

Let’s take these concepts a bit further: suppose you have a box of 10,000 coins that you shake vigorously before letting out one coin at a time and writing down the sequence of heads and tails. In this case, each exact sequence of H’s and T’s constitutes a microstate, while some coarse-grained quantity, such as the total number of heads, defines a macrostate. Entropy is simply a measure of how many full-detail microstates belong to a coarse-grained macrostate. More precisely, the entropy of a macrostate is S = k ln Ω, where Ω is the number of microstates belonging to that macrostate and k is Boltzmann’s constant.
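To make this concrete, here is a minimal Python sketch of the coin-box example (my own illustration, not from the original text, with Boltzmann’s constant set to 1): it counts the microstates Ω in a few “number of heads” macrostates and computes S = ln Ω.

```python
import math

N = 10_000  # total number of coins in the box

def entropy(k: int) -> float:
    """Entropy of the macrostate "exactly k heads", with Boltzmann's k = 1."""
    omega = math.comb(N, k)  # number of H/T sequences with exactly k heads
    return math.log(omega)   # S = ln(Omega); math.log handles huge ints

for heads in (10_000, 9_000, 5_000):
    print(f"{heads:>6} heads: S = {entropy(heads):8.1f}")

# Approximate output:
#  10000 heads: S =      0.0   <- a single microstate: all heads
#   9000 heads: S =   3246.5
#   5000 heads: S =   6926.6   <- the maximum-entropy macrostate
```

Note the steep growth: the 5,000-head macrostate contains e^6926.6, or roughly 10^3008, distinct microstates.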
Even though each sequence of heads and tails is equally likely - each has a probability of 1/2^10,000 of popping up - there is only one sequence that produces exactly 10,000 heads. On the other hand, there are more ways to get exactly 5,000 heads than there are atoms in the entire observable universe! It’s not very surprising, then, that after shaking the box our system favors the most likely macrostates: those with the greatest number of microstates (i.e. the most entropy). Much like our coins, the atoms in a room are constantly being jiggled around by thermal fluctuations, and there are many more ways to arrange the particles so that they uniformly fill the entire volume of the room than ways to concentrate them in some corner. Indeed, smoke in a room will naturally diffuse and take up as much volume as possible. Thus, the second law of thermodynamics is simply the observation that physical systems tend to evolve towards their most likely state! This is also why entropy is sometimes described as a measure of disorder: maximum-entropy states allow for the maximum number of rearrangements of their microscopic constituents.
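You can watch this typicality emerge in a quick simulation (again my own sketch, assuming fair coins): shake the box of 10,000 coins a thousand times and record the number of heads in each shake.

```python
import random

N = 10_000      # coins per shake
SHAKES = 1_000  # number of times we shake the box

# Each shake is a uniformly random sequence of N fair coins, encoded as the
# bits of a random N-bit integer; counting the 1-bits counts the heads.
counts = [bin(random.getrandbits(N)).count("1") for _ in range(SHAKES)]

print(f"min heads: {min(counts)}, max heads: {max(counts)}")
# Typical output: min heads ~ 4830, max heads ~ 5170. Every single shake
# lands within a few percent of the 5,000-head macrostate, even though
# each individual sequence (all-heads included) has probability 1/2^10,000.
```

No shake ever comes close to 10,000 heads; the system overwhelmingly sits in its highest-entropy macrostates.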
Phase Transitions
If entropy always tends to increase, then why do we have states of matter at all? Shouldn’t a block of ice spontaneously release its molecules into the surrounding air to increase the entropy of the universe, regardless of the temperature?
The short answer is that we have forgotten to take the conservation of energy into account. If the room is isolated, then the ice + room system is closed, and its total energy must be conserved. It takes energy to expel molecules from the surface of the ice, so the air in the room must give up some of its own energy to vaporize the ice cube, which decreases the entropy of the air. Thus, for the total entropy of the ice + environment to increase, the increase in entropy from the ice turning into vapor must outweigh the decrease in entropy of the surroundings as they give up heat to the cube. This competition between the energy and entropy of the system (the ice) and the environment (the room) is key to how physicists study phase transitions. To quantify this competition, physicists use a quantity known as the free energy.
The free energy is F = E - TS, where E is the internal energy of the system, T is the temperature of the surroundings, and S is the system’s entropy. A key advantage of the free energy is that it encapsulates all of the details about the surroundings we do not care about into a single quantity: the temperature. We also know that the system + surroundings must maximize their collective entropy, and it turns out that this is accomplished when the system’s free energy is minimized. When the temperature is low, the -TS term is negligible, so the free energy is minimized by minimizing the internal energy of the system, favoring ordered states. At high temperatures, on the other hand, the entropy term dominates, and the system prefers disordered states such as a gas.
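As a toy illustration of this competition (my own sketch with made-up numbers, not a real material), give an ordered “solid” phase low energy and low entropy, give a disordered “gas” phase high energy and high entropy, and ask which phase has the lower free energy F = E - TS as the temperature is swept:

```python
# Toy free-energy comparison: F = E - T*S for two candidate phases.
# The E and S values are arbitrary illustrative numbers, not real data.
phases = {
    "solid (ordered)":  {"E": 0.0,  "S": 1.0},  # low energy, low entropy
    "gas (disordered)": {"E": 10.0, "S": 6.0},  # high energy, high entropy
}

def free_energy(phase: dict, T: float) -> float:
    return phase["E"] - T * phase["S"]

for T in (0.5, 1.9, 2.1, 3.0, 5.0):
    winner = min(phases, key=lambda name: free_energy(phases[name], T))
    print(f"T = {T}: preferred phase = {winner}")

# Output: the solid wins below T* = (10.0 - 0.0) / (6.0 - 1.0) = 2.0, where
# the energy term dominates; the gas wins above T*, where the -T*S entropy
# term takes over.
```

The sharp change of winner at T* is the bare-bones version of a phase transition: neither phase changed, only which one minimizes the free energy at the given temperature.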
Conclusion
I hope you can now appreciate that the physics of phase transitions is a perhaps unexpectedly deep and exciting topic. Its ideas have found profound applications in quantum physics, in materials engineering, and even in the study of the origins of our own universe. Next time you toss a coin or melt an ice cube, just remember the dance between energy and entropy that makes the world an exciting place to be.