The entropy equation: decoding phase transitions in nature

By Sergio

Suppose all scientific knowledge were suddenly destroyed and you were tasked with passing along a single sentence to the next generation of scientists. What would you say? This scenario was posed by Nobel laureate and physicist Richard Feynman. His answer was the atomic hypothesis: everything is made of atoms. The ice you skate on, the water you drink, the steam that powers your home — er, that heats your home — all atoms.

But wait… you’re telling me that all of these wildly different materials are made of exactly the same molecule, H2O? Well, yes — we all learn about the states of matter and phase transitions in elementary school, so this is not very shocking. And yet one never speaks of melting a single H2O molecule, or of a gas made of one atom. Clearly, we need large collections of molecules to give rise to new, emergent phenomena such as the states of matter. A collection of atoms is more than the sum of its parts!

As we all know, large collections of atoms can undergo phase transitions. This feels familiar and may seem almost trivial, but describing the physics of phase transitions has required major scientific developments. One of these is entropy, a concept introduced by Rudolf Clausius in 1865 and later given its statistical interpretation by Ludwig Boltzmann.

The Second Law 

Physicists like to say that there is no law more sacred than the second law of thermodynamics: the entropy of a closed system always increases. But what does it mean? We’ll work through this slowly, but we have to define some terms first. 

If your friend asks you for the time, you’re likely to round to the nearest quarter or five-minute interval instead of replying with the accuracy of an Olympic stopwatch. For instance, your mind may automatically assign the times 12:14, 12:15, and 12:16 to the phrase “quarter past twelve”. This is because, in most cases, a general idea of the current time is simply more useful than knowledge of the exact microsecond. This process of blurring out superfluous details is called coarse-graining. Physicists call the coarse-grained time bins you decide to use (e.g. 12:15, 12:30) macrostates, and the exact times that go into each bin microstates (e.g. 12:01:24 and 12:02:10 both go into the 12 o’clock bin).

Let’s take these concepts a bit further: suppose you have a box of 10,000 coins that you shake vigorously before letting out one coin at a time and writing down the sequence of heads or tails. In this case, each exact sequence of H’s and T’s constitutes a microstate, while some coarse-grained quantity, such as the total number of heads, forms a macrostate. Entropy is simply a measure of how many full-detail microstates belong to a coarse-grained macrostate. More precisely,

S = k_B ln(Ω),

where Ω is the number of microstates belonging to the macrostate and k_B is Boltzmann’s constant.

Even though each sequence of heads and tails is equally likely — each has a probability of 1/2^10,000 of popping up — there is only one sequence that produces exactly 10,000 heads. On the other hand, there are more ways to get exactly 5,000 heads than there are atoms in the entire observable universe! It’s not very surprising, then, that after shaking the box, our system favors the most likely macrostates: those with the largest number of microstates (i.e. the most entropy). Much like our coins, the atoms in a room are constantly being jiggled around by thermal fluctuations, and there are many more ways to arrange the particles so that they uniformly fill the entire volume of the room than ways to concentrate them in some corner. Indeed, smoke in a room will naturally diffuse and take up as much volume as possible. Thus, the second law of thermodynamics is simply the observation that physical systems tend to evolve towards their most likely state! This is also why entropy is sometimes considered a measure of disorder: maximum-entropy states allow for the maximum number of rearrangements of their microscopic constituents.
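The counting argument above is easy to check directly. Here is a short Python sketch (the 10,000-coin setup is from the passage; the entropy is computed in units where Boltzmann’s constant is 1):

```python
from math import comb, log

N = 10_000  # total number of coins

# The number of microstates (exact H/T sequences) in the macrostate
# "k heads total" is the binomial coefficient C(N, k).
omega_all_heads = comb(N, N)       # exactly 10,000 heads: a single sequence
omega_half_heads = comb(N, N // 2) # exactly 5,000 heads

print(omega_all_heads)             # 1
print(len(str(omega_half_heads)))  # thousands of digits -- far more than
                                   # the ~10^80 atoms in the observable universe

# Entropy of each macrostate: S = ln(Omega), with k_B = 1
print(log(omega_all_heads))        # 0.0 -- perfectly ordered, zero entropy
print(log(omega_half_heads))       # large -- the most likely macrostate
```

The all-heads macrostate has exactly one microstate and hence zero entropy, while the half-heads macrostate utterly dominates the count, which is why a vigorously shaken box essentially always lands near 50/50.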


Phase Transitions 


If entropy always increases, then why do we have states of matter at all? Shouldn’t a block of ice spontaneously release its molecules into the surrounding air to increase the entropy of the universe, regardless of the temperature?

The short answer is that we have forgotten to take the conservation of energy into account. If the room is isolated, then the ice + room system is closed, and its total energy must be conserved. It takes energy to expel molecules from the surface of the ice, so the air in the room must give up some of its own energy to vaporize the ice cube, decreasing the entropy of the air in the process. Thus, for the total entropy of the ice + environment to increase, the increase in entropy of the ice turning into vapor must outweigh the decrease in entropy of the surroundings as they give up heat to the cube. This competition between the energy and entropy of the system (ice) and the environment (room) is key to how physicists study phase transitions. To quantify it, physicists study a quantity known as the free energy.

F = E − TS,

where E is the internal energy of the system, T is the temperature of the surroundings, and S is the entropy of the system.

A key advantage of the free energy is that it encapsulates all the details of the surroundings we do not care about into a single quantity: the temperature. We also know that the system + surroundings must maximize their collective entropy, and it turns out that this is accomplished precisely when the free energy of the system alone is minimized. When the temperature is low, the free energy is minimized by minimizing the internal energy of the system, favoring ordered states such as a solid. At high temperatures, the entropy term dominates, and the system prefers disordered states such as a gas.
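This competition can be sketched numerically. In the toy model below the numbers are entirely made up for illustration, not real material parameters: an ordered (solid-like) phase with low energy and low entropy competes against a disordered (gas-like) phase with higher energy but higher entropy, and whichever phase has the lower free energy wins at each temperature:

```python
# Toy free-energy comparison, F = E - T*S, with illustrative (made-up) numbers.
E_ordered, S_ordered = 0.0, 0.1        # ordered phase: low energy, low entropy
E_disordered, S_disordered = 1.0, 1.1  # disordered phase: high energy, high entropy

def free_energy(E, S, T):
    """Free energy of a phase with internal energy E and entropy S at temperature T."""
    return E - T * S

for T in (0.2, 1.0, 2.0):
    F_ord = free_energy(E_ordered, S_ordered, T)
    F_dis = free_energy(E_disordered, S_disordered, T)
    winner = "ordered" if F_ord < F_dis else "disordered"
    print(f"T={T}: F_ordered={F_ord:.2f}, F_disordered={F_dis:.2f} -> {winner}")
```

At low T the ordered phase wins, at high T the disordered phase wins, and the two free energies cross at T = ΔE/ΔS (here equal to 1 in these made-up units) — that crossing is the toy model’s transition temperature.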

Conclusion 

I hope you can now appreciate that the physics of phase transitions is a perhaps unexpectedly deep and exciting topic. Its ideas have had profound applications in quantum physics, materials engineering, and even the origins of our own universe. Next time you toss a coin or melt an ice cube, just remember the dance between energy and entropy that makes the world an exciting place to be.  

Sergio holds a BS in Physics and Math from Emory University, where he graduated magna cum laude. He is currently a PhD student in Physics at MIT.
