Entropy

MHK
13 min read · Jul 7, 2023

Energy shapes our universe, yet it often goes unnoticed in our daily lives. We know that the Earth receives energy from the sun, but have you ever paused to ponder the precise nature of this celestial gift? In this blog post, we embark on a journey into the enigmatic world of energy: we will explore its implications and its role in our lives, and uncover the story behind one of the most influential figures in energy research.

The Sun’s Radiance: A Vital Contribution to Earth

As the Earth basks in the sun’s radiant light, we revel in its warmth and reap the benefits of vitamin D from its ultraviolet rays. Undoubtedly, energy from the sun plays a vital role in sustaining our planet. However, have you ever wondered about the intricate balance between the energy the Earth receives from the sun and the energy it radiates back into space? Prepare to be surprised by the answers.

The Ebb and Flow of Energy: Transformation and Transfer

The Earth indeed receives a significant amount of energy from the sun every day. But how much does it radiate back into space? Opinions vary: some suggest less energy, others a small fraction, and a few venture a specific percentage. The surprising answer is that, averaged over time, the Earth radiates back essentially all of the energy it receives — otherwise the planet would steadily heat up. The truth lies in the very essence of energy itself: it cannot simply be used up or disappear. It undergoes transformation and transfer, and what changes along the way is not its amount but its quality.

Sadi Carnot: Unveiling the Secrets of Energy Transfer

To truly grasp the concept of energy, we must journey back to the winter of 1813 in France — a time of invasion and turmoil. It was during this period that a young man named Sadi Carnot, son of one of Napoleon’s generals, discovered a passion for physics. This newfound passion led him to explore the most efficient ways to transfer energy in mechanical systems. Carnot’s curiosity was particularly piqued by the emergence of steam engines — the revolutionary technology of the era. These engines stood at the forefront of industrial and military progress, with nations vying for supremacy in their designs. French steam engines, however, lagged behind their British counterparts, igniting Carnot’s mission to unravel the secrets of energy transfer.

Steam Engine

The Birth of an Ideal: Carnot’s Revolutionary Heat Engine

Dedicating three years to the study of heat engines, Carnot’s groundbreaking work culminated in the conceptualization of an ideal heat engine — a machine devoid of friction and losses to the environment. The essence of Carnot’s ideal engine lay in its remarkable reversibility. Each action within the engine had an equal and opposite reaction, rendering it possible to reverse the entire process seamlessly.

The Dance of Energy: The Mechanics of Carnot’s Ideal Engine

Picture two colossal metal bars — one scorching hot and the other chillingly cold. A chamber filled with air connects these bars, allowing heat to flow exclusively from the hot bar to the chamber. Inside the chamber, a piston intricately linked to a flywheel awaits its role. The process commences as the hot bar comes into contact with the chamber, causing the air inside to expand and absorb heat. This expansion thrusts the piston upward, setting the flywheel into motion. The hot bar is subsequently removed, and the air continues to expand, albeit without any additional supply of heat. As a result, the temperature drops until it matches that of the cold bar. Now, the chamber makes contact with the cold bar, and the flywheel propels the piston downward. As the air compresses, heat transfers into the cold bar. With the cold bar removed, the flywheel compresses the gas, increasing its temperature until it is just below that of the hot bar. This cycle repeats indefinitely, with each iteration converting heat from the hot bar into stored energy within the flywheel.

The Paradox of Reversibility: Carnot’s Ideal Engine

What makes Carnot’s ideal engine truly fascinating is its reversibility: the same cycles can run in reverse, returning everything to its original state without any additional input of energy. Yet despite this reversibility, the ideal engine’s efficiency falls short of 100%. Efficiency is the energy gained by the flywheel divided by the heat drawn from the hot bar, and for Carnot’s engine it depends only on the temperatures of the two sides.

Lord Kelvin’s Insight: A Temperature Scale and the Pursuit of Perfection

Lord Kelvin, a prominent figure in the world of physics, recognized the immense potential of Carnot’s ideal engine. He realized that it could serve as the foundation for an absolute temperature scale — an insight that ultimately led to the creation of the Kelvin scale. On this scale, a hypothetical engine could reach 100% efficiency only if its cold side were brought to absolute zero, the state of minimal molecular motion.

Practical Limitations: The Quest for Perfect Efficiency

While achieving 100% efficiency with Carnot’s ideal engine is an intriguing idea, it is unattainable in practice: an infinitely hot reservoir on one side, or absolute zero on the other, simply cannot be realized. Thus, even in the absence of friction and losses to the environment, a heat engine cannot reach perfect efficiency.

Real-World Challenges: The Impediments to Efficiency

In Carnot’s era, the temperatures involved limited steam engines to a theoretical maximum efficiency of about 32%. In practice, their efficiency hovered around a meager 3%. Friction, heat leaking to the surroundings, and heat transfer across finite temperature differences all plagued real-world engines, so less energy ended up stored in the flywheel; the rest dissipated into the engine’s components and escaped into the environment.

The Irreversible Dance of Energy: Dispersal and Diminished Usability

As energy spreads out and becomes dispersed, its concentration diminishes, rendering it less usable. This dissipation of energy is an irreversible process: the total amount of energy remains constant, but its usefulness gradually diminishes.

Decades of Discovery: Rudolf Clausius and the Concept of Entropy

In the realm of thermodynamics, the notion of entropy emerged from the work of German physicist Rudolf Clausius. He built upon the ideas of French engineer Sadi Carnot and developed a way to measure the spread of energy. Clausius coined the term “entropy” to represent this quantity. At its core, entropy describes the distribution of energy. When energy is concentrated in a specific area, such as a hot bar, it is considered low entropy. However, as energy disperses to the surroundings, the walls of a chamber, and an axle, entropy increases. Although the total amount of energy remains constant, its availability to perform work diminishes in this more dispersed form. In 1865, Clausius summarized the first two laws of thermodynamics: the energy of the universe is constant, and the entropy of the universe tends to a maximum. This second law is pivotal in understanding numerous phenomena, from cooling and heating processes to gas expansion and the limitations of perpetual motion machines.

Entropy as a Measure of Disorder and Energy Dispersion

Entropy is commonly associated with the idea of disorder. As energy spreads out, systems become more mixed, random, and less ordered. However, a deeper understanding of entropy reveals it as the natural tendency of energy to disperse. This raises a fundamental question: Why does energy spread out over time? Many laws of physics operate symmetrically in both forward and backward time directions, so how does this clear time dependence emerge? To explore this, let’s examine a simple model involving two metal bars: one hot and one cold.

The Dance of Atoms: Exploring Energy States and Heat Flow

In this model, each bar consists of a handful of atoms, and energy comes in discrete packets that sit on the atoms. A particular assignment of packets to atoms is one configuration, or state, of the system. Packets constantly hop between atoms, producing different configurations while preserving the total energy. When the cold bar is brought into contact with the hot bar, packets begin to flow between the bars as well, and every distinct configuration of the combined system is equally probable.

A particular instant reveals a surprising observation: the hot bar, initially with seven energy packets, now possesses nine, while the cold bar has only one. Heat seems to have flowed from cold to hot, seemingly contradicting the concept of increasing entropy. However, Ludwig Boltzmann made a crucial insight: although heat flowing from cold to hot is improbable, it is not impossible.

Probability and Entropy: The Dance of Energy Configurations

The key to understanding the seemingly paradoxical heat flow lies in probability. The configuration with nine energy packets in the hot bar has 91,520 possible arrangements, while the configuration with five energy packets in each bar has 627,264 arrangements. The latter configuration, where the energy is more evenly spread, is more than six times as likely. However, when considering all possibilities, there remains a 10.5% chance that the hot bar ends up with more energy packets than it started with. As the system size increases, such as scaling up to 80 atoms per bar and 100 energy packets, the likelihood of the hot bar ending up hotter than before diminishes significantly. In everyday solids with trillions of trillions of atoms and energy packets, heat flowing from cold to hot becomes astronomically improbable.
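The counts above follow from elementary combinatorics: distributing q indistinguishable packets over n atoms can be done in C(q + n − 1, q) ways ("stars and bars"). The text does not state the number of atoms per bar, but 8 atoms per bar with 10 packets total reproduces every figure quoted, so that is assumed here:

```python
from math import comb

N = 8            # atoms per bar (assumed; consistent with the counts quoted)
Q_TOTAL = 10     # energy packets shared by the two bars
Q_HOT_START = 7  # packets the hot bar starts with

def arrangements(q_packets: int, n_atoms: int) -> int:
    """Ways to place q indistinguishable packets on n atoms
    (stars and bars): C(q + n - 1, q)."""
    return comb(q_packets + n_atoms - 1, q_packets)

def split_count(q_hot: int) -> int:
    """Arrangements with q_hot packets in the hot bar, the rest in the cold bar."""
    return arrangements(q_hot, N) * arrangements(Q_TOTAL - q_hot, N)

print(split_count(9))  # 9/1 split -> 91520 arrangements
print(split_count(5))  # even 5/5 split -> 627264 arrangements

total = sum(split_count(q) for q in range(Q_TOTAL + 1))
p_hotter = sum(split_count(q) for q in range(Q_HOT_START + 1, Q_TOTAL + 1)) / total
print(f"{p_hotter:.1%}")  # chance the hot bar ends hotter than it started: 10.5%
```

Scaling N and Q_TOTAL up makes `p_hotter` collapse toward zero, which is exactly the point: heat flowing from cold to hot is never impossible, only overwhelmingly improbable for macroscopic systems.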

Entropy, Rubik’s Cubes, and Energy Dissipation

An analogy using a Rubik’s cube can help visualize the concept of entropy. Imagine a solved Rubik’s cube, perfectly ordered. By randomly turning the cube, it becomes increasingly disordered, with numerous possible configurations. The chances of it being almost solved are few, while the chances of it being completely randomized are countless. Each turn takes the cube further away from the unlikely state of being solved, illustrating the natural tendency of energy to spread out and systems to become messier.

From Air Conditioning to Life on Earth: Entropy and Energy Flow

With the understanding that energy tends to spread out and systems become more disordered over time, one might question how air conditioning is possible. Air conditioning cools the interior of a house while heating the exterior, seemingly contradicting the increase in entropy. However, this decrease in entropy in one location is only possible by increasing entropy elsewhere. In the case of air conditioning, the concentrated energy stored in coal at a power plant is released, heating up the power plant and its surroundings. This energy spreads through turbines, electric generators, wires, and ultimately the house, producing waste heat. Any decrease in entropy achieved at the house is overshadowed by the greater increase in entropy required to facilitate the cooling process.

The Sun: A Source of Low Entropy for Life on Earth

In contemplating the persistence of structure and life on Earth despite the overall increase in entropy, we turn to our source of low entropy: the Sun. The Sun provides a continuous stream of concentrated energy to Earth. This energy is more clumped together and contains greater potential for use. Plants capture this energy through photosynthesis, converting it into sugars and fuelling their growth. Animals then consume plants, utilizing this energy to sustain themselves and move. As energy flows through the food chain, it becomes more dispersed and eventually radiated back into space as thermal energy. This process of energy conversion involves transforming fewer, higher-energy photons into a greater number of lower-energy photons. Without the Sun as a source of concentrated energy and a means to discard spread-out energy, life on Earth would not be possible.
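The photon bookkeeping in the last sentence can be estimated with one ratio. For thermal radiation, the average photon energy scales with the source temperature, so re-radiating the same total energy at a lower temperature requires proportionally more photons. The temperatures below are assumed round values, not figures from the text:

```python
# Rough estimate: mean thermal-photon energy scales with temperature,
# so photon count for fixed total energy scales inversely with it.
T_SUN = 5800.0    # effective temperature of sunlight, K (assumed)
T_EARTH = 290.0   # temperature at which Earth re-radiates, K (assumed)

photons_out_per_photon_in = T_SUN / T_EARTH
print(f"{photons_out_per_photon_in:.0f}")  # roughly 20 low-energy photons per solar photon
```

Each incoming solar photon thus leaves as roughly twenty infrared photons — many more ways to arrange the same energy, hence higher entropy.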

Life: An Accelerator of Entropy

Life itself may be a consequence of the second law of thermodynamics. As a highly ordered system, life has the remarkable ability to convert low entropy into high entropy. The presence of cyanobacteria and organic matter, for example, significantly increases the entropy of seawater’s surface layer. Jeremy England suggests that a constant stream of clumped-up energy could favor the development of structures that efficiently dissipate that energy. Over time, this process could lead to the emergence of better energy dissipators and eventually give rise to life. In essence, life thrives by accelerating the natural tendency of energy to disperse.

The Great Dance: Earth, the Sun, and the Universe

While the universe tends toward maximum entropy, structure and complexity persist on Earth thanks to the energy influx from the Sun. Earth exists as an open system, continuously receiving low-entropy energy and releasing higher-entropy energy back into space. The dance between the Sun, the Earth, and life sustains this delicate balance, allowing for distinct hot and cold regions, the preservation of structure, and the proliferation of living organisms. It is a cosmic interplay that exemplifies the intricate relationship between entropy, energy, and the fundamental processes that shape our world.

Entropy, the measure of disorder or randomness in a system, holds a profound influence over the unfolding of our universe. By examining it, we gain valuable insight into the arrow of time, the evolution of complex structures, and the ultimate fate of the cosmos. Let us now journey through time and space to explore this intricate web of entropy and its implications.

The Past Hypothesis: Tracing the Origins of Low Entropy

If the total entropy of the universe is constantly increasing, then entropy must have been lower in the past. This concept, known as the past hypothesis, holds that the universe unfolded as it did because of its initial low-entropy state. But what made that state low entropy? The early universe was hot, dense, and nearly uniform — which at first sounds like a well-mixed, high-entropy state. The key is gravity, which tends to clump matter together. Once gravity is taken into account, a smooth, spread-out distribution of matter is an extremely unlikely configuration, and an unlikely configuration is precisely what low entropy means.

The cosmic microwave background, as seen by Europe’s Planck satellite. (Image credit: ESA/Planck Collaboration)

From Expansion to Clumping: The Role of Gravity

As the universe expanded and cooled over time, matter gradually started clumping together in more dense regions. This process converted enormous amounts of potential energy into kinetic energy. Similar to water flowing downhill and powering a turbine, this energy could be harnessed. However, as bits of matter collided, some of their kinetic energy transformed into heat, resulting in a decrease in useful energy. Consequently, entropy increased as the universe evolved. Stars, planets, galaxies, and life itself emerged as a result, all contributing to the continuous rise of entropy.

The Dominance of Black Holes: A Vast Reservoir of Entropy

In 1972, Jacob Bekenstein proposed that black holes possess entropy proportional to their surface area. Some physicists initially dismissed the idea, but Stephen Hawking’s groundbreaking research on black holes — in particular the discovery of Hawking radiation and the temperature associated with it — confirmed that these enigmatic objects do carry entropy, and a staggering amount of it. The supermassive black hole at the center of the Milky Way alone has an entropy of roughly 10^91 in units of the Boltzmann constant. Taken together, all black holes account for about 3 × 10^104 Boltzmann constants’ worth of entropy. This means that almost all the entropy in the universe is tied up within black holes.
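The figure for the Milky Way’s central black hole can be checked with the Bekenstein–Hawking formula, S/k_B = A/(4 l_P²), where A is the horizon area and l_P the Planck length. The sketch below assumes a round mass of 4 million solar masses for Sagittarius A*; with rounded constants it lands at order 10^90–10^91, consistent with the figure quoted:

```python
import math

# Physical constants (SI, rounded)
G = 6.674e-11      # gravitational constant
C = 2.998e8        # speed of light
HBAR = 1.055e-34   # reduced Planck constant
M_SUN = 1.989e30   # solar mass, kg

def bh_entropy_over_k(mass_kg: float) -> float:
    """Bekenstein-Hawking entropy in units of the Boltzmann constant:
    S / k_B = A / (4 * l_P^2), with A the horizon area."""
    r_s = 2 * G * mass_kg / C**2   # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # horizon area
    l_p_sq = HBAR * G / C**3       # Planck length squared
    return area / (4 * l_p_sq)

# Sagittarius A* at roughly 4 million solar masses (assumed round value).
s = bh_entropy_over_k(4.0e6 * M_SUN)
print(f"{s:.1e}")
```

Since S scales with the square of the mass, even modest black holes dwarf the thermal entropy of ordinary matter.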

Black Holes
Milky Way

The Arrow of Time: Unidirectional Progression

The fact that the entropy of the universe was low in the past and has been increasing ever since explains the existence of an arrow of time — a clear distinction between the past and the future. We never observe an asteroid uncrashing or a planetary system unmixing back into the dust and gas from which it formed. The unidirectional progression from unlikely states to more likely states defines the arrow of time, and it is intimately connected to the rise of entropy.

The Heat Death of the Universe: The Ultimate Fate

As the universe continues to evolve, energy will gradually spread out, leading to a state known as the heat death of the universe. In this distant future, after the last black hole has evaporated, the universe will be in its most probable state. On large scales, it will be impossible to discern the direction of time, and the arrow of time itself will vanish. While this may seem like a bleak outcome, it is worth recognizing that complexity and interest lie neither at maximum entropy nor at minimum entropy, but in the middle ground where complex structures arise and thrive.

Embracing the Complexity: Utilizing Low Entropy

While maximum entropy may appear dull, low entropy does not necessarily equate to maximum complexity. Just as pouring milk into tea creates mesmerizing patterns that arise and fade away, complexity emerges from the interplay between low and high entropy. As we find ourselves within this realm of complexity, we have the opportunity to leverage the low entropy we possess. With the right tools and knowledge, we can unravel the mysteries of the universe, from the cooling of a cup of tea to the evolution of the cosmos itself.

The concept of entropy provides a profound understanding of the fundamental workings of our universe. From the origins of low entropy in the early universe to the dominance of black holes as reservoirs of entropy, we have traced the intricate web of energy and disorder. The arrow of time, defined by the increase in entropy, guides the progression of the universe, while the eventual heat death reminds us that complexity is finite. Yet within this realm of complexity lies the opportunity to explore and understand the cosmos with the low entropy we still possess. The dance between energy and entropy continues to shape our world, and it is our task to uncover its secrets and marvel at the intricate interplay that defines our existence.
