



Additional: Introduction to Entropy



Concept of Entropy ($ \Delta S = \frac{Q}{T} $ for reversible processes)

While temperature, pressure, and volume are macroscopic properties that describe the state of a system, entropy ($S$) is another fundamental thermodynamic state variable that provides insights into the spontaneity of processes and the distribution of energy. It is often described as a measure of the disorder or randomness of a system, or more precisely, the number of possible microscopic arrangements corresponding to a given macroscopic state.


Definition of Entropy Change (Thermodynamic Definition)

The concept of entropy was introduced by Rudolf Clausius in the mid-19th century based on his study of cyclic processes and the Second Law of Thermodynamics. For a reversible process in which heat ($Q_{rev}$) is transferred at a constant absolute temperature ($T$), i.e., an isothermal process, the change in entropy ($\Delta S$) is defined as the ratio of the heat transferred to the temperature at which the transfer occurs.

$ \Delta S = \frac{Q_{rev}}{T} $

For an infinitesimal reversible process occurring at temperature $T$, the change in entropy is $ dS = \frac{dQ_{rev}}{T} $. The total change in entropy during a finite reversible process from state 1 to state 2 is:

$ \Delta S_{1 \to 2} = \int_1^2 \frac{dQ_{rev}}{T} $

This definition is specifically for reversible processes. The change in entropy for an irreversible process is calculated by considering a hypothetical reversible path between the same initial and final states, because entropy is a state function and its change depends only on the endpoints.

The SI unit of entropy is the joule per kelvin (J/K). Entropy is an extensive property: it scales with the amount of substance.
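As a worked example of the integral definition, consider heating a substance of constant specific heat $c$ reversibly from $T_1$ to $T_2$: with $dQ_{rev} = mc\,dT$, the integral gives $\Delta S = mc \ln(T_2/T_1)$. The sketch below uses illustrative values (1 kg of water, 300 K to 350 K); the numbers are assumptions for demonstration only.

```python
import math

# Reversible heating with constant specific heat:
# dQ_rev = m*c*dT, so Delta_S = integral of m*c*dT/T = m*c*ln(T2/T1).
m = 1.0                 # mass in kg (illustrative assumption)
c = 4186.0              # specific heat of water, J/(kg*K)
T1, T2 = 300.0, 350.0   # initial and final temperatures, K (assumed)

delta_S = m * c * math.log(T2 / T1)
print(f"Delta S = {delta_S:.1f} J/K")  # about 645 J/K
```

Note that the simple ratio $Q/T$ would be wrong here because $T$ changes during the process; the integral handles the varying temperature.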


Microscopic Interpretation of Entropy (Boltzmann's Definition)

Ludwig Boltzmann later provided a statistical or microscopic interpretation of entropy. He related the entropy of a state to the number of microscopic configurations (microstates) that correspond to a given macroscopic state (macrostate). This number is called the thermodynamic probability ($\Omega$) or the number of accessible microstates.

Boltzmann's formula for entropy is:

$ S = k_B \ln \Omega $

where $k_B$ is the Boltzmann constant, and $\ln\Omega$ is the natural logarithm of the number of microstates corresponding to the macrostate. A state with higher $\Omega$ has higher entropy.

This definition connects the macroscopic property of entropy to the microscopic arrangement of particles. States with more ways to arrange the molecules (more disorder or randomness) have higher entropy. For example, a gas has much higher entropy than a solid at the same temperature because the gas molecules have much greater freedom of position and motion.
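A minimal toy model makes the microstate counting concrete. Assume (for illustration only) a system of $N$ two-state particles, where a macrostate is specified by how many particles are "up"; then $\Omega$ is the binomial count of microstates, and $S = k_B \ln \Omega$ follows directly.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(N, n_up):
    """Boltzmann entropy of the macrostate with n_up of N two-state
    particles in the 'up' state; Omega is the binomial coefficient."""
    omega = math.comb(N, n_up)
    return k_B * math.log(omega)

N = 100
# The evenly mixed macrostate has vastly more microstates (higher
# entropy) than a nearly ordered one; a fully ordered state has
# Omega = 1 and therefore zero entropy.
print(entropy(N, 50) > entropy(N, 1))   # True
print(entropy(N, 100))                  # 0.0
```

The same counting logic, with realistic energy levels instead of a two-state toy, underlies statistical mechanics proper.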


Entropy as a State Function

One of the key consequences of the Second Law of Thermodynamics is that $\int \frac{dQ_{rev}}{T}$ is path-independent for a reversible process between two states. This implies that entropy is a state function, just like internal energy, enthalpy, pressure, volume, and temperature. The change in entropy between two states depends only on the initial and final states, regardless of the reversible path taken. $\Delta S = S_{final} - S_{initial}$.
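Path independence can be checked numerically. For an ideal gas, integrating $dQ_{rev}/T$ along two different reversible routes between the same endpoints must give the same $\Delta S$. The sketch below assumes one mole of a monatomic ideal gas and illustrative endpoint values; both routes combine an isochoric and an isothermal step, just in opposite orders.

```python
import math

R = 8.314        # gas constant, J/(mol*K)
n = 1.0          # moles (assumed)
Cv = 1.5 * R     # molar heat capacity of a monatomic ideal gas (assumption)
T1, V1 = 300.0, 0.010   # initial state: K, m^3 (illustrative)
T2, V2 = 600.0, 0.020   # final state

# Path A: isochoric heating T1 -> T2, then isothermal expansion V1 -> V2
dS_A = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# Path B: isothermal expansion V1 -> V2 at T1, then isochoric heating
dS_B = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

print(math.isclose(dS_A, dS_B))  # True: Delta S depends only on endpoints
```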



Entropy and the Second Law ($ \Delta S \ge 0 $ for isolated systems)

The most general and fundamental statement of the Second Law of Thermodynamics is in terms of entropy. It dictates the direction of spontaneous processes and introduces the concept of increasing disorder in the universe.


Statement of the Second Law in terms of Entropy

The Second Law of Thermodynamics, formulated using entropy, states:

The total entropy of an isolated system (system + surroundings) can never decrease over time. It remains constant for a reversible process and increases for an irreversible process.

Mathematically, for an isolated system:

$ \Delta S_{total} = \Delta S_{system} + \Delta S_{surroundings} \ge 0 $

This implies that the entropy of the universe is constantly increasing. This tendency towards increasing entropy in isolated systems explains why processes have a natural direction – heat flows from hot to cold, gases mix, ordered structures tend to break down into disorder – these are all processes that lead to an increase in total entropy.
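The hot-to-cold heat flow mentioned above is easy to verify quantitatively: when heat $Q$ leaves a reservoir at $T_h$ and enters one at $T_c < T_h$, the hot reservoir's entropy falls by $Q/T_h$ while the cold one's rises by the larger amount $Q/T_c$, so $\Delta S_{total} > 0$. The numbers below are illustrative assumptions.

```python
# Irreversible heat flow: Q leaves a hot reservoir at T_h and enters
# a cold reservoir at T_c (illustrative values).
Q = 1000.0               # J
T_h, T_c = 500.0, 300.0  # K

dS_hot = -Q / T_h        # hot reservoir loses entropy: -2.00 J/K
dS_cold = +Q / T_c       # cold reservoir gains more:   +3.33 J/K
dS_total = dS_hot + dS_cold
print(f"Delta S_total = {dS_total:.2f} J/K")  # positive, as required
```

Reversing the flow (heat spontaneously moving from cold to hot) would make $\Delta S_{total}$ negative, which is why it never happens.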


Heat Death of the Universe

The concept of the continuous increase in the entropy of the universe has led to the speculative idea of the "heat death of the universe". This hypothesizes that eventually, the universe will reach a state of maximum entropy, where everything is at a uniform temperature, there are no temperature differences to drive heat engines, and no energy transformations (work) are possible. At this point, the universe would be in a state of thermodynamic equilibrium, with no further macroscopic changes occurring.


Entropy and Unavailable Energy

Entropy increase is related to the degradation of energy quality or the increase in the portion of energy that is unavailable to do useful work. When an irreversible process occurs (entropy increases), some energy is converted into a form that is less ordered and less useful for performing work, often dissipated as low-temperature heat.

While the First Law says energy is conserved (none is ever lost), the Second Law, via entropy, says that the useful work obtainable from that energy decreases in irreversible processes.
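This degradation can be quantified by the Gouy-Stodola relation, $W_{lost} = T_0 \, \Delta S_{total}$, where $T_0$ is the temperature of the coldest available reservoir. The sketch below, with illustrative numbers, compares the work a reversible (Carnot) engine could have extracted from a quantity of heat against the work destroyed when that heat instead flows directly from hot to cold.

```python
# Lost-work sketch using the Gouy-Stodola relation
# W_lost = T0 * Delta S_total (illustrative values).
Q = 1000.0               # J of heat available at the hot reservoir
T_h, T_c = 500.0, 300.0  # K

# Work a reversible (Carnot) engine could have extracted from Q:
W_max = Q * (1 - T_c / T_h)

# Entropy generated when Q instead flows directly from hot to cold:
dS_total = Q / T_c - Q / T_h

# Energy degraded = T0 times the entropy generated (T0 = T_c here):
W_lost = T_c * dS_total
print(W_max, W_lost)  # both 400 J: all available work was destroyed
```

The energy is still present as heat in the cold reservoir (First Law), but the 400 J of potential work is gone for good (Second Law).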



Third Law of Thermodynamics

The Third Law of Thermodynamics provides a reference point for the absolute value of entropy. It deals with the behaviour of systems as they approach absolute zero temperature.


Statement of the Third Law

The Third Law states:

The entropy of a perfect crystalline substance is zero at absolute zero temperature (0 Kelvin).

A "perfect crystalline substance" is one where the atoms or molecules are arranged in a perfectly ordered lattice structure with no defects or randomness in their positions or orientations. At absolute zero, the particles are in their lowest energy state, and their motion is minimum (only zero-point energy vibrations are present). For a perfect crystal at 0 K, there is only one possible microscopic arrangement corresponding to this state ($\Omega = 1$), and since $\ln(1) = 0$, the entropy $S = k_B \ln(1) = 0$.
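Conversely, an imperfect crystal retains "residual" entropy at 0 K. A classic illustration is solid carbon monoxide, whose nearly symmetric molecules can freeze in with either of two orientations: $N$ molecules then give $\Omega = 2^N$, and for one mole the Boltzmann formula yields $S = N_A k_B \ln 2 = R \ln 2$.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# A crystal of N_A molecules, each frozen into one of 2 orientations
# (as in solid CO), has Omega = 2**N_A microstates, so the residual
# molar entropy is k_B * ln(2**N_A) = N_A * k_B * ln 2 = R * ln 2.
S_molar = N_A * k_B * math.log(2)
print(f"{S_molar:.2f} J/(K*mol)")  # about 5.76 J/(K*mol)
```

Only a truly perfect crystal, with exactly one accessible microstate, reaches $S = 0$ at absolute zero.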


Implications of the Third Law

The Third Law completes the set of fundamental laws of thermodynamics, providing a full framework for understanding energy, heat, work, temperature, and entropy, and their interrelationships and limitations.