OpenStax Chemistry 2e
The relationships between entropy, microstates, and matter/energy dispersal described previously allow us to make generalizations regarding the relative entropies of substances and to predict the sign of entropy changes for chemical and physical processes. Consider the phase changes illustrated in Figure 1. In the solid phase, the atoms or molecules are restricted to nearly fixed positions with respect to each other and are capable of only modest oscillations about these positions. With essentially fixed locations for the system’s component particles, the number of microstates is relatively small. In the liquid phase, the atoms or molecules are free to move over and around each other, though they remain in relatively close proximity to one another. This increased freedom of motion results in a greater variation in possible particle locations, so the number of microstates is correspondingly greater than for the solid. As a result, S_liquid > S_solid, and the process of converting a substance from solid to liquid (melting) is characterized by an increase in entropy, ΔS > 0. By the same logic, the reciprocal process (freezing) exhibits a decrease in entropy, ΔS < 0.
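The solid-versus-liquid comparison can be made concrete with Boltzmann's relation, S = k ln W, where W is the number of microstates. The microstate counts in the sketch below are invented purely for illustration (real counts are astronomically larger); they only show why more accessible particle arrangements mean higher entropy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(microstates: int) -> float:
    """Entropy in J/K for a system with the given number of microstates (S = k ln W)."""
    return K_B * math.log(microstates)

# Illustrative (not measured) microstate counts:
W_solid = 10**4    # particles near fixed lattice sites: few arrangements
W_liquid = 10**9   # particles free to move about: many more arrangements

S_solid = boltzmann_entropy(W_solid)
S_liquid = boltzmann_entropy(W_liquid)

assert S_liquid > S_solid           # S_liquid > S_solid
assert S_liquid - S_solid > 0       # melting: ΔS > 0
```

Because entropy depends on the logarithm of W, even enormous differences in microstate counts translate into modest but decisive differences in S.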
Now consider the gaseous phase, in which a given number of atoms or molecules occupy a much greater volume than in the liquid phase. Each atom or molecule can be found in many more locations, corresponding to a much greater number of microstates. Consequently, for any substance, S_gas > S_liquid > S_solid, and the processes of vaporization and sublimation likewise involve increases in entropy, ΔS > 0. Correspondingly, the reciprocal phase transitions, condensation and deposition, involve decreases in entropy, ΔS < 0.
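The volume effect can be quantified with a standard result not derived in this passage: for n moles of an ideal gas expanding at constant temperature, ΔS = nR ln(V_final/V_initial). A short sketch under that assumption:

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def delta_s_volume(n_mol: float, v_initial: float, v_final: float) -> float:
    """Entropy change for isothermal expansion of an ideal gas: ΔS = nR ln(V2/V1)."""
    return n_mol * R * math.log(v_final / v_initial)

# 1 mol of gas expanding from 1 L to 10 L: each particle can be found in
# more locations, so there are more microstates and ΔS > 0.
ds = delta_s_volume(1.0, 1.0, 10.0)
print(round(ds, 1))  # ≈ 19.1 J/K
```

The same expression run in reverse (V_final < V_initial) gives ΔS < 0, matching the sign rule for condensation-like compression of the available space.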
According to kinetic-molecular theory, the temperature of a substance is proportional to the average kinetic energy of its particles. Raising the temperature of a substance will result in more extensive vibrations of the particles in solids and more rapid translations of the particles in liquids and gases. At higher temperatures, the distribution of kinetic energies among the atoms or molecules of the substance is also broader (more dispersed) than at lower temperatures. Thus, the entropy for any substance increases with temperature (Figure 2).
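One common way to put a number on this temperature trend, assuming a roughly constant heat capacity over the interval (a simplification the passage does not make), is ΔS = nC_p ln(T_final/T_initial) for heating at constant pressure:

```python
import math

def delta_s_heating(n_mol: float, cp: float, t_initial: float, t_final: float) -> float:
    """Entropy change on heating at constant pressure with constant C_p:
    ΔS = n * C_p * ln(T2/T1)."""
    return n_mol * cp * math.log(t_final / t_initial)

# Heating 1 mol of liquid water (C_p ≈ 75.3 J/(mol·K)) from 300 K to 350 K:
ds = delta_s_heating(1.0, 75.3, 300.0, 350.0)
assert ds > 0  # entropy increases with temperature, as Figure 2 illustrates
print(round(ds, 1))  # ≈ 11.6 J/K
```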
The entropy of a substance is influenced by the structure of the particles (atoms or molecules) that make up the substance. With regard to atomic substances, heavier atoms possess greater entropy at a given temperature than lighter atoms, a consequence of the relation between a particle’s mass and the spacing of quantized translational energy levels (a topic beyond the scope of this text). For molecules, greater numbers of atoms increase the number of ways in which the molecules can vibrate and thus the number of possible microstates and the entropy of the system.
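The mass effect can be illustrated for monatomic ideal gases using a consequence of the Sackur–Tetrode equation (beyond the scope of the passage): at the same temperature and pressure, two such gases differ in molar entropy by ΔS = (3/2)R ln(m_heavy/m_light). The molar masses below are tabulated values; the comparison to standard entropies is approximate:

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def entropy_difference(m_light: float, m_heavy: float) -> float:
    """Molar-entropy difference between two monatomic ideal gases at the same
    T and P, depending only on mass: ΔS = (3/2) R ln(m_heavy / m_light)."""
    return 1.5 * R * math.log(m_heavy / m_light)

# He (4.00 g/mol) vs Ar (39.95 g/mol):
ds = entropy_difference(4.00, 39.95)
print(round(ds, 1))  # ≈ 28.7 J/(mol·K), close to the tabulated difference
                     # S°(Ar) − S°(He) = 154.8 − 126.2 ≈ 28.6 J/(mol·K)
```

The close agreement reflects the fact that for monatomic gases essentially all of the entropy is translational, so mass alone accounts for the difference.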
Finally, variations in the types of particles affect the entropy of a system. Compared to a pure substance, in which all particles are identical, the entropy of a mixture of two or more different particle types is greater. This is because of the additional orientations and interactions that are possible in a system composed of nonidentical components. For example, when a solid dissolves in a liquid, the particles of the solid experience both greater freedom of motion and additional interactions with the solvent particles. This corresponds to a more uniform dispersal of matter and energy and a greater number of microstates. The process of dissolution, therefore, involves an increase in entropy, ΔS > 0.
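The entropy increase on forming a mixture can be estimated with the ideal entropy of mixing, ΔS = −nR Σ xᵢ ln xᵢ (a standard result, not derived in the passage), which is positive for any composition with more than one component:

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def mixing_entropy(mole_fractions: list[float], n_total: float = 1.0) -> float:
    """Ideal entropy of mixing: ΔS = -n_total * R * Σ x_i ln x_i.
    Each ln(x_i) is negative for 0 < x_i < 1, so ΔS is always positive."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions)

# Equimolar binary mixture (x = 0.5 each), 1 mol total:
ds = mixing_entropy([0.5, 0.5])
print(round(ds, 2))  # ≈ 5.76 J/(mol·K)
assert ds > 0        # mixing always increases entropy in this ideal model
```

Even a very lopsided composition such as [0.9, 0.1] gives ΔS > 0, consistent with the sign rule for dissolution stated above.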
Flowers, P., Theopold, K., Langley, R., & Robinson, W. R. (2019, February 14). Chemistry 2e. Houston, Texas: OpenStax. Access for free at: https://openstax.org/books/chemistry-2e