What is Entropy?

Figure 1. (a) Nicolas Léonard Sadi Carnot’s research into steam-powered machinery and (b) Rudolf Clausius’s later study of those findings led to groundbreaking discoveries about spontaneous heat flow processes. Source: OpenStax Chemistry 2e

What is Entropy? (OpenStax Chemistry 2e)

In 1824, at the age of 28, Nicolas Léonard Sadi Carnot (Figure 1) published the results of an extensive study regarding the efficiency of steam heat engines. A later review of Carnot’s findings by Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat (q_rev) and the kelvin temperature (T). In thermodynamics, a reversible process is one that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real processes are truly reversible, rather they are classified as irreversible.

Similar to other thermodynamic properties, this new quantity is a state function, so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy (S) and defined its change for any process as the following:

ΔS = q_rev / T

The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.
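To make the definition above concrete, here is a brief worked example (not part of the OpenStax text, written as an illustrative Python sketch): it applies ΔS = q_rev / T to the melting of one mole of ice, assuming the commonly tabulated molar enthalpy of fusion of water, roughly 6.01 kJ/mol, supplies the reversible heat at the normal melting point of 273.15 K.

```python
# Illustrative sketch (assumed values): entropy change for a reversible,
# constant-temperature process, using Clausius's definition ΔS = q_rev / T.

def entropy_change(q_rev_joules: float, temperature_kelvin: float) -> float:
    """Return ΔS = q_rev / T in J/K for a process at constant temperature T."""
    return q_rev_joules / temperature_kelvin

# Melting 1 mol of ice at its normal melting point (1 atm):
q_rev = 6.01e3   # J, heat absorbed reversibly (assumed molar enthalpy of fusion of water)
T = 273.15       # K, normal melting point of ice

delta_S = entropy_change(q_rev, T)
print(f"Entropy change: {delta_S:.1f} J/K per mole of ice melted")  # about 22.0 J/K
```

Because a phase change at the melting point proceeds through a succession of equilibrium states, the heat of fusion can be treated as q_rev, which is why this textbook-style case fits the reversible definition directly.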

Source:

Flowers, P., Theopold, K., Langley, R., & Robinson, W. R. (2019, February 14). Chemistry 2e. Houston, Texas: OpenStax. Access for free at: https://openstax.org/books/chemistry-2e


Related Research

Research Article: Epistasis and Entropy

Date Published: December 22, 2016 Publisher: Public Library of Science Author(s): Kristina Crona, Jianzhi Zhang Abstract: Epistasis is a key concept in the theory of adaptation. Indicators of epistasis are of interest for large systems where systematic fitness measurements may not be possible. Some recent approaches depend on information theory. We show that considering shared …

Research Article: Entropy-Based Financial Asset Pricing

Date Published: December 29, 2014 Publisher: Public Library of Science Author(s): Mihály Ormos, Dávid Zibriczky, Giampiero Favato. http://doi.org/10.1371/journal.pone.0115742 Abstract: We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the …

Research Article: Entropy Bounds for Hierarchical Molecular Networks

Date Published: August 28, 2008 Publisher: Public Library of Science Author(s): Matthias Dehmer, Stephan Borgert, Frank Emmert-Streib, Enrico Scalas. http://doi.org/10.1371/journal.pone.0003079 Abstract: In this paper we derive entropy bounds for hierarchical networks. More precisely, starting from a recently introduced measure to determine the topological entropy of non-hierarchical networks, we provide bounds for estimating the entropy of hierarchical …

Research Article: The generalized Simpson’s entropy is a measure of biodiversity

Date Published: March 7, 2017 Publisher: Public Library of Science Author(s): Michael Grabchak, Eric Marcon, Gabriel Lang, Zhiyi Zhang, Stefan J. Green. http://doi.org/10.1371/journal.pone.0173305 Abstract: Modern measures of diversity satisfy reasonable axioms, are parameterized to produce diversity profiles, can be expressed as an effective number of species to simplify their interpretation, and come with estimators that …