Astrophysics (Index)

entropy

(S)
(relative measure of how much of a system's energy is unavailable for use)

The concept of entropy was devised in the field of thermodynamics to characterize how much of a system's energy is available for use. It has since been redefined by statistical mechanics, covering the same ground but generalizing the concept to make it directly applicable to more circumstances.

Entropy is not a direct measure of such energy: rather, entropy increases as energy becomes unusable. It is changes (deltas) in entropy that are generally considered rather than totals: the difference from before to after some process is often what can be calculated. When the energy of a system is considered, it is often just some of the types (such as kinetic energy and/or some type of potential energy) rather than all the energy (as in E = mc²), and considerations of a system's entropy are similarly qualified. Thermodynamics was developed as a science of heat energy (a type of kinetic energy), and entropy associated with other forms of energy was dealt with by considering an equivalent heat-energy situation.

Thermodynamics conceives of energy as something that is never destroyed. Energy is, loosely, the capacity to produce mechanical work, such as lifting something: a hot object has energy since it can heat some gas, making the gas expand, and the expansion can drive a piston or turbine, which can power such mechanical work. If, instead, the hot object is simply allowed to cool, the heat hasn't disappeared but has merely spread out. Even if the work is done, any friction in the mechanical system similarly spreads out some heat. Once the object has reached equilibrium with its environment, the now-spread-out heat's ability to do this work in this environment is gone: the gas, as part of the environment, is already at the same temperature as the object. The heat (and thus the energy) still exists but is no longer usable.

Thermodynamic entropy (S) is defined through its changes: the change in a substance's entropy is the heat it absorbs (reversibly) divided by the absolute temperature at which the transfer takes place. Without some outside interaction, such entropy never diminishes, i.e., energy never grows more usable under such conditions. Thermodynamics is based upon treating this as a law of nature (the second law of thermodynamics).
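As a worked illustration (standard textbook numbers, not taken from this entry): the differential definition is dS = δQ_rev/T, so when heat leaks irreversibly from a hot reservoir to a cold one, the hot side loses less entropy than the cold side gains, and the total entropy rises, consistent with the law above.

  \[ dS = \frac{\delta Q_{\mathrm{rev}}}{T} \]

  Example: Q = 1200 J flows from a reservoir at T_hot = 400 K to one at T_cold = 300 K.

  \[ \Delta S_{\mathrm{hot}} = \frac{-1200\ \mathrm{J}}{400\ \mathrm{K}} = -3\ \mathrm{J/K}, \qquad
     \Delta S_{\mathrm{cold}} = \frac{+1200\ \mathrm{J}}{300\ \mathrm{K}} = +4\ \mathrm{J/K}, \qquad
     \Delta S_{\mathrm{total}} = +1\ \mathrm{J/K} > 0 \]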

The statistical mechanics concept of entropy is that it is a measure of the disorder of a system. Intuitively, the connection with the thermodynamic concept is this: a pair of tanks, one of hot gas and the other cold, constituting usable energy as described above, represents a bit of orderliness. When their gases are mixed and all at the same temperature, that particular orderliness and the usability of the energy are gone. It is due to statistics that the molecules cannot be expected to re-separate spontaneously into hot (faster-molecule) and cold (slower-molecule) regions so as to become usable again. Other than the statistical improbability, it is perfectly possible for the faster and slower molecules to find themselves in separate regions of the tank purely through happenstance, but a calculated timescale for such an occurrence makes the age of the universe look minuscule.
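Statistical mechanics makes the connection quantitative through Boltzmann's formula S = k_B ln W, where W counts the microscopic arrangements consistent with the macroscopic state: the mixed, uniform-temperature state corresponds to vastly more arrangements, hence higher entropy. To give a rough sense of the improbability involved (a toy sketch, not a calculation from this entry): if each of N molecules is independently equally likely to be found in either half of a tank, the probability of all of them being in one half is (1/2)^N. The short Python script below just evaluates the logarithm of that probability for an illustrative N of 10²².

  import math

  # Toy model: each molecule sits in the left or right half of the tank with
  # probability 1/2, so P(all N molecules in one half) = (1/2)**N.
  # N is illustrative -- roughly the number of molecules in a liter of air.
  N = 1e22

  log10_probability = N * math.log10(0.5)   # log10 of (1/2)**N
  print(f"log10 of the probability: {log10_probability:.3e}")
  # Prints about -3.010e+21: a probability of roughly 10^(-3e21), so small
  # that the fluctuation is not expected even over many ages of the universe.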


(physics,measure)
Further reading:
https://en.wikipedia.org/wiki/Entropy
https://en.wikipedia.org/wiki/Entropy_(classical_thermodynamics)
https://en.wiktionary.org/wiki/entropy
https://chemistrytalk.org/what-is-entropy/
https://chem.libretexts.org/Bookshelves/Physical_and_Theoretical_Chemistry_Textbook_Maps/Supplemental_Modules_(Physical_and_Theoretical_Chemistry)/Thermodynamics/Energies_and_Potentials/Entropy
http://www.scholarpedia.org/article/Entropy

Referenced by pages:
Big Crunch
black hole thermodynamics
entropic gravity
Geroch-Bekenstein engine
large scale structure (LSS)
