Monday, October 7, 2019

Entropy

If you look up the meaning of "Entropy" with a Google search, you will likely be no more sure of what it is after reading the multitude of descriptions offered. That is not to say they aren't all true; it is just that it is not completely clear what the English words mean. Here are ten examples:

 1. The entropy of an object is a measure of the amount of energy which is unavailable to do work.

 2. Entropy is a measure of the number of possible arrangements the atoms in a system can have.

 3. Entropy is a measure of uncertainty or randomness.

 4. Physics: a thermodynamic quantity representing the unavailability of a system's thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system.

 5. Lack of order or predictability; gradual decline into disorder. "a marketplace where entropy reigns supreme"

 6. Entropy is a measure of the energy dispersal in the system. We see evidence that the universe tends toward highest entropy many places in our lives. A campfire is an example of entropy. The solid wood burns and becomes ash, smoke and gases, all of which spread energy outwards more easily than the solid fuel.

 7. Entropy is a measure of the random activity in a system. The entropy of a system depends on your observations at one moment. How the system gets to that point doesn't matter at all.

 8. Entropy, the measure of a system's thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, the amount of entropy is also a measure of the molecular disorder, or randomness, of a system.

 9. Here are two examples:  Low entropy: A carbon crystal structure at a temperature near absolute zero. ... High entropy: A box filled with two elements in their gaseous state, both of which are noble gases, heated to a very high temperature, with the gas "not very dense".

 10. Entropy is one of the consequences of the second law of thermodynamics. The most popular concept related to entropy is the idea of disorder. Entropy is the measure of disorder: the higher the disorder, the higher the entropy of the system. ... This means that the entropy of the universe is constantly increasing.

 So, if you are like me, after reading such prosaic descriptions of what entropy is, you are left looking for a mathematical description that satisfies. In particular, something precise, something that you know how to measure. In fact, part of the difficulty with understanding entropy in lay terms is that it is a mathematical construction. Physics uses mathematical constructions to help explain things. Electric fields are an example. What is an electric field? Describing the theory of electricity (and magnetism) in terms of electric (and magnetic) fields gives us a model we can "picture" in our minds, one that helps describe the properties of electrons moving under the influence of an uneven distribution of charges. Another example of a mathematical construct is energy. What is energy? It is often described as an "ability to do work". That is, energy (whatever it is) is expended in doing work. It is conserved in its many forms. That is, it cannot be created or destroyed. But what is it? A mathematical construct.

 The point is, sometimes, as in the cases of electric fields, energy, and entropy, there are mathematical functions of well-known entities that satisfy certain useful properties. By giving names to these functions, we can then use them in our study of those entities. Such mathematical functions are not themselves real things that exist in nature, like atoms, but they are functions of such real things, and they prove useful. When we then go back to describe these concepts in non-mathematical terms, they lose the exactness of their true definition. If you really want to understand a mathematical construction, you have to understand the mathematics that gave rise to it.

 I undertook this quest to try to understand what entropy is. I read a book by Enrico Fermi, listened to some lectures from the Khan Academy, reviewed some calculus, and in the end started to get an idea of what lies beneath descriptions of entropy such as those above.

Here's the bottom line: There are two ways to describe the mathematical entity called entropy. The classic version arose out of the study of heat and work; the statistical one arose out of statistical mechanics.

Classic: The change in entropy of a system when it transforms reversibly from equilibrium state A to equilibrium state B is the sum, over all the heat sources, of the ratios Q/T, where Q is the amount of heat the system absorbs from a source at temperature T.
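
To make the classic definition concrete, here is a small numerical sketch in Python (my own illustration, not taken from my readings; the function name and the example values are mine). One mole of ice melting at its melting point of 273.15 K absorbs about 6,010 J of heat from a single source at that temperature, so its entropy increases by roughly 6,010 / 273.15, or about 22 J/K.

    # Change in entropy for a reversible transformation:
    # delta_S is the sum of Q/T over the heat sources.
    def entropy_change(heat_sources):
        # heat_sources: list of (Q, T) pairs, with Q the heat absorbed in joules
        # from a source held at temperature T in kelvin.
        return sum(q / t for q, t in heat_sources)

    # One mole of ice melting at 273.15 K absorbs about 6,010 J (a standard figure).
    print(entropy_change([(6010.0, 273.15)]))  # prints roughly 22.0 (J/K)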

Statistical: The entropy of a thermodynamical system in equilibrium state A is proportional to the logarithm of the number N of dynamical states that give rise to that thermodynamical state, where the constant of proportionality is the ratio of the gas constant R to Avogadro's number.
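
And a matching sketch for the statistical definition (again my own illustration; the variable names and the toy count of states are made up). The constant of proportionality, the gas constant R divided by Avogadro's number, is Boltzmann's constant, about 1.38 x 10^-23 J/K.

    import math

    # Statistical entropy: S = (R / Avogadro's number) * ln(N),
    # where N is the number of dynamical states that give rise to the equilibrium state.
    R = 8.314            # gas constant, J/(mol K)
    AVOGADRO = 6.022e23  # Avogadro's number, 1/mol
    k = R / AVOGADRO     # Boltzmann's constant, about 1.38e-23 J/K

    def statistical_entropy(num_states):
        return k * math.log(num_states)

    # A toy system with 10**20 accessible dynamical states (a made-up number):
    print(statistical_entropy(1e20))  # prints about 6.4e-22 (J/K)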

 Okay, so now we know what entropy is! You can read all about it in a summary paper I wrote after my study. Although there is nothing new in the paper, I have tried to iron out a few of the wrinkles and fill in a few of the gaps I ran across in my readings, to make the material a little smoother reading for you. The paper is pretty self-contained if you have studied a little chemistry and calculus.


