Grokking Thermodynamics

©Fernando Caracena 2014

Heat is a form of energy

The study of thermodynamics is old and heuristic, and it enters so many corners of human effort that heat is measured in many different ways, resulting in a confusing number of units. At first, physicists did not know what heat was; they simply related the phenomenon to the sensations of hot and cold. The basic observation was that if you put a hot object in direct contact with a cold object, the hot one cools off and the cold one warms up (Fig. 1). People working with heat thought of it as a kind of fluid that flows from one body (block1) to another (block2). After a while, the two bodies feel equally warm. If they are in an insulated environment, these two objects will approach and remain at the same degree of heat sensation.

Not wishing to trust the mere sensation of heat, which is unreliable, physicists wanted a more objective way of measuring the degree of heat transferred to an object. The development of the thermometer gave them that quantitative tool. Thermometers had been around as far back as the time of Galileo, but they were qualitative rather than quantitative instruments. The physicist Daniel Gabriel Fahrenheit (1686–1736) developed the first reliable, quantitative thermometer (see video). Since then, the Fahrenheit temperature scale (F) has been joined by several others, such as Celsius (C) and Kelvin (K), which are not described here in detail but on which you can find more information in Wikipedia. This discussion uses maths to develop the concepts of thermodynamics as far as possible without involving units of measure and the conversion factors between them. Units are selected and added near the end, to make the results useful in a broader context than just understanding them.

Observations

Objective measurements of temperature (T) confirmed the conclusions experimenters had drawn earlier from the sensation of heat, but now with quantitative results. The primary rule is that heat flows from one block (at a temperature T1) to another (at a temperature T2) if and only if T1 > T2, and the flow of heat stops when T1 = T2. Thereafter, both blocks remain at the same temperature if they are thermally isolated from everything else in the universe.

In other words, heat flows from the first block to the second if, and only if

T1 - T2 > 0 .                                                                    (1)

Measuring heat

In 1798 the American-born British physicist Benjamin Thompson (Count Rumford) noticed that the process of boring a cannon generated a lot of heat, enough to raise the temperature of the cannon to the boiling point of water. He published his observations in a paper entitled

"Heat is a Form of Motion: An Experiment in Boring Cannon" in the journal Philosophical Transactions (vol. 88), 1798.

Fig. 2 Heating water at a constant energy input, from a water-ice mixture to boiling water.

In this article he concluded that heat must be some form of motion. The English physicist James Prescott Joule (1818–1889), following up Thompson's observations, found that a measured amount of energy put into a thermally isolated piece of matter always supplied it with an equivalent amount of heat. In other words, heat is a form of energy. This observation goes by the name of the First Law of Thermodynamics.

Calorimetry: measuring heat

Today, all of the above discoveries can be reproduced in simple physics experiments in a small laboratory. A simple experiment (Fig. 2), in which energy is added to a body of water at a constant rate, say by an immersion electric heater, shows that the temperature of the water increases at a constant rate if there is no phase transition. As you know, water can exist in three phases: as ice, liquid and vapor (or gas). By carefully and slowly heating a water-ice mixture, you will find that the temperature of the mixture remains constant (at the freezing point) until all of the ice melts. After that, the constant heat input raises the temperature of the liquid water at a constant rate (Eqs. 2a and 2b),

dQ/dt = C dT/dt ,                                                           (2a)

dQ = C dT ,                                                                      (2b)

where C is the constant of proportionality obtained from the slope of the temperature line in the middle of Fig. 2, which relates to the properties of the liquid phase of water. This constant is called the heat capacity of the substance being heated. Note that the heating rate dQ/dt is measured directly by the electric power used by the immersion heater, which in this case is entirely converted to heat.

The conclusion of this part of the experiment is that the rate of increase in temperature is directly proportional to the rate of input of energy. [Note that units are not introduced here, to avoid confusing the issue. The concept is so simple and quantitative that we can at this point bypass a technical discussion of units and conversion factors.]
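This proportionality is easy to play with numerically. Here is a minimal sketch of Eq. (2b); the heater power and the specific heat of water are assumed handbook values, not numbers from the experiment above:

```python
# Sketch of Eq. (2b), dQ = C dT: with a constant power input, the
# temperature of liquid water rises linearly in time (no phase change).
# The heater power and specific heat below are illustrative assumptions.

power_W = 500.0          # heater power in J/s, all converted to heat
mass_kg = 1.0            # mass of the liquid water sample
c_water = 4186.0         # specific heat of water, J/(kg K), handbook value

C = mass_kg * c_water    # heat capacity of this sample, J/K

def temperature_after(t_seconds, T0=20.0):
    """Temperature (deg C) after heating for t seconds, no phase change."""
    dQ = power_W * t_seconds   # energy delivered at constant rate
    dT = dQ / C                # Eq. (2b) rearranged: dT = dQ / C
    return T0 + dT

print(temperature_after(60.0))   # after one minute of heating
```

Doubling the heating time doubles the temperature rise, which is exactly the linear middle segment of the curve in Fig. 2.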

The heating curve (Fig. 2) flattens again when the water reaches its boiling point, where, instead of raising the temperature, the additional energy input simply boils away more water until all the water is converted to vapor. In the gaseous state, water vapor follows a similar linear increase in temperature under a constant rate of energy input, but the experimenter must use a way of heating the gas that distributes heat over its entire volume without generating convection currents (thermals).

For the sake of completeness, I should mention here that the energy input that does not raise the temperature at the two phase transitions depicted in Fig. 2 goes into melting ice or into boiling away liquid water, in each case at a rate linear in the mass converted,

dQ/dt = L dm/dt.                                                             (2c)

Note that there are two constants under the label L in Fig. 2 (L1, L2), one at the freezing point (the heat of fusion) and the other at the boiling point of water (the heat of vaporization). The two constants have different values, the heat of vaporization being much larger than the heat of fusion.
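Taking L1 and L2 as the latent heats, so that an energy input dQ converts a mass dm = dQ/L, the difference between the two plateaus can be sketched numerically. The latent-heat values below are assumed handbook numbers for water:

```python
# Sketch of the phase-transition plateaus in Fig. 2: the energy input
# goes into converting mass between phases rather than raising the
# temperature. Handbook latent heats of water, quoted as assumptions:
L_fusion = 3.34e5   # J/kg, heat of fusion (L1: ice -> liquid)
L_vapor = 2.26e6    # J/kg, heat of vaporization (L2: liquid -> vapor)

def mass_converted(energy_J, L):
    """Mass (kg) melted or boiled away by a given energy input."""
    return energy_J / L

E = 1.0e5  # 100 kJ of heat, an illustrative energy input
print(mass_converted(E, L_fusion))  # kg of ice melted
print(mass_converted(E, L_vapor))   # kg of water boiled away
```

The same energy input boils away far less water than it melts ice, which is the quantitative content of saying the heat of vaporization is much larger than the heat of fusion.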

Entropy

Even before physicists realized that heat is a form of energy, people used heat to do work for them, which eventually led to the Industrial Revolution. As early as the first century AD, the Greeks developed devices that used steam power, but actual industrial applications came much later. Jerónimo de Ayanz y Beaumont, a Spaniard from Navarra, patented the first steam engine in 1606. Thomas Savery patented a steam pump in 1698 that used steam in direct contact with the water being pumped.

The French military engineer and physicist Nicolas Léonard Sadi Carnot (1 June 1796 – 24 August 1832) showed that the basic limitation on the direction of flow of heat limits the efficiency of a heat engine (see Wikipedia: Carnot Cycle). Further theoretical work on the properties of heat led to some rather astounding conclusions involving the concept of entropy.

The following bird's-eye view reveals some of the interesting properties of entropy without burying the discussion in confusing details.

In differential form, entropy (dS) gained by an object was defined as,

dS = dQ/T.                                                                     (3)

The entropy change of the object can be either positive or negative depending on whether heat is transferred to it (+dQ) or transferred away from it (-dQ).

The total entropy change in some thermodynamic process, which can be complicated, is given by the integral of dS between the two temperatures or states of a thermodynamic system.
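As a concrete instance of that integral: for a body whose heat capacity C is constant, combining (3) with the earlier relation dQ = C dT and integrating gives ΔS = C ln(T2/T1). A minimal sketch, where the heat capacity of roughly one kilogram of water is an assumed handbook value:

```python
import math

# Entropy gained by a body of constant heat capacity C warmed from T1 to
# T2 (temperatures in kelvin): integrating dS = dQ/T with dQ = C dT gives
#     Delta S = C * ln(T2 / T1).
# The heat capacity below is an illustrative handbook value.

def entropy_change(C, T1, T2):
    """Entropy change (J/K) of a body of heat capacity C warmed from T1 to T2."""
    return C * math.log(T2 / T1)

C = 4186.0  # heat capacity of about 1 kg of water, J/K
print(entropy_change(C, 273.15, 373.15))  # J/K, ice point to boiling point
```

Note that the sign comes out automatically: warming (T2 > T1) gives a positive ΔS, cooling gives a negative one, matching the +dQ and -dQ cases above.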

Now, let us look at the system depicted in Fig. 1, where two blocks at different temperatures (T1 > T2) are brought into thermal contact but are otherwise thermally isolated from the rest of the universe.

Divide (1) by the product T1* T2 and you get

1/T2 - 1/T1 > 0 ,                                                            (4)

as the necessary condition for heat to flow from block1 to block2.

Multiply (4) by dQ, the differential amount of heat transferred from block1 to block2, to get the net entropy gain of the two-block system,

dST > 0 ,                                                                         (5a)

where

dST = dS1 + dS2 .                                                            (5b)

The incremental net change of entropy in the two-block system is shown by (5a) to be positive, even though block1 loses entropy and block2 gains it, since

dS1 = - dQ/ T1                                                                 (5c)

and

dS2 =  dQ/ T2.                                                                 (5d)

The total entropy change of an isolated system is the sum of the entropy changes of the components of the system. In this case block1 cools, so that its entropy decreases (5c) as block2 warms, so that its entropy increases (5d); but there is a net increase of entropy as a result of this heat transfer, which follows logically from (4) and (3).

A notable feature of this process is that through heat transfer the net entropy of the system increases.

The net increase of entropy under heat transfer between the components of a thermally isolated system is a universal feature of entropy. We can draw two conclusions from this discussion.

Conclusion #1: the observation that heat always flows from the warmer object to the colder object in thermal contact, and never the reverse, leads logically to the conclusion that the entropy of an isolated system never decreases; if it changes at all, it must increase. This is known as the Second Law of Thermodynamics.

Conclusion #2: the increase in entropy of an isolated system approaching thermal equilibrium defines a direction for the passage of time, known as "the Arrow of Time", because the total entropy of the system at various slices of time orders all of the frames in time by its monotonic increase, no matter how those frames may be shuffled.

Final thoughts—thinking beyond the maths

One way physics acts on philosophy is through thinking beyond the equations, which makes physics an important contributor to human thought. This is one reward of having waded through a jungle of equations: at some vantage point, the adventurer in thought can look back and form some solid conclusions about the nature of our physical existence.

Entropy turns out to be one of those key concepts that reveal a fantastic view of the network of pre-existing physical laws, with which it contrasts in fundamental ways. Thermodynamics is a branch of physics that at first sight does not resemble the other branches; fitting it into the whole body of theory required many changes in the background ideas and interpretations of physics.

An important property related to entropy is that its increase is relentless, which introduces irreversibility into the laws of physics through the Arrow of Time. Before the development of thermodynamics, all the fundamental laws of physics were insensitive to the direction of time. The equations allowed, for every solution of a set of equations of motion, a twin system that operated in exactly a time-reversed manner. The equations of physics described the motion of systems for which there was no preferred time direction; but no one had yet dared to ask why, although it was common experience that there was an arrow of time, for example the Humpty Dumpty effect. Just drop an egg: do you ever see the reverse of what happens as a result of the fall of that egg, except by running a video of the event backward? The time-reversed video appears ludicrous. The Arrow of Time was easy to ignore when it was a qualitative effect; but when it entered the laws of physics as a quantitative effect, the disparity of thermodynamics with the rest of physics forced a big gulp on the whole community. Physicists had to do a whole lot of rethinking about the nature of physical reality in order to deal with the concept of entropy.

Entropy is a sticky concept, the "tar baby" of physics. In reality there is no truly isolated system in the universe. Thermal insulation may greatly slow a system's approach to thermal equilibrium with its larger-scale environment, so that it behaves approximately like an isolated system for some period of time, but eventually that system will begin to approach thermal balance with its surroundings. In time, the surroundings of all systems appear to grow to the size of the entire universe, suggesting a universal heat death; but maybe not. Meanwhile, life and human pursuits are all made possible by processes in the universe that operate far from thermal equilibrium. The implication is that the entropy of the entire universe is increasing with time. The entire universe is subject to the Arrow of Time, and the Arrow of Time is enforced by thermal imbalances. This means that time's direction was set at the Big Bang by large thermal imbalances, so that thermodynamics places some strong constraints on cosmology.

Consider how the Arrow of Time gets into mechanical processes. Drop a ball straight down onto a smooth, hard, horizontal surface. The ball will bounce to a certain height above the ground, which is a fraction of the height you dropped it from. The more energy is retained by the ball after each bounce, the higher the rebound as a percentage of the original drop height. The fractional energy remaining after each bounce is determined by the type of material in the ball; for example, super balls bounce almost as high as the height from which they were originally dropped. It is natural to ask where the lost energy goes after each bounce, and the answer is that it goes from the external motion of the entire ball into the internal motions of the microscopic constituents of both the ball and the surface from which it bounces. In other words, during the bounce, some of the energy of the ball is converted to heat. Once the energy of large-scale motion is turned into heat, that heat diffuses out in all directions, increasing the total entropy of the universe. The process is irreversible. Through this heat-generating degradation of mechanical motion, entropy and the Arrow of Time come into play in accounting for the properties of real mechanical systems.
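The bounce-by-bounce loss can be sketched by assuming that a fixed fraction r of the mechanical energy survives each bounce; since rebound height is proportional to that energy, the heights fall off geometrically. The function name and the values of r are illustrative assumptions:

```python
# Sketch of the bouncing ball: if a fraction r of the mechanical energy
# survives each bounce (r is set by the ball's material), then, because
# rebound height is proportional to potential energy, the heights decay
# geometrically: h_n = r**n * h0. The lost fraction (1 - r) per bounce
# becomes heat in the ball and the surface.

def rebound_heights(h0, r, n_bounces):
    """Heights (m) after successive bounces; r = energy fraction retained."""
    heights = []
    h = h0
    for _ in range(n_bounces):
        h *= r        # each bounce keeps fraction r of the energy/height
        heights.append(h)
    return heights

print(rebound_heights(2.0, 0.80, 5))  # a fairly lively ball
print(rebound_heights(2.0, 0.95, 5))  # something like a super ball
```

However close r is to 1, the sequence only ever decreases: the video of the heights played backward would show ever-growing bounces, which is exactly the time-reversed motion we never observe.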

Entropy and order

In the Humpty Dumpty effect, you begin with a whole egg in a whole shell at a fixed height above the ground. The total energy of the egg sitting at that height is its potential energy referenced to ground level. When the egg is released, it falls with a constant total energy, the sum of its kinetic and potential energies. Were it to rebound perfectly off the ground without any loss of mechanical energy, it would bounce back up to the exact height from which it fell. In the real world, the egg shatters, its contents splattering over the ground, and the energy of the parts is completely dissipated in a very short time. It is natural to ask, "Where does that energy go?" We now know that it becomes heat, which raises the temperature of the impact area and the egg debris. That increased temperature causes an outward diffusion of the excess heat, which increases the entropy of the surroundings and eventually of the entire universe. And of course, all the king's horses and all the king's men can never put Humpty Dumpty back together again. In ordinary events, such as the splattering of an egg, we do not notice the conversion of mechanical energy into heat; on an astronomical scale, however, we certainly would. In the collision of two planet-sized objects, we would see rocks melted and heated to incandescence.

In the above example of the Humpty Dumpty effect, we notice that the entropy of the universe increases in the splattering of an egg at the same time that the ordering of the parts and motion of the egg is destroyed. In the egg's little universe, a decrease in order is concomitant with the increase in entropy. For this and many other such reasons, entropy is often called a measure of disorder in a system. Two blocks at different temperatures form a more ordered state than the same two blocks in thermal equilibrium, which is a state of higher entropy for the two-block system. The increase in entropy is a measure of the degree of disordering of a physical system. We will discuss this and other interesting properties of entropy in future blogs.

If you are interested in learning more on the subject, Prof. Susskind has an online series of lectures on Statistical Mechanics.
