11.10 Entropy

With the cleverest inventors and the greatest scientists relentlessly trying to fool nature and circumvent the second law, how come nature never once gets confused, not even by the most complicated, convoluted, unusual, ingenious schemes? Nature does not outwit them by out-thinking them, but by maintaining an accounting system that cannot be fooled. Unlike human accounting systems, this accounting system does not assign a monetary value to each physical system, but a measure of messiness called entropy. Then, in any transaction within or between systems, nature simply makes sure that this entropy is not being reduced; whatever entropy one system gives up must always be less than what the other system receives.

So what can this numerical grade of messiness called entropy be? Surely, it must be related somehow to the second law as stated by Clausius and Kelvin and Planck, and to the resulting Carnot engines that cannot be beat. Note that the Carnot engines relate heat added to temperature. In particular an infinitesimally small Carnot engine would take in an infinitesimal amount $\delta Q_{\rm H}$ of heat at a temperature $T_{\rm H}$ and give up an infinitesimal amount $\delta Q_{\rm L}$ at a temperature $T_{\rm L}$. This is done so that $\delta Q_{\rm H}/\delta Q_{\rm L} = T_{\rm H}/T_{\rm L}$, or separating the two ends of the device, $\delta Q_{\rm H}/T_{\rm H} = \delta Q_{\rm L}/T_{\rm L}$. The quantity $\delta Q/T$ is the same at both sides, except that one is going in and the other out. Might this, then, be the change in messiness? After all, for the ideal reversible machine no messiness can be created, otherwise in the reversed process, messiness would be reduced. Whatever increase in messiness one side receives, the other side must give up, and $\delta Q/T$ fits the bill for that.
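As a quick reminder of where that ratio comes from: the Carnot efficiency is $1 - T_{\rm L}/T_{\rm H}$, and energy conservation requires that the rejected heat is the added heat minus the work, so

\begin{displaymath}
\frac{\delta W}{\delta Q_{\rm H}} = 1 - \frac{T_{\rm L}}{T_{\rm H}},
\qquad
\delta Q_{\rm L} = \delta Q_{\rm H} - \delta W
\quad\Longrightarrow\quad
\frac{\delta Q_{\rm L}}{\delta Q_{\rm H}} = \frac{T_{\rm L}}{T_{\rm H}}
\end{displaymath}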

If $\delta Q/T$ gives the infinitesimal change in messiness, excuse, entropy, then it should be possible to find the entropy of a system by integration. In particular, choosing some arbitrary state of the system as reference, the entropy of a system in thermal equilibrium can be found as:

\begin{displaymath}
\fbox{$\displaystyle
S \equiv S_{\rm ref} +
\int_{\rm ref}^{\rm desired} \frac{\delta Q}T \qquad \mbox{along any reversible path}
$} %
\end{displaymath} (11.18)
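As a simple illustration of (11.18), and assuming a substance whose heat capacity $C$ may be taken constant, reversibly heating it from the reference temperature $T_{\rm ref}$ to a temperature $T$ adds heat $\delta Q = C\,{\rm d}T$ in each step, so

\begin{displaymath}
S = S_{\rm ref} + \int_{T_{\rm ref}}^{T} \frac{C\,{\rm d}T}{T}
= S_{\rm ref} + C \ln\frac{T}{T_{\rm ref}}
\end{displaymath}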

Figure 11.14: Comparison of two different integration paths for finding the entropy of a desired state. The two different integration paths are in black and the yellow lines are reversible adiabatic process lines. [The pressure-volume diagram itself is not reproduced here.]

The entropy as defined above is a specific number for a system in thermal equilibrium, just like its pressure, temperature, particle density, and internal energy are specific numbers. You might think that you could get a different value for the entropy by following a different process path from the reference state to the desired state. But the second law prevents that. To see why, consider the pressure-volume diagram in figure 11.14. Two different reversible processes are shown leading from the reference state to a desired state. A bundle of reversible adiabatic process lines is also shown; those are graphical representations of processes in which there is no heat exchange between the system and its surroundings. The bundle of adiabatic lines chops the two process paths into small pieces, of almost constant temperature, that pairwise have the same value of $\delta Q/T$. For, if a piece like AB would have a lower value for $\delta Q/T$ than the corresponding piece CD, then a heat engine running the cycle CDBAC would lose less of the heat $\delta Q_{\rm H}$ at the low temperature side than the Carnot ideal, hence have a higher efficiency than Carnot and that is not possible. Conversely, if AB would have a higher value for $\delta Q/T$ than CD, then a refrigeration device running the cycle ABDCA would remove more heat from the low side than Carnot, again not possible. So all the little segments pairwise have the same value for $\delta Q/T$, which means the complete integrals must also be the same. It follows that the entropy for a system in thermal equilibrium is uniquely defined.

So what happens if the reference and final states are still the same, but there is a slight glitch for a single segment AB, making the process over that one segment irreversible? In that case, the heat engine argument no longer applies, since it runs through the segment AB in reversed order, and irreversible processes cannot be reversed. The refrigeration cycle argument says that the amount of heat $\delta Q$ absorbed by the system will be less; more of the heat $\delta Q$ going out at the high temperature side CD will come from the work done, and less from the heat removed at the cold side. The final entropy is still the same, because it only depends on the final state, not on the path to get there. So during the slight glitch, the entropy of the system increased by more than $\delta Q/T$. In general:

\begin{displaymath}
\fbox{$\displaystyle
{\rm d}S \geqslant \frac{\delta Q}{T}
$} %
\end{displaymath} (11.19)

where = applies if the change is reversible and $>$ if it is not.

Note that the above formula is only valid if the system has a definite temperature, as in this particular example. Typically this is simply not true in irreversible processes; for example, the interior of the system might be hotter than the outside. The real importance of the above formula is to confirm that the defined entropy is indeed a measure of messiness and not of order; reversible processes merely shuffle entropy around from one system to the next, but irreversible processes increase the net entropy content in the universe.
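For example, if an amount of heat $\delta Q$ leaks irreversibly from a hot body at temperature $T_{\rm H}$ to a colder body at $T_{\rm L}$, each body by itself having a definite temperature, then the hot body loses entropy $\delta Q/T_{\rm H}$ while the cold body gains the larger amount $\delta Q/T_{\rm L}$; the net entropy in the universe goes up:

\begin{displaymath}
{\rm d}S_{\rm net} = \frac{\delta Q}{T_{\rm L}} - \frac{\delta Q}{T_{\rm H}} > 0
\qquad \mbox{since } T_{\rm L} < T_{\rm H}
\end{displaymath}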

So what about the entropy of a system that is not in thermal equilibrium? Equation (11.18) only applies for systems in thermal equilibrium. In order for nature not to become confused in its entropy accounting system, surely entropy must still have a numerical value for nonequilibrium systems. If the problem is merely temperature or pressure variations, where the system is still in approximate thermal equilibrium locally, you could just integrate the entropy per unit volume over the volume. But if the system is not in thermal equilibrium even on macroscopically small scales, it gets much more difficult. For example, air crossing a typical shock wave (sonic boom) experiences a significant increase in pressure over an extremely short distance. Better bring out the quantum mechanics trick box. Or at least molecular dynamics.

Still, some important general observations can be made without running to a computer. An “isolated” system is a system that does not interact with its surroundings in any way. Remember the example where the air inside a room was collected and neatly put inside a glass? That was an example of an isolated system. Presumably, the doors of the room were hermetically sealed. The walls of the room are stationary, so they do not perform work on the air in the room. And the air comes rushing back out of the glass so quickly that there is really no time for any heat conduction through the walls. If there is no heat conduction with the outside, then there is no entropy exchange with the outside. So the entropy of the air can only increase due to irreversible effects. And that is exactly what happens: the air exploding out of the glass is highly irreversible, (no, it has no plans to go back in), and its entropy increases rapidly. Quite quickly however, the air spreads out again over the entire room and settles down. Beyond that point, the entropy remains constant.
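To put a rough number on that increase: treating the air as an ideal gas, its temperature returns to its original value after it has settled down, since no net heat or work was exchanged with the surroundings. Integrating (11.18) along a reversible isothermal replacement path between the initial state (all air in the glass, volume $V_1$) and the final state (air spread over the room, volume $V_2$) then gives the standard free-expansion result

\begin{displaymath}
S_2 - S_1 = m R \ln\frac{V_2}{V_1} > 0
\qquad \mbox{$m$: mass of air, $R$: its gas constant}
\end{displaymath}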

An isolated system evolves to the state of maximum possible entropy and then stays there.
The state of maximum possible entropy is the thermodynamically stable state a system will assume if left alone.

A more general system is an “adiabatic” or “insulated” system. Work may be performed on such a system, but there is still no heat exchange with the surroundings. That means that the entropy of such a system can again only increase, due to irreversibility. A simple example is a thermos bottle with a cold drink inside. If you continue shaking this thermos bottle violently, the cold drink will heat up due to its viscosity, its internal friction, and it will not stay a cold drink for long. Its entropy will increase while you are shaking it.

The entropy of adiabatic systems can only increase.
But, of course, that of an open system can decrease. It is the recipe of life, {N.25}.

You might wonder why this book on quantum mechanics included a concise, but still very lengthy classical description of the second law. It is because the evidence for the second law is so much more convincing macroscopically than microscopically. Macroscopically, the most complex systems can be accurately observed; microscopically, the quantum mechanics of only the most simplistic systems can be rigorously solved. And whether we can observe the solution is still another matter.

However, given the macroscopic fact that there really is an accounting measure of messiness called entropy, the question becomes what is its actual microscopic nature? Surely, it must have a relatively simple explanation in terms of the basic microscopic physics? For one, nature never seems to get confused about what it is, and for another, you really would expect something that is clearly so fundamental to nature to be relatively esthetic when expressed in terms of mathematics.

And that thought is all that is needed to guess the true microscopic nature of entropy. And guessing is good, because it gives a lot of insight into why entropy is what it is. And to ensure that the final result is really correct, it can be cross-checked against the macroscopic definition (11.18) and other known facts about entropy.

The first guess is about what physical microscopic quantity would be involved. Now microscopically, a simple system is described by energy eigenfunctions $\psi^{\rm S}_q$, and there is nothing messy about those. They are the systematic solutions of the Hamiltonian eigenvalue problem. But these eigenfunctions have probabilities $P_q$, being the square magnitudes of their coefficients, and they are a different story. A system of a given energy could in theory exist neatly as a single energy eigenfunction with that energy. But according to the fundamental assumption of quantum statistics, this simply does not happen. In thermal equilibrium, every single energy eigenfunction of the given energy achieves about the same probability. Instead of nature neatly leaving the system in the single eigenfunction it may have started out with, it gives every Johnny-come-lately state about the same probability, and it becomes a mess.

If the system is in a single eigenstate for sure, the probability $P_q$ of that one eigenstate is one, and all others are zero. But if the probabilities are equally spread out over a large number, call it $N$, of eigenfunctions, then each eigenfunction receives a probability $P_q = 1/N$. So your simplest thought would be that maybe entropy is the average value of the probability. In particular, just like the average energy is $\sum P_q E^{\rm S}_q$, the average probability would be $\sum P_q^2$. It is always the sum of the values for which you want the average times their probability. Your second thought would be that since $\sum P_q^2$ is one for the single eigenfunction case, and $1/N$ for the spread out case, maybe the entropy should be $-\sum P_q^2$ in order that the single eigenfunction case has the lower value of messiness. But macroscopically it is known that you can keep increasing entropy indefinitely by adding more and more heat, and the given expression starts at minus one and never gets above zero.

So try a slightly more general possibility, that the entropy is the average of some function of the probability, as in $S = \sum P_q f(P_q)$. The question is then, what function? Well, macroscopically it is also known that entropy is additive, the values of the entropies of two systems simply add up. It simplifies nature’s task of maintaining a tight accounting system on messiness. For two systems with probabilities $P_q$ and $P_r$,

\begin{displaymath}
S = \sum_q P_q f(P_q) + \sum_r P_r f(P_r)
\end{displaymath}

This can be rewritten as

\begin{displaymath}
S = \sum_q \sum_r P_q P_r f(P_q) + \sum_q \sum_r P_q P_r f(P_r)
\end{displaymath}

since probabilities by themselves must sum to one. On the other hand, if you combine two systems, the probabilities multiply, just like the probability of throwing a 3 with your red die and a 4 with your black die is $\frac16$ $\times$ $\frac16$. So the combined entropy should also be equal to

\begin{displaymath}
S = \sum_q\sum_r P_qP_r f(P_qP_r)
\end{displaymath}

Comparing this with the previous equation, you see that $f(P_qP_r)$ must equal $f(P_q)+f(P_r)$. The function that does that is the logarithmic function. More precisely, you want minus the logarithmic function, since the logarithm of a small probability is a large negative number, and you need a large positive messiness if the probabilities are spread out over a large number of states. Also, you will need to throw in a factor to ensure that the units of the microscopically defined entropy are the same as the ones in the macroscopic definition. The appropriate factor turns out to be the Boltzmann constant $k_{\rm B} = 1.380\,65 \times 10^{-23}$ J/K; note that this factor has absolutely no effect on the physical meaning of entropy; it is just a matter of agreeing on units.
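As a quick check that minus the logarithm, times the constant, does the job:

\begin{displaymath}
f(P_qP_r) = -k_{\rm B}\ln(P_qP_r) = -k_{\rm B}\ln(P_q) - k_{\rm B}\ln(P_r) = f(P_q)+f(P_r)
\end{displaymath}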

The microscopic definition of entropy has been guessed:

\begin{displaymath}
\fbox{$\displaystyle
S = - k_{\rm B}\sum P_q \ln(P_q)
$} %
\end{displaymath} (11.20)

That wasn’t too bad, was it?
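For example, a system spread out equally over just two energy eigenfunctions, $P_1 = P_2 = \frac12$, has according to (11.20)

\begin{displaymath}
S = -k_{\rm B}\left({\textstyle\frac12}\ln{\textstyle\frac12}
+ {\textstyle\frac12}\ln{\textstyle\frac12}\right)
= k_{\rm B}\ln 2 \approx 0.96 \times 10^{-23} \mbox{ J/K}
\end{displaymath}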

At absolute zero temperature, the system is in the ground state. That means that probability $P_q$ of the ground state is 1 and all other probabilities are zero. Then the entropy is zero, because $\ln(1) = 0$. The fact that the entropy is zero at absolute zero is known as the “third law of thermodynamics,” {A.35}.

At temperatures above absolute zero, many eigenfunctions will have nonzero probabilities. That makes the entropy positive, because logarithms of numbers less than one are negative. (It should be noted that $P_q\ln P_q$ becomes zero when $P_q$ becomes zero; the blow up of $\ln P_q$ is no match for the reduction in magnitude of $P_q$. So highly improbable states will not contribute significantly to the entropy despite their relatively large values of the logarithm.)
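That limit can be verified with l'Hôpital's rule:

\begin{displaymath}
\lim_{P_q\downarrow 0} P_q\ln(P_q)
= \lim_{P_q\downarrow 0} \frac{\ln(P_q)}{1/P_q}
= \lim_{P_q\downarrow 0} \frac{1/P_q}{-1/P_q^2}
= \lim_{P_q\downarrow 0} \left(-P_q\right) = 0
\end{displaymath}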

To put the definition of entropy on a less abstract basis, assume that you schematize the system of interest into unimportant eigenfunctions that you give zero probability, and a remaining $N$ important eigenfunctions that all have the same average probability $1/N$. Sure, it is crude, but it is just to get an idea. In this simple model, the entropy is $k_{\rm B}\ln(N)$, proportional to the logarithm of the number of quantum states that have an important probability. The more states, the higher the entropy. This is what you will find in popular expositions. And it would actually be correct for systems with zero indeterminacy in energy, if they existed.
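Indeed, with $N$ equally probable states of probability $1/N$ each, (11.20) gives

\begin{displaymath}
S = -k_{\rm B}\sum_{q=1}^N \frac1N \ln\frac1N = -k_{\rm B}\ln\frac1N = k_{\rm B}\ln(N)
\end{displaymath}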

The next step is to check the expression. Derivations are given in {D.60}, but here are the results. For systems in thermal equilibrium, is the entropy the same as the one given by the classical integration (11.18)? Check. Does the entropy exist even for systems that are not in thermal equilibrium? Check, quantum mechanics still applies. For a system of given energy, is the entropy smallest when the system is in a single energy eigenfunction? Check, it is zero then. For a system of given energy, is the entropy the largest when all eigenfunctions of that energy have the same probability, as the fundamental assumption of quantum statistics suggests? Check. For a system with given expectation energy but uncertainty in energy, is the entropy highest when the probabilities are given by the canonical probability distribution? Check. For two systems in thermal contact, is the entropy greatest when their temperatures have become equal? Check.

Feynman [18, p. 8] gives an argument to show that the entropy of an isolated system always increases with time. Taking the time derivative of (11.20),

\begin{displaymath}
\frac{{\rm d}S}{{\rm d}t} = -k_{\rm B}\sum_q [\ln(P_q)+1] \frac{{\rm d}P_q}{{\rm d}t}
= -k_{\rm B}\sum_q \sum_r [\ln(P_q)+1] R_{qr} [P_r-P_q],
\end{displaymath}

the final equality being from time-dependent perturbation theory, with $R_{qr} = R_{rq} > 0$ the transition rate from state $q$ to state $r$. In the double summation, a typical term with indices $q$ and $r$ combines with the term having the reversed indices as

\begin{displaymath}
k_{\rm B}[\ln(P_r) + 1 - \ln(P_q) - 1] R_{qr} [P_r-P_q]
\end{displaymath}

and that is never negative because the terms in the square brackets have the same sign: if $P_q$ is greater/less than $P_r$ then so is $\ln(P_q)$ greater/less than $\ln(P_r)$. However, given the dependence of time-dependent perturbation theory on linearization and worse, the measurement wild card, chapter 7.6, you might consider this more a validation of time-dependent perturbation theory than of the expression for entropy. Then there is the problem of ensuring that a perturbed and measured system is adiabatic.
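In terms of the complete double sum, averaging the original form of ${\rm d}S/{\rm d}t$ with the form obtained by swapping the summation indices $q$ and $r$ gives

\begin{displaymath}
\frac{{\rm d}S}{{\rm d}t} = \frac{k_{\rm B}}{2} \sum_q \sum_r
R_{qr}\,[\ln(P_r)-\ln(P_q)]\,[P_r-P_q] \geqslant 0
\end{displaymath}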

In any case, it may be noted that the checks on the expression for entropy, as given above, cut both ways. If you accept the expression for entropy, the canonical probability distribution follows. They are consistent, and in the end, it is just a matter of which of the two postulates you are more willing to accept as true.