D.60 Checks on the expression for entropy

According to the microscopic definition, the differential of the entropy $S$ should be

\begin{displaymath}
{\rm d}S = -k_{\rm B}{\rm d}\left[\sum_q P_q \ln P_q\right]
\end{displaymath}

where the sum is over all system energy eigenfunctions $\psi^{\rm S}_q$ and $P_q$ is their probability. The differential can be simplified to

\begin{displaymath}
{\rm d}S = - k_{\rm B}\sum_q \left[\ln P_q + 1\right]{ \rm d}P_q
= - k_{\rm B}\sum_q \ln P_q{ \rm d}P_q,
\end{displaymath}

the latter equality since the sum of the probabilities is always one, so $\sum_q{\rm d}P_q = 0$.
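In case the first equality is not immediate, it comes from applying the product rule to each term of the sum:

\begin{displaymath}
{\rm d}\left[P_q \ln P_q\right]
= \ln P_q\,{\rm d}P_q + P_q \frac{{\rm d}P_q}{P_q}
= \left[\ln P_q + 1\right]{\rm d}P_q .
\end{displaymath}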

This is to be compared with the macroscopic differential for the entropy. Since the macroscopic expression requires thermal equilibrium, $P_q$ in the microscopic expression above can be equated to the canonical value $e^{-{\vphantom' E}^{\rm S}_q/k_{\rm B}T}/Z$, where ${\vphantom' E}^{\rm S}_q$ is the energy of system eigenfunction $\psi^{\rm S}_q$. That simplifies the microscopic differential of the entropy to

\begin{displaymath}
{\rm d}S
= - k_{\rm B}\sum_q \left[-\frac{{\vphantom' E}^{\rm S}_q}{k_{\rm B}T} - \ln Z\right]{\rm d}P_q
= \frac{1}{T} \sum_q {\vphantom' E}^{\rm S}_q{\rm d}P_q, %
\end{displaymath} (D.38)

the second equality since $Z$ is a constant in the summation and $\sum_q{\rm d}P_q = 0$.
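Here the bracketed factor is just $\ln P_q$ evaluated for the canonical probabilities; written out,

\begin{displaymath}
\ln P_q = \ln\left[\frac{e^{-{\vphantom' E}^{\rm S}_q/k_{\rm B}T}}{Z}\right]
= -\frac{{\vphantom' E}^{\rm S}_q}{k_{\rm B}T} - \ln Z .
\end{displaymath}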

The macroscopic expression for the differential of entropy is given by (11.18),

\begin{displaymath}
{\rm d}S = \frac{\delta Q}{T}.
\end{displaymath}

Substituting in the differential first law (11.11),

\begin{displaymath}
{\rm d}S = \frac{1}{T}{ \rm d}E + \frac{1}{T}P { \rm d}V
\end{displaymath}

and plugging into that the definitions of $E$ and $P$,

\begin{displaymath}
{\rm d}S = \frac{1}{T}{\rm d}\left[\sum_q P_q {\vphantom' E}^{\rm S}_q\right]
- \frac{1}{T}\left[\sum_q P_q
\frac{{\rm d}{\vphantom' E}^{\rm S}_q}{{\rm d}V}\right]{\rm d}V
\end{displaymath}

and differentiating out the product in the first term, one part drops out against the second term, as written out below, and what is left is the differential for $S$ according to the microscopic definition (D.38). So the macroscopic and microscopic definitions agree on the entropy to within a constant. That means that they agree completely, because the macroscopic definition has no clue about the constant.
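In case that step is not obvious: since the energy eigenvalues change only through the volume, ${\rm d}{\vphantom' E}^{\rm S}_q = ({\rm d}{\vphantom' E}^{\rm S}_q/{\rm d}V)\,{\rm d}V$, so differentiating out the product in the first term gives

\begin{displaymath}
\frac{1}{T}{\rm d}\left[\sum_q P_q {\vphantom' E}^{\rm S}_q\right]
= \frac{1}{T}\sum_q {\vphantom' E}^{\rm S}_q{\rm d}P_q
+ \frac{1}{T}\left[\sum_q P_q
\frac{{\rm d}{\vphantom' E}^{\rm S}_q}{{\rm d}V}\right]{\rm d}V
\end{displaymath}

The final term cancels the second term in the expression for ${\rm d}S$ above, leaving (D.38).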

Now consider the case of a system with zero indeterminacy in energy. According to the fundamental assumption, all the eigenfunctions with the correct energy should have the same probability in thermal equilibrium. From the entropy's point of view, thermal equilibrium should be the stable, most messy state, the one having the maximum entropy. For the two views to agree, the maximum of the microscopic expression for the entropy should occur when all eigenfunctions of the given energy have the same probability. Restricting attention to only the energy eigenfunctions $\psi^{\rm S}_q$ with the correct energy, the maximum entropy occurs when the derivatives of

\begin{displaymath}
F = - k_{\rm B}\sum_q P_q \ln P_q -\epsilon \left(\sum_q P_q -1\right)
\end{displaymath}

with respect to the $P_q$ are zero. Note that the constraint that the sum of the probabilities must be one has been added as a penalty term with a Lagrangian multiplier, {D.48}. Taking derivatives produces

\begin{displaymath}
- k_{\rm B}\ln(P_q) -k_{\rm B}- \epsilon = 0
\end{displaymath}

showing that, yes, all the $P_q$ have the same value at the maximum entropy. (Note that the minima in entropy, all $P_q$ zero except one, do not show up in the derivation; $P_q\ln P_q$ is zero when $P_q = 0$, but its derivative does not exist there. In fact, the infinite derivative can be used to verify that no maxima exist with any of the $P_q$ equal to zero, if you are worried about that.)
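Solving the derivative condition explicitly confirms this:

\begin{displaymath}
\ln P_q = -1 - \frac{\epsilon}{k_{\rm B}}
\quad\Longrightarrow\quad
P_q = e^{-1-\epsilon/k_{\rm B}}
\end{displaymath}

which is the same value for every eigenfunction $q$, whatever the multiplier $\epsilon$ turns out to be; the constraint that the probabilities sum to one then fixes that common value to one divided by the number of eigenfunctions with the given energy.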

If the energy is uncertain, and only the expectation energy is known, the penalized function becomes

\begin{displaymath}
F = - k_{\rm B}\sum_q P_q \ln P_q
- \epsilon_1 \left(\sum_q P_q - 1\right)
- \epsilon_2 \left(\sum_q {\vphantom' E}^{\rm S}_q P_q - E\right)
\end{displaymath}

and the derivatives become

\begin{displaymath}
- k_{\rm B}\ln(P_q) -k_{\rm B}- \epsilon_1 -\epsilon_2 {\vphantom' E}^{\rm S}_q = 0
\end{displaymath}

which can be solved to show that

\begin{displaymath}
P_q = C_1 e^{-{\vphantom' E}^{\rm S}_q/C_2}
\end{displaymath}

with $C_1$ and $C_2$ constants. The requirement to conform with the given definition of temperature identifies $C_2$ as $k_{\rm B}T$, and the fact that the probabilities must sum to one identifies $C_1$ as $1/Z$.
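For reference, the intermediate steps in that solution are

\begin{displaymath}
\ln P_q = -1 - \frac{\epsilon_1}{k_{\rm B}}
- \frac{\epsilon_2}{k_{\rm B}} {\vphantom' E}^{\rm S}_q
\quad\Longrightarrow\quad
P_q = e^{-1-\epsilon_1/k_{\rm B}}\,
e^{-(\epsilon_2/k_{\rm B}){\vphantom' E}^{\rm S}_q}
\end{displaymath}

so that $C_1 = e^{-1-\epsilon_1/k_{\rm B}}$ and $C_2 = k_{\rm B}/\epsilon_2$; the identification $C_2 = k_{\rm B}T$ then corresponds to $\epsilon_2 = 1/T$.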

For two systems $A$ and $B$ in thermal contact, the probabilities of the combined system energy eigenfunctions are found as the products of the probabilities of those of the individual systems. The maximum of the combined entropy, constrained by the given total energy $E$, is then found by differentiating

\begin{eqnarray*}
F &=& - k_{\rm B}\sum_{q_A}\sum_{q_B} P_{q_A}P_{q_B} \ln(P_{q_A}P_{q_B})
- \epsilon_{1,A} \left(\sum_{q_A} P_{q_A} - 1\right)
- \epsilon_{1,B} \left(\sum_{q_B} P_{q_B} - 1\right)
\\ && \mbox{}
- \epsilon_2 \left(\sum_{q_A} P_{q_A} {\vphantom' E}^{\rm S}_{q_A} + \sum_{q_B} P_{q_B} {\vphantom' E}^{\rm S}_{q_B} - E\right)
\end{eqnarray*}

$F$ can be simplified by taking apart the logarithm and noting that the probabilities $P_{q_A}$ and $P_{q_B}$ sum to one, to give

\begin{eqnarray*}
F &=&
- k_{\rm B}\sum_{q_A} P_{q_A} \ln(P_{q_A})
- k_{\rm B}\sum_{q_B} P_{q_B} \ln(P_{q_B})
- \epsilon_{1,A} \left(\sum_{q_A} P_{q_A} - 1\right)
- \epsilon_{1,B} \left(\sum_{q_B} P_{q_B} - 1\right)
\\ && \mbox{}
- \epsilon_2 \left(\sum_{q_A} P_{q_A} {\vphantom' E}^{\rm S}_{q_A} + \sum_{q_B} P_{q_B} {\vphantom' E}^{\rm S}_{q_B} - E\right)
\end{eqnarray*}

Differentiation now produces

\begin{eqnarray*}
- k_{\rm B}\ln(P_{q_A}) - k_{\rm B} - \epsilon_{1,A} - \epsilon_2 {\vphantom' E}^{\rm S}_{q_A} &=& 0
\\
- k_{\rm B}\ln(P_{q_B}) - k_{\rm B} - \epsilon_{1,B} - \epsilon_2 {\vphantom' E}^{\rm S}_{q_B} &=& 0
\end{eqnarray*}

which produces $P_{q_A} = C_{1,A}e^{-{\vphantom' E}^{\rm S}_{q_A}/C_2}$ and $P_{q_B} = C_{1,B}e^{-{\vphantom' E}^{\rm S}_{q_B}/C_2}$, and the common constant $C_2$ then implies that the two systems have the same temperature.
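In terms of the Lagrangian multipliers, those solutions read

\begin{displaymath}
P_{q_A} = e^{-1-\epsilon_{1,A}/k_{\rm B}}\,
e^{-(\epsilon_2/k_{\rm B}){\vphantom' E}^{\rm S}_{q_A}}
\qquad
P_{q_B} = e^{-1-\epsilon_{1,B}/k_{\rm B}}\,
e^{-(\epsilon_2/k_{\rm B}){\vphantom' E}^{\rm S}_{q_B}}
\end{displaymath}

Since the multiplier $\epsilon_2$ of the shared energy constraint is the same for both systems, so is $C_2 = k_{\rm B}/\epsilon_2$; with the earlier identification $C_2 = k_{\rm B}T$, that means $T_A = T_B$.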