11.15 Specific Heats

The specific heat of a substance describes its absorption of heat in terms of its temperature change. In particular, the specific heat at constant volume, $C_v$, of a substance is the thermal energy that gets stored internally in the substance per unit temperature rise and per unit amount of substance.

As a first example, consider simple monatomic ideal gases, and in particular noble gases. Basic physics, or section 11.14.4, shows that for an ideal gas, the molecules have $\frac12{k_{\rm B}}T$ of translational kinetic energy in each of the three directions of a Cartesian coordinate system, where $k_{\rm B} = 1.38\times10^{-23}$ J/K is Boltzmann’s constant. So the specific heat per molecule is $\frac32{k_{\rm B}}$ or $1.5{k_{\rm B}}$. For a kmol (i.e. $6.02\times10^{26}$) of molecules instead of one, $k_{\rm B}$ becomes the “universal gas constant” $R_{\rm{u}} = 8.31$ kJ/kmol K. Hence for a

\begin{displaymath}
\mbox{monatomic ideal gas: }
\bar{C}_v = 1.5 R_{\rm {u}} \approx 12.5\mbox{ kJ/kmol K}
\end{displaymath} (11.63)

on a kmol basis. As figure 11.15 shows, this is very accurate for noble gases, including helium. (To get the more usual specific heat $C_v$ per kilogram instead of kmol, divide by the molar mass $M$. For example, for helium with two protons and two neutrons in its nucleus, the molar mass is about 4 kg/kmol, so divide by 4. In thermo books, you will probably find the molar mass values you need mislisted as molecular mass, without units. Just use the values and ignore the name and the missing units of kg/kmol. See the notations for more.)
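
As a quick worked example of that division (arithmetic only, using the values already quoted above), helium with $M \approx 4$ kg/kmol gives

\begin{displaymath}
C_v = \frac{\bar{C}_v}{M} \approx \frac{12.5\mbox{ kJ/kmol K}}{4\mbox{ kg/kmol}}
\approx 3.1\mbox{ kJ/kg K.}
\end{displaymath}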

Many important ideal gases, such as hydrogen, as well as the oxygen and nitrogen that make up air, are diatomic. Now if we assume that the two atoms are point-size masses somehow rigidly connected to each other, we still have that the center of the entire molecule can move in three different directions, accounting for $\frac32{k_{\rm B}}$ of specific heat as before. But at any given time, the molecule can also be rotating around its center in two independent directions, both orthogonal to the connecting line between the atoms. (For point masses, rotation around the connecting axis would not do anything.) Classical physics, in particular the “equipartition theorem,” would then predict that each of the two rotational motions has $\frac12{k_{\rm B}}T$ of kinetic energy too, raising the total specific heat to $\frac52{k_{\rm B}}$ or $2.5{k_{\rm B}}$. Well, figure 11.15 shows that at room temperature, about 300 K, this is quite accurate for common diatomic gases like the nitrogen and oxygen in air.

But note that these experimental data show that there are problems, both at very low temperatures and at very high ones. And there are major theoretical problems too. Surely the connection between the atoms is not going to be infinitely rigid. Allowing for that, we now have two individual atoms that can each move in three different directions independently of each other. That raises the kinetic energy to $\frac62{k_{\rm B}}T$. And assuming that the connecting force varies linearly with elongation, there would be another $\frac12{k_{\rm B}}T$ of potential energy, making the total energy $\frac72{k_{\rm B}}T$ and the specific heat $\frac72{k_{\rm B}}$. But figure 11.15 shows that the common diatomic gases only approach values like that well above room temperature.
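
To restate the classical bookkeeping explicitly, each independent quadratic term in the energy contributes $\frac12{k_{\rm B}}$ to the specific heat per molecule:

\begin{displaymath}
\underbrace{6 \times \frac12 k_{\rm B}}_{\mbox{kinetic: 2 atoms, 3 directions each}}
+ \underbrace{1 \times \frac12 k_{\rm B}}_{\mbox{elastic potential}}
= \frac72 k_{\rm B} \mbox{ per molecule, i.e. } \frac72 R_{\rm u} \mbox{ per kmol.}
\end{displaymath}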

It was all a big problem for classical physics. Not to mention that, as Maxwell noted, if you really take classical theory at face value, things get far, far worse still, since the individual internal parts of the atoms, like the individual electrons and the quarks in the nuclei, would each have to absorb their own thermal energy too. That should produce enormously high specific heats.

Figure 11.15: Specific heat at constant volume of gases. Temperatures from absolute zero to 1,200 K. Data from NIST-JANAF and AIP.
\begin{figure}\centering
[Plot of $\bar{C}_v$ versus temperature; the lowest curve is labeled He, Ne, Ar, Kr, \ldots]
\end{figure}

Hydrogen in particular was a mystery before the advent of quantum mechanics: at low temperatures it would behave as a monatomic gas, with a specific heat of $\frac32{k_{\rm B}}$ per molecule, figure 11.15. That meant that the molecule had to be translating only, like a monatomic gas. How could the random thermal motion not cause any angular rotation of the two atoms around their mutual center of gravity, nor vibration of the atoms towards and away from each other?

Quantum mechanics solved this problem. In quantum mechanics the angular momentum of the molecule, and so the corresponding kinetic energy, as well as the harmonic oscillation energy, are quantized. For hydrogen at low temperatures, the typical available thermal energy $\frac12{k_{\rm B}}T$ is not enough to reach even the first level above the ground state for either energy. No thermal energy can therefore be put into rotation of the molecule, nor into internal vibration. So hydrogen does indeed have the specific heat of monatomic gases at low temperatures, weird as it may seem. The rotational and vibrational motions are frozen out.
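
A rough estimate shows the numbers involved. Using an approximate H$_2$ bond length of 0.074 nm and the rigid-rotor energy levels $E_l = \hbar^2\,l(l+1)/(2I)$ (values and formula not given in this section, so take this as an illustrative sketch only), the first rotational level lies at

\begin{displaymath}
E_1 = \frac{\hbar^2}{I}, \qquad
I = \mu d^2 \approx (0.84\times 10^{-27}\mbox{ kg})(0.074\times 10^{-9}\mbox{ m})^2
\approx 4.6\times 10^{-48}\mbox{ kg m}^2,
\end{displaymath}

giving $E_1 \approx 2.4\times 10^{-21}$ J, comparable to $k_{\rm B}T$ only at temperatures of the order of a hundred kelvin or more. Well below that, the available thermal energy cannot excite the rotation.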

At normal temperatures, there is enough thermal energy to reach the states in which the molecule rotates around the axes normal to the line connecting the atoms, and the specific heat becomes

\begin{displaymath}
\mbox{typical diatomic ideal gas: }
\bar{C}_v = 2.5 R_{\rm {u}} \approx 20.8 \mbox{ kJ/kmol K}.
\end{displaymath} (11.64)

Actual values for hydrogen, nitrogen and oxygen at room temperature are 2.47, 2.50, and 2.53 $R_{\rm{u}}$.

For high enough temperatures, the vibrational modes will start becoming active, and the specific heats will start inching up towards 3.5 $R_{\rm{u}}$ (and beyond), figure 11.15. But it takes temperatures of about 1,000 K (hydrogen), 600 K (nitrogen), or 400 K (oxygen) before there is a 5% deviation from the 2.5 $R_{\rm{u}}$ value.

These differences may be understood qualitatively if the motion is modeled as a simple harmonic oscillator as discussed in chapter 4.1. The energy levels of a harmonic oscillator are spaced apart by an amount $\hbar\omega$, where $\omega$ is the angular frequency. And the frequency of a harmonic oscillator is $\omega = \sqrt{c/m}$, where $c$ is the effective stiffness and $m$ the effective mass of the vibrational motion. So light atoms that are bound together tightly will require a lot of thermal energy to reach the first nontrivial vibrational state. Hydrogen is much lighter than nitrogen or oxygen, so the required energy $\hbar\omega$ should be quite large. This explains the high temperature before vibration becomes important for hydrogen. The molar masses of nitrogen and oxygen are similar, but nitrogen is bound with a triple bond, and oxygen only with a double one. So nitrogen has the higher effective stiffness of the two and vibrates less readily.
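
As a crude scaling sketch of the mass effect alone (ignoring the differences in bond stiffness, so this is only indicative of the trend), the effective mass is the reduced mass of the atom pair, and

\begin{displaymath}
\frac{\omega_{\rm H_2}}{\omega_{\rm N_2}} \sim
\sqrt{\frac{\mu_{\rm N_2}}{\mu_{\rm H_2}}} =
\sqrt{\frac{14\mbox{ amu}/2}{1\mbox{ amu}/2}} \approx 3.7,
\end{displaymath}

so on mass grounds alone the hydrogen level spacing $\hbar\omega$ comes out several times larger, pushing the temperature at which vibration activates correspondingly higher.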

Following this reasoning, you would expect fluorine, which is held together with only a single covalent bond, to have a higher specific heat still, and figure 11.15 confirms it. And chlorine and bromine, also held together by a single covalent bond, but heavier than fluorine, approach the classical value 3.5 $R_{\rm{u}}$ fairly closely at normal temperatures: Cl$_2$ has 3.08 $R_{\rm{u}}$ and Br$_2$ 3.34 $R_{\rm{u}}$.

For solids, the basic classical idea in terms of atomic motion would be that there would be $\frac32R_{\rm{u}}$ of kinetic energy and $\frac32R_{\rm{u}}$ of potential energy per kmol of atoms:

\begin{displaymath}
\mbox{law of Dulong and Petit: }
\bar{C}_v = 3 R_{\rm {u}} \approx 25 \mbox{ kJ/kmol K}.
\end{displaymath} (11.65)

Not only is 3 a nice round number, it actually works well for a lot of relatively simple solids at room temperature. For example, aluminum is 2.91 $R_{\rm{u}}$, copper 2.94, gold 3.05, iron 3.02.

Note that typically for solids $\bar{C}_p$, the heat added per unit temperature change at constant pressure, is given instead of $\bar{C}_v$. However, unlike for gases, the difference between $\bar{C}_p$ and $\bar{C}_v$ is small for solids and most liquids and will be ignored here.
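
For reference, the contrast with gases is the standard ideal-gas relation (not derived here)

\begin{displaymath}
\bar{C}_p = \bar{C}_v + R_{\rm u},
\end{displaymath}

a difference of a full gas constant, whereas for typical solids and liquids the difference is only a few percent.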

The law of Dulong and Petit also works for liquid water if you take it per kmol of atoms, rather than per kmol of molecules, but not for ice. Ice has 4.6 $R_{\rm{u}}$ per kmol of molecules and 1.5 $R_{\rm{u}}$ per kmol of atoms. For molecules, certainly there is an obvious problem in deciding how many pieces you need to count as independently moving units. A value of 900 $R_{\rm{u}}$ for paraffin wax (per molecule) found at Wikipedia may sound astonishing, until you find elsewhere at Wikipedia that its chemical formula is C$_{25}$H$_{52}$. Paraffin is still quite capable of storing a lot of heat per unit mass, in any case, but nowhere close to hydrogen. Putting $\frac52{k_{\rm B}}T$ in a molecule with the tiny molecular mass of just about two protons is the real way to get a high heat content per unit mass.
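
A quick worked number for that last remark: with a molar mass of about 2 kg/kmol, hydrogen gas stores

\begin{displaymath}
C_v({\rm H}_2) \approx \frac{2.5\,R_{\rm u}}{2\mbox{ kg/kmol}}
\approx \frac{20.8\mbox{ kJ/kmol K}}{2\mbox{ kg/kmol}} \approx 10\mbox{ kJ/kg K,}
\end{displaymath}

several times the roughly 4.2 kJ/kg K of liquid water.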

Complex molecules may be an understandable problem for the law of Dulong and Petit, but how come diamond has about 0.73 $R_{\rm{u}}$, and graphite 1.02 $R_{\rm{u}}$, instead of the 3 it should have? No molecules are involved there. The values of boron at 1.33 $R_{\rm{u}}$ and beryllium at 1.98 $R_{\rm{u}}$ are much too low too, though not as bad as diamond or graphite.

Figure 11.16: Specific heat at constant pressure of solids. Temperatures from absolute zero to 1,200 K. Carbon is diamond; graphite is similar. Water is ice and liquid. Data from NIST-JANAF, CRC, AIP, Rohsenow et al.
\begin{figure}\centering
[Plot of $\bar{C}_p$ versus temperature; the labeled curves include Pb and H$_2$O]
\end{figure}

Actually, it turns out that at much higher temperatures diamond does agree nicely with the Dulong and Petit value, figure 11.16. Conversely, if the elements that agree well with Dulong and Petit at room temperature are cooled to low temperatures, they too have a specific heat that is much lower than the Dulong and Petit value. For example, at 77 K, aluminum has 1.09 $R_{\rm{u}}$, copper 1.5, and diamond 0.01.

It turns out that for all of them a characteristic temperature can be found above which the specific heat is about the Dulong and Petit value, but below which the specific heat starts dropping precipitously. This characteristic temperature is called the Debye temperature. For example, aluminum, copper, gold, and iron have Debye temperatures of 394, 315, 170, and 460 K, all near or below room temperature, and their room-temperature specific heats agree reasonably with the Dulong and Petit value. Conversely, diamond, boron, and beryllium have Debye temperatures of 1,860, 1,250, and 1,000 K, and their specific heats are much too low at room temperature.

The lack of heat capacity below the Debye temperature is again a matter of frozen-out vibrational modes, like the freezing out of the vibrational modes that gave common diatomic ideal gases a heat capacity of only $\frac52R_{\rm{u}}$ instead of $\frac72R_{\rm{u}}$. Note for example that carbon, boron and beryllium are light atoms, and that the diamond structure is particularly stiff, just the properties that froze out the vibrational modes in diatomic gas molecules too. However, the actual description is more complex than for a gas: if all vibrations were frozen out in a solid, there would be nothing left.

Atoms in a solid cannot be considered independent harmonic oscillators like the pairs of atoms in diatomic molecules. If an atom in a solid moves, its neighbors are affected. The proper way to describe the motion of the atoms is in terms of crystal-wide vibrations, such as those that in normal continuum mechanics describe acoustical waves. There are three variants of such waves, corresponding to the three independent directions the motion of the atoms can take with respect to the propagation direction of the wave. The atoms can move in the same direction, like in the acoustics of air in a pipe, or in a direction normal to it, like surface waves in water. Those are called longitudinal and transverse waves respectively. If there is more than one atom in the basis from which the solid crystal is formed, the atoms in a basis can also vibrate relative to each other’s position in high-frequency vibrations called optical modes. However, after such details are accounted for, the classical internal energy of a solid is still the Dulong and Petit value.

Enter quantum mechanics. Just like quantum mechanics says that the energy of vibrating electromagnetic fields of frequency $\omega$ comes in discrete units called photons, with energy $\hbar\omega$, it says that the energy of crystal vibrations comes in discrete units called phonons with energy $\hbar\omega$. As long as the typical amount of heat energy, ${k_{\rm B}}T$, is larger than the largest of such phonon energies, the fact that the energy levels are discrete makes no real difference, and classical analysis works fine. But for lower temperatures, there is not enough energy to create the high-energy phonons and the specific heat will be less. The representative temperature $T_D$ at which the heat energy ${k_{\rm B}}T_D$ becomes equal to the highest phonon energies $\hbar\omega$ is the Debye temperature. (The Debye analysis is not exact except for low energies, and the definitions of Debye temperature vary somewhat. See section 11.14.6 for more details.)
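
As an order-of-magnitude illustration, using the diamond Debye temperature of 1,860 K quoted above, the highest phonon energies are around

\begin{displaymath}
\hbar\omega_{\rm max} \approx k_{\rm B}T_D
\approx (1.38\times 10^{-23}\mbox{ J/K})(1{,}860\mbox{ K})
\approx 2.6\times 10^{-20}\mbox{ J} \approx 0.16\mbox{ eV,}
\end{displaymath}

much more than the roughly 0.026 eV of $k_{\rm B}T$ at room temperature, so most of diamond’s phonon modes are indeed frozen out at 300 K.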

Quantum mechanics did not just solve the low temperature problems for heat capacity; it also solved the electron problem. That problem was that classically the electrons, at least those in metals, should also have $\frac32{k_{\rm B}}T$ of kinetic energy each, since electrical conduction meant that they moved independently of the atoms. But observations showed that this energy was simply not there. The quantum mechanical explanation was the Fermi-Dirac distribution of figure 6.11: only a small fraction of the electrons have free energy states above them within a distance of order ${k_{\rm B}}T$, and only these can take on heat energy. Since so few electrons are involved, the amount of energy they absorb is negligible except at very low temperatures. At very low temperatures, the energy in the phonons becomes very small, and the conduction electrons in metals then do make a difference.
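
A rough estimate of how small the electron contribution is (taking a representative Fermi energy of a few eV, a value not given in this section): only a fraction of order $k_{\rm B}T/E_{\rm F}$ of the conduction electrons can absorb heat, so their molar specific heat is of order

\begin{displaymath}
\bar{C}_{v,\rm elec} \sim R_{\rm u}\,\frac{k_{\rm B}T}{E_{\rm F}}
\sim R_{\rm u}\,\frac{0.026\mbox{ eV}}{5\mbox{ eV}} \sim 0.005\,R_{\rm u}
\qquad \mbox{at room temperature,}
\end{displaymath}

negligible next to the Dulong and Petit value, but proportional to $T$, so at very low temperatures it eventually overtakes the phonon contribution, which drops off much faster.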

Also, when the heat capacity due to the atom vibrations levels off to the Dulong and Petit value, that of the valence electrons keeps growing. Furthermore, at higher temperatures the increased vibrations lead to increased deviations in potential from the harmonic oscillator relationship. Wikipedia’s Debye model page says anharmonicity causes the heat capacity to rise further; other, apparently authoritative, sources say that it can either increase or decrease the heat capacity. In any case, typical solids do show an increase of the heat capacity above the Dulong and Petit value at higher temperatures, figure 11.16.