Construction of microcanonical entropy on thermodynamic pillars
Abstract
A question that is currently highly debated is whether the microcanonical entropy should be expressed as the logarithm of the phase volume (volume entropy, also known as the Gibbs entropy) or as the logarithm of the density of states (surface entropy, also known as the Boltzmann entropy). Rather than postulating them and investigating the consequences of each definition, as is customary, here we adopt a bottom-up approach and construct the entropy expression within the microcanonical formalism upon two fundamental thermodynamic pillars: (i) the second law of thermodynamics as formulated for quasi-static processes, namely that đQ/T is an exact differential, and (ii) the law of ideal gases, PV = Nk_B T. The first pillar implies that the entropy must be some function of the phase volume Ω. The second pillar singles out the logarithmic function among all possible functions. Hence the construction leads uniquely to the expression S = k_B ln Ω, that is, the volume entropy. As a consequence any entropy expression other than that of Gibbs, e.g., the Boltzmann entropy, can lead to inconsistencies with the two thermodynamic pillars. We illustrate this with the prototypical example of a macroscopic collection of non-interacting spins in a magnetic field, and show that the Boltzmann entropy severely fails to predict the magnetization, even in the thermodynamic limit. The uniqueness of the Gibbs entropy, as well as the demonstrated potential harm of the Boltzmann entropy, provide compelling reasons for discarding the latter at once.
I Introduction
The recent paper by Dunkel and Hilbert titled “Consistent thermostatistics forbids negative absolute temperatures” Dunkel and Hilbert (2014) has triggered a vigorous debate on whether the Boltzmann entropy (alias the surface entropy, Eq. 1) or the Gibbs entropy (alias the volume entropy, Eq. 2) is the more appropriate expression for the thermodynamic entropy of thermally isolated mechanical systems Sokolov (2014); Vilar and Rubi (2014); Frenkel and Warren (2015); Dunkel and Hilbert (2014a); Schneider et al. (2014); Dunkel and Hilbert (2014b); Hilbert et al. (2014); Swendsen and Wang (2014). The thermodynamic consistency of the Gibbs entropy has been a leitmotiv that sporadically recurred in the classical statistical mechanics literature. It started with Helmholtz Helmholtz (1895), Boltzmann Boltzmann (1909), and Gibbs Gibbs (1902), continued with P. Hertz Hertz (1910), Einstein Einstein (1911), and others Schlüter (1948); Münster (1969), and was reprised recently by various authors Berdichevsky et al. (1991); Pearson et al. (1985); Adib (2004); Campisi (2005); Dunkel and Hilbert (2006). This line of research culminated with the work of Ref. Hilbert et al. (2014), showing that the Gibbs entropy complies with all known thermodynamic laws and unveiling the mistakes incurred in the arguments of its opponents Vilar and Rubi (2014); Frenkel and Warren (2015); Schneider et al. (2014); Swendsen and Wang (2014).
While the work of Ref. Hilbert et al. (2014) is characterised by a top-down approach (namely, one postulates an entropy expression and then investigates compliance with the thermodynamic laws) here we adopt instead a bottom-up approach: we begin from the thermodynamic laws and construct the expression of the microcanonical entropy on them. In particular we base our construction on the following two fundamental pillars of thermodynamics. 1) The second law of thermodynamics as formulated by Clausius for quasi-static processes, namely, dS = đQ/T, which says that 1/T is an integrating factor for the heat differential đQ, and identifies the entropy S with the associated primitive function. 2) The equation of state of an ideal gas, PV = Nk_B T.
Our construction, based on the mathematics of differential forms, leads uniquely to the Gibbs entropy; see Sec. III. As a consequence the adoption of any expression of entropy other than the Gibbs entropy, e.g., the Boltzmann entropy, may lead to inconsistency with the fundamental pillars. This will be illustrated with a macroscopic collection of spins in a magnetic field. As we will see the Boltzmann entropy severely fails to predict the correct value of the magnetization, and even predicts a nonexistent phase transition in the thermodynamic limit, see Sec. IV.4. This provides a compelling reason for discarding the Boltzmann entropy at once.
The present work thus complements the work of Ref. Hilbert et al. (2014) by stating not only the compliance of the Gibbs entropy with the thermodynamic laws, but also its necessity and uniqueness: thermodynamic entropy has to be expressed by means of Gibbs formula and no other expression is admissible.
Together with Ref. Hilbert et al. (2014) the present work appears to settle the debated issue.
II Definitions
We recall the definitions of Boltzmann and Gibbs entropies within the microcanonical formalism Campisi (2005):
S_B(E, λ) = k_B ln[ε ω(E, λ)],    (1)
S_G(E, λ) = k_B ln Ω(E, λ),    (2)
where
Ω(E, λ) = Tr θ[E − H(ξ; λ)]    (3)
denotes the volume of the region of the phase space of the system with energy not above E. The symbol ε stands for some arbitrary constant with units of energy. Here H(ξ; λ) denotes the Hamilton function of either a classical or a quantum system with N degrees of freedom, and λ = (λ_1, …, λ_M) denotes the external parameters, e.g. the volume of a vessel containing the system or the value of an applied magnetic or electric field Hilbert et al. (2014); M is their number. In the case of continuous classical systems the symbol Tr stands for an integral over the phase space normalized by the appropriate power of Planck’s constant and possible symmetry factors. For classical discrete systems, Tr denotes a sum over the discrete state space. For quantum systems Tr is the trace over the Hilbert space. The symbol θ stands for the Heaviside step function. The symbol ω stands for the density of states, namely the derivative of Ω with respect to E:
ω(E, λ) = ∂Ω(E, λ)/∂E = Tr δ[E − H(ξ; λ)]    (4)
Here it is assumed that the spectrum is so dense that the density of states can be considered a smooth function of .
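As a concrete numerical illustration (my own sketch, not part of the original text), Ω and ω can be evaluated for a simple discrete spectrum; the spectrum and units below are illustrative assumptions (harmonic-oscillator levels E_n = n + 1/2, with ħν = 1), and Tr reduces to a sum over levels. Note that the finite-difference step must exceed the level spacing, which is precisely the dense-spectrum assumption just stated.

```python
# Minimal sketch: Omega(E) = Tr theta(E - H) and omega = dOmega/dE for a
# discrete spectrum. Assumed illustrative spectrum: E_n = n + 1/2 (hbar*nu = 1).
levels = [n + 0.5 for n in range(1000)]

def Omega(E):
    """Phase volume, Eq. (3): number of states with energy not above E."""
    return sum(1 for e in levels if e <= E)

def omega(E, dE=0.5):
    """Density of states, Eq. (4), as a finite difference of Omega.
    dE must exceed the level spacing for the result to be smooth."""
    return (Omega(E + dE) - Omega(E - dE)) / (2 * dE)

print(Omega(10.0))   # 10 levels (n = 0..9) lie below E = 10
print(omega(10.2))   # about one state per unit energy for this spectrum
```

For this spectrum Ω(E) grows linearly with E, so ω is approximately constant, consistent with Eq. (4).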
III The construction
The main objective is to link the thermodynamic observables, i.e. the forces F_i and the temperature T, to the quantities which naturally pertain to both the mechanical Hamiltonian description and the thermodynamic description, i.e., the energy E and the external parameters λ. As we will see, the entropy will follow automatically and uniquely once the F_i’s and T are linked.
We begin with the thermodynamic forces F_i, whose expression is universally agreed upon Landau and Lifschitz (1969):
F_i = −⟨∂H/∂λ_i⟩    (5)
with ⟨·⟩ denoting the ensemble average. Within the microcanonical framework these are expressed as:
F_i(E, λ) = −Tr δ[E − H(ξ; λ)] (∂H/∂λ_i) / ω(E, λ)    (6)
With the expression of the F_i we can construct the differential form representing heat:
đQ ≐ dE + Σ_i F_i dλ_i    (7)
đQ is a differential form in the (M+1)-dimensional space (E, λ). It is easy to see that, in general, đQ is not an exact differential; see, e.g., Ref. Campisi and Kobe (2010).
Before we proceed it is important to explain the meaning of đQ within the microcanonical formalism. The idea behind the microcanonical ensemble is that E and λ are controllable parameters. (Similarly, T and λ are the controllable parameters in the canonical formalism.) Accordingly, if the system is on an energy surface identified by (E, λ), the idea is that the experimentalist is able to steer it onto a nearby energy shell (E + dE, λ + dλ). In practice this can be a difficult task. It can be accomplished, in principle, in the following way: the experimentalist should first change the parameters by dλ in a quasi-static way. This induces a well defined energy change đW, which is the work done on the system. This brings the system to the energy shell (E + đW, λ + dλ). To bring the system to the target shell (E + dE, λ + dλ) the experimentalist must now provide the energy đQ = dE − đW by other means while keeping the λ fixed. For example she can shine targeted amounts of light on the system, from a light source. After the energy đQ is absorbed by the system (or emitted, depending on its sign), no other interaction occurs and the system continues undisturbed to explore the target shell (E + dE, λ + dλ). In this framework the light source acts as a reservoir of energy, and the quantity đQ, identified as heat, represents the energy it exchanges.
According to the second law of thermodynamics in the formulation given by Clausius, the inverse temperature 1/T is an integrating factor for đQ. This fundamental statement is often called the heat theorem Gallavotti (1999). We recall that an integrating factor is a function β(E, λ) such that β đQ equals the total differential dS of some function S(E, λ), called the associated primitive, or in brief, just the primitive. Primitives are determined up to an unimportant constant, which we will disregard in the following. Entropy is defined in thermodynamics as the primitive associated with Clausius’s integrating factor 1/T Fermi (1956):
dS = đQ/T    (8)
In searching for thermodynamically consistent expressions of temperature within the microcanonical formalism, one should therefore look among the integrating factors of the microcanonically calculated heat differential đQ in (7). It must be remarked that it is not obvious that an integrating factor exists, because the existence of integrating factors is not guaranteed in spaces of dimension higher than two. So the existence of a mechanical expression for the thermodynamic temperature (hence of the entropy) is likewise not obvious.
It turns out however that an integrating factor for the differential đQ in (7) always exists. Finding it is straightforward if one rewrites the forces in the following equivalent form
F_i(E, λ) = (∂Ω/∂λ_i)/(∂Ω/∂E) = (1/ω) ∂Ω/∂λ_i    (9)
This follows from the fact that Dirac’s delta is the derivative of Heaviside’s step function. With this, Eq. (7) reads
đQ = (1/ω)[(∂Ω/∂E) dE + Σ_i (∂Ω/∂λ_i) dλ_i] = dΩ/ω    (10)
It is now evident that ω is an integrating factor:
ω đQ = dΩ,    (11)
Ω being the associated primitive. This does not mean that 1/ω should be identified with the temperature and accordingly Ω with the entropy. In fact if an integrating factor exists, it identifies a whole family of infinitely many integrating factors.
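The statement that đQ is inexact while its rescaled version is exact can be checked numerically. The sketch below is my own illustration, under assumed units (one particle, k_B = 1): for the ideal gas Ω ∝ V E^{3/2}, Eq. (9) gives P = 2E/(3V), and the integrating factor is proportional to 1/E, so the loop integral of đQ is nonzero while that of đQ/T vanishes.

```python
from math import log

# Sketch (illustrative assumptions, not from the paper): ideal monoatomic gas,
# one particle, k_B = 1. Omega ~ V * E**1.5 gives P = 2E/(3V) and
# 1/T = omega/Omega = 3/(2E). Midpoint-rule line integral of
# dQ = dE + P dV around a closed loop in the (E, V) plane.

def loop_integral(weighted, n=20000):
    """Integrate dQ (or dQ/T when weighted=True) around the rectangle
    (1,1) -> (2,1) -> (2,2) -> (1,2) -> (1,1)."""
    legs = [((1, 1), (2, 1)), ((2, 1), (2, 2)), ((2, 2), (1, 2)), ((1, 2), (1, 1))]
    total = 0.0
    for (E0, V0), (E1, V1) in legs:
        dE, dV = (E1 - E0) / n, (V1 - V0) / n
        for k in range(n):
            E = E0 + (E1 - E0) * (k + 0.5) / n   # midpoint of the step
            V = V0 + (V1 - V0) * (k + 0.5) / n
            dQ = dE + (2 * E / (3 * V)) * dV
            total += dQ * (3 / (2 * E) if weighted else 1.0)
    return total

print(loop_integral(False))  # nonzero, close to (2/3) ln 2: dQ is not exact
print(loop_integral(True))   # close to 0: dQ/T is exact, primitive S = ln(V E^{3/2})
```

The nonvanishing unweighted loop integral is the familiar statement that heat is path dependent; the weighted one vanishes because đQ/T = dS is a total differential.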
To find the family of integrating factors, consider any differentiable function g with non-null derivative g′. Its total differential reads:
dg(Ω) = g′(Ω) dΩ = g′(Ω) ω đQ    (12)
This means that any function of the form
β_g ≐ g′(Ω) ω    (13)
is an integrating factor for the heat differential đQ, and g(Ω) is the associated primitive. In fact all integrating factors must be of the form in Eq. (13), which is equivalent to saying that all associated primitives must be of the form
S = g(Ω).    (14)
To prove that all primitives must be of the form in Eq. (14) we consider the adiabatic manifolds, namely the M-dimensional manifolds in the (E, λ) space identified by the condition Ω(E, λ) = const. Note that the density of states is a strictly positive function, ω > 0. This is because increasing the energy results in a strictly larger enclosed volume in the phase space. Thus, the adiabatic manifolds are characterised by the condition đQ = dΩ/ω = 0 (i.e., any path occurring on them involves no heat exchanges), and each value of Ω identifies one and only one adiabatic manifold. Any primitive S associated with an integrating factor β stays constant on the adiabatic manifolds, since dS = β đQ = 0 there, unless β diverges, which we exclude here. Hence the only way by which any primitive S can be constant on all adiabatic manifolds is that S is a function of Ω, as anticipated.
Note that this rules out automatically the surface entropy because, in general, the density of states ω cannot be written as a function of the phase volume Ω. This is clear for example in the case of an ideal monoatomic gas in a vessel of volume V, for which Ω ∝ V^N E^{3N/2} and ω ∝ V^N E^{3N/2 − 1} Khinchin (1949); see below.
Our derivation above tells us that the second law requires that the entropy, which is one of the primitives, has to be a function g(Ω) of the phase volume, but it does not tell us which function that is. For that we need to identify which, among the infinitely many integrating factors β_g, corresponds to Clausius’s notion of temperature. We remark that once the function g is chosen, it has to be one and the same for all systems. This is because by adjusting the external parameters λ, whose number and physical meaning are completely unspecified, one can transform any Hamiltonian into any other. This fact reflects the very essence of Clausius’s heat theorem, namely, that there exists a unique and universal scale of temperature which is one and the same for all systems Weiss (2006).
We proceed then to single out the function g that is consistent with the notion of temperature of an ideal monoatomic gas in a vessel of volume V, taking its equation of state, PV = Nk_B T, as the definition of temperature. The Hamilton function of an ideal monoatomic gas reads
H = Σ_{i=1}^{3N} p_i²/(2m) + φ_box(q; V)    (15)
with φ_box representing the box potential confining the gas within the volume V. The phase volume reads Khinchin (1949)
Ω(E, V) = c_N V^N E^{3N/2},    (16)

where c_N is a constant that depends neither on E nor on V.
Hence, using Eq. (9), we obtain for the pressure, i.e. the force conjugate to V: P = (∂Ω/∂V)/(∂Ω/∂E) = 2E/(3V). Confronting this with the ideal gas law PV = Nk_B T we obtain
k_B T = 2E/(3N)    (17)
consistently with what is known from thermodynamics. Since
ω E = (3N/2) Ω    (18)
in this case, we readily recognize that 1/(k_B T) = ω/Ω, namely,
1/T = k_B ω/Ω = ∂(k_B ln Ω)/∂E    (19)
That is, g(Ω) = k_B ln Ω, which singles out the Gibbs entropy,
S_G(E, λ) = k_B ln Ω(E, λ)    (20)
as the primitive associated with the integrating factor corresponding to the thermodynamic absolute temperature Hilbert et al. (2014).
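The chain of identifications above can be verified numerically. The sketch below is my own check, under assumed units k_B = 1 (the constant c_N drops out of all logarithmic derivatives): differentiating S_G = k_B ln Ω for the ideal-gas phase volume (16) recovers both Eq. (17) and the ideal gas law.

```python
from math import log

# Sketch (assumed units k_B = 1; c_N dropped since only derivatives matter):
# S_G(E, V) = ln Omega = N ln V + (3N/2) ln E for the ideal monoatomic gas.
N = 5

def S_G(E, V):
    return N * log(V) + 1.5 * N * log(E)

def partials(f, x, y, h=1e-6):
    """Central-difference partial derivatives of f(x, y)."""
    return ((f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h))

E, V = 3.0, 2.0
dS_dE, dS_dV = partials(S_G, E, V)
T = 1.0 / dS_dE        # Gibbs temperature, Eq. (19)
P = T * dS_dV          # pressure, from Eq. (9)

print(abs(T - 2 * E / (3 * N)) < 1e-6)   # True: Eq. (17)
print(abs(P * V - N * T) < 1e-6)         # True: ideal gas law P V = N k_B T
```

Any other choice of g would still reproduce the forces (as shown in Sec. IV.4 below) but would break the ideal-gas calibration of the temperature scale.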
IV Discussion
IV.1 Ensemble inequivalence
As mentioned above the density of states is strictly positive, ω > 0; also, by definition, the phase volume Ω is non-negative. Hence their ratio k_B T = Ω/ω is non-negative. This means that, within the microcanonical formalism, negative temperatures are inadmissible. Often the present microcanonical scenario is confused with the more common canonical scenario, where the system stays in a canonical state at all times during a transformation, e.g., Ref. Schneider et al. (2014). This is unfortunate because, as we see below, microcanonical and canonical descriptions are not equivalent for those finite-spectrum systems usually discussed in this context.
The same construction presented above can be repeated for systems obeying statistics other than microcanonical Campisi (2007); Campisi et al. (2009). If applied to the canonical ensemble ρ = e^{−βH}/Z (with Z = Tr e^{−βH} being the canonical partition function), the canonical expression
F_i = −(1/Z) Tr e^{−βH} ∂H/∂λ_i = β^{−1} ∂ ln Z/∂λ_i    (21)
for the forces, along with the equation of state of the ideal gas, uniquely identifies the canonical parameter β as the integrating factor, and its associated primitive
S_C = k_B (ln Z + β ⟨H⟩)    (22)
as the only thermodynamically consistent expressions of inverse temperature and entropy within the canonical formalism. (Incidentally, S_C = −k_B Tr ρ ln ρ, that is, the canonical entropy coincides with the Gibbs-von Neumann information of the canonical distribution ρ.) In the canonical formalism nothing formally constrains the sign of β to be definite. A spin system in a canonical state at negative β will have a positive internal energy ⟨H⟩. The same system in the microcanonical state of energy E = ⟨H⟩ will, however, have a positive thermodynamic temperature T = Ω/(k_B ω). This evidences the inequivalence of canonical and microcanonical ensembles in systems with a finite spectrum.
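To make the inequivalence concrete, the following sketch (my own illustration, under assumed units μ = B = k_B = 1) computes the canonical mean energy of N noninteracting spins: nothing prevents choosing β < 0, and a negative β indeed corresponds to a positive mean energy, while at that same positive energy the microcanonical ratio Ω/ω stays positive.

```python
from math import exp

# Sketch (assumed units mu = B = k_B = 1): canonical mean energy of N
# noninteracting spins, <H> = -N tanh(beta). Negative beta is formally
# allowed in the canonical formalism and yields positive internal energy.
N = 100

def mean_energy(beta):
    z_plus, z_minus = exp(beta), exp(-beta)    # single-spin Boltzmann weights
    return -N * (z_plus - z_minus) / (z_plus + z_minus)

print(mean_energy(0.5) < 0)    # True: positive beta, negative energy
print(mean_energy(-0.5) > 0)   # True: negative beta, positive energy
```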
IV.2 Exact vs approximate constructions
In an attempt to justify the correctness of the Boltzmann entropy, Frenkel and Warren Frenkel and Warren (2015) provided a construction which leads to the Boltzmann entropy. It must be stressed that the construction presented by Frenkel and Warren Frenkel and Warren (2015) is approximate, and valid only under the assumption that the saddle point approximation holds. This approximation holds only when the density of states increases exponentially with the energy. Under this assumption, however, the density of states and the phase volume coincide. So the construction of Frenkel and Warren Frenkel and Warren (2015) cannot shed light onto which entropy expression is appropriate in the case when they do not coincide, which is indeed the very case of practical interest.
In contrast, the present construction is exact, i.e., it holds regardless of the functional dependence of the density of states on energy. Accordingly it says that in any case, independent of whether equivalence of the two entropies holds, the volume entropy is the consistent choice.
IV.3 Thermodynamic temperature equals equipartition temperature
For continuous classical Hamiltonian systems, thanks to the equipartition theorem Khinchin (1949), the thermodynamic temperature T = Ω/(k_B ω) is identical to the equipartition temperature:
k_B T = ⟨ξ_n ∂H/∂ξ_n⟩ = Ω/ω    (23)
where ξ_n is any of the canonical coordinates or momenta, and the average is the microcanonical average on the shell (E, λ). This provides further evidence that the choice g = k_B ln conforms to the common notion of temperature of any classical system, not just the ideal monoatomic gas. We further remark that the equipartition theorem also identifies the temperature T in Eq. (19) as an intensive quantity, namely a property that is equally shared by all subsystems Hilbert et al. (2014).
We emphasize that, at variance with previous approaches to the foundations of the Gibbs entropy Hertz (1910); Berdichevsky (1997); Campisi (2005); Campisi and Kobe (2010), which postulated that the thermodynamic temperature is the equipartition temperature, here we have instead postulated only that the temperature is the integrating factor that is consistent with the ideal gas law, and have obtained the coincidence with the equipartition temperature as a consequence. The advantage of the present approach is evident: it applies to any microcanonical system, even those for which there is no equipartition theorem (e.g., quantum systems).
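Equation (23) can be checked by Monte Carlo in the simplest case. The sketch below is my own illustration, under assumed units m = 1 and unit frequency: for a one-dimensional harmonic oscillator H = (p² + q²)/2, the microcanonical shell measure is uniform in the phase angle, Ω(E) = 2πE and ω = 2π, so ⟨p ∂H/∂p⟩ should equal Ω/ω = E.

```python
import random
from math import sqrt, sin, pi

# Sketch (assumed units m = 1, unit frequency): microcanonical average of
# p * dH/dp = p**2 on the shell H = (p**2 + q**2)/2 = E. For this system the
# shell measure is uniform in the angle theta, with p = sqrt(2E) sin(theta).
random.seed(0)
E = 2.0
samples = 200000
avg = sum((sqrt(2 * E) * sin(random.uniform(0, 2 * pi))) ** 2
          for _ in range(samples)) / samples

print(abs(avg - E) < 0.05)   # True: <p dH/dp> = Omega/omega = E = k_B T
```

The statistical error scales as 1/√samples, so the tolerance above is generous for 2·10⁵ samples.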
IV.4 The Boltzmann entropy fails to predict the value of thermodynamic forces
At variance with other approaches we chose as starting point the expression for the microcanonical forces (6), which is universally agreed upon, and built our construction on that firm ground. The salient point of our argument is the identity (9) expressing the microcanonical forces in terms of the partial derivatives of Ω. The identity (9) alone has as a consequence that the entropy must be of the form S = g(Ω) with some g with non-null derivative g′. In fact, for any g one finds the forces F_i^g associated with S = g(Ω) to be identical to the microcanonical forces F_i, Eq. (6):
F_i^g ≐ (∂g(Ω)/∂λ_i)/(∂g(Ω)/∂E) = (g′ ∂Ω/∂λ_i)/(g′ ω) = (1/ω) ∂Ω/∂λ_i = F_i    (24)
Here g′ is a shorthand notation for g′(Ω(E, λ)). If one employs an entropy expression that is not of the form S = g(Ω), e.g., the Boltzmann entropy, one can well end up wrongly evaluating the forces.
This happens, for example, in the case of a large collection of non-interacting spins in a magnetic field B, at energy E > 0 Dunkel and Hilbert (2014), which is the prototypical example of the emergence of negative Boltzmann temperature Purcell and Pound (1951); Ramsey (1956). The Hamiltonian reads Kubo et al. (1965)
H = −Bμ Σ_{i=1}^{N} σ_i    (25)
Here B plays the role of the external parameter λ, σ_i is ±1 depending on whether the i-th spin points parallel (up) or antiparallel (down) to the field, and μ is the magnetic moment of each spin. At energy E, the magnetization M is given by (6):
M(E, B) = −⟨∂H/∂B⟩ = −E/B    (26)
The number of states with n spins up is
ν(n) = N!/[n!(N − n)!]    (27)
The number of states with no more than n spins up is
Φ(n) = Σ_{k=0}^{n} ν(k)    (28)
Using the relation n(E) = N/2 − E/(2μB), and treating n as a continuous variable under the assumption that N is very large, according to standard procedures, we observe that ν(n(E)) dE/(2μB) denotes the number of states with energy between E and E + dE. The density of states is therefore:
ω(E, B) = ν(n(E))/(2μB)    (29)
and the number of states with energy below E is
Ω(E, B) = 2^N − Φ(n(E))    (30)
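The counting formulas (27)-(30) can be cross-checked against brute-force enumeration for a small system. The sketch below is my own check, under assumed units μ = B = 1, so that a configuration with n up-spins has energy E = N − 2n (level spacing 2).

```python
from itertools import product
from math import comb

# Sketch (assumed units mu = B = 1): brute-force check of the state counting.
# A configuration with n up-spins has energy E = -(2n - N) = N - 2n.
N = 12
energies = [-sum(s) for s in product((-1, 1), repeat=N)]   # H = -sum sigma_i

def nu(n):                      # Eq. (27): states with exactly n spins up
    return comb(N, n)

E = 4
n_of_E = (N - E) // 2           # invert E = N - 2n
print(sum(1 for e in energies if e == E) == nu(n_of_E))          # True
# Phase volume: energy <= E corresponds to n >= n(E), cf. Eq. (30)
print(sum(1 for e in energies if e <= E)
      == sum(nu(k) for k in range(n_of_E, N + 1)))               # True
```

Note that low energies correspond to many up-spins, which is why the phase volume counts the upper tail of the binomial distribution.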
Figure 1 shows the Gibbs and Boltzmann temperatures and magnetizations as functions of calculated with
T_{G,B} = [∂S_{G,B}/∂E]^{−1},    (31)
M_{G,B} = T_{G,B} ∂S_{G,B}/∂B.    (32)
For larger values of N qualitatively similar plots are obtained. A very unphysical property of T_B is that, with the flip of a single spin, it jumps discontinuously from +∞ to −∞ in the thermodynamic limit. The usual reply to such a criticism would be, following Ramsey (1956), to say that one should look instead at the quantity 1/T_B, which displays no divergence. No way out is however possible if one considers the magnetization. As can be seen from the figure, only M_G reproduces the exact result, Eq. (26), whereas the magnetization M_B given by the Boltzmann entropy is drastically off, and even predicts a nonexistent and unphysical phase transition in the thermodynamic limit, where the magnetization abruptly jumps from −∞ to +∞ as a single spin flips from up to down. The results in the figure are also corroborated by analytical calculations. Using Eqs. (31) and (32) with Eqs. (29) and (30) we obtain
M_B = −E/B − k_B T_B/B,    (33)
M_G = −E/B.    (34)
Thus the discrepancy between the Boltzmann magnetization and the physical magnetization is given by the negative Boltzmann thermal energy rescaled by the applied magnetic field:
ΔM ≐ M_B − M_G = −k_B T_B/B    (35)
Since T_B diverges around the zero energy in the thermodynamic limit, so does the discrepancy ΔM. Note that the discrepancy also diverges as the intensity B of the applied magnetic field decreases. It is interesting to notice that, while in the thermodynamic limit T_B approaches T_G for E < 0, the same is not true for M_B, which distinctly deviates from M_G for both E < 0 and E > 0. This unveils the fact, apparently previously unnoticed, that the Boltzmann and the Gibbs entropies are not equivalent even in the lower part of the spectrum of large spin systems.
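The size of the discrepancy is easy to reproduce numerically. The following sketch (my own illustration, assumed units μ = k_B = 1) uses the continuous-n density of states (29), continued through the log-Gamma function, obtains T_B and M_B from Eq. (32) by finite differences, and confirms that M_B misses the exact magnetization −E/B by exactly −k_B T_B/B, Eq. (33).

```python
from math import lgamma, log

# Sketch (assumed units mu = k_B = 1): Eq. (29) with n(E) = N/2 - E/(2B) and
# the binomial nu(n) continued to real n via the log-Gamma function.
N = 1000

def log_omega(E, B):
    n = N / 2 - E / (2 * B)
    log_nu = lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)
    return log_nu - log(2 * B)

def partial(f, E, B, var, h=1e-4):
    """Central difference in E (var='E') or B (var='B')."""
    if var == 'E':
        return (f(E + h, B) - f(E - h, B)) / (2 * h)
    return (f(E, B + h) - f(E, B - h)) / (2 * h)

E, B = 100.0, 1.0
T_B = 1.0 / partial(log_omega, E, B, 'E')   # Boltzmann temperature, Eq. (32)
M_B = T_B * partial(log_omega, E, B, 'B')   # Boltzmann magnetization, Eq. (32)
M_exact = -E / B                            # physical magnetization, Eq. (26)

print(T_B < 0)                                  # True: E > 0, "negative temperature"
print(abs(M_B - (M_exact - T_B / B)) < 1e-3)    # True: Eq. (33) holds
print(abs(M_B - M_exact) > 5)                   # True: M_B is far off the exact value
```

For these parameters the Boltzmann magnetization is off by roughly 10 magnetic moments out of 100, and the error grows without bound as E approaches 0.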
Equation (33) is a special case of a general relation linking the Boltzmann forces F_i^B ≐ T_B ∂S_B/∂λ_i and the Gibbs forces F_i^G ≐ T_G ∂S_G/∂λ_i (i.e., the thermodynamic forces F_i), reading:
F_i^B = F_i^G + k_B T_B ∂F_i^G/∂E    (36)
This equation accompanies a similar relation linking Boltzmann and Gibbs temperatures
T_B = T_G/(1 − k_B/C)    (37)
with C ≐ (∂T_G/∂E)^{−1} being the heat capacity. Equations (36) and (37) follow by taking the derivative with respect to E of the identities ω F_i^G = ∂Ω/∂λ_i and k_B T_G ω = Ω, respectively.
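Relation (37) can be verified directly on the ideal gas, where both entropies are available in closed form. A quick sketch (my own check, assumed units k_B = 1): Ω ∝ E^{3N/2} gives T_G = 2E/(3N) and C = 3N/2, while ω ∝ E^{3N/2 − 1} gives T_B = E/(3N/2 − 1).

```python
# Sketch (assumed units k_B = 1): check of Eq. (37) for the ideal gas.
# S_G = (3N/2) ln E + const  ->  T_G = 2E/(3N),  C = dE/dT_G = 3N/2
# S_B = (3N/2 - 1) ln E + const  ->  T_B = E/(3N/2 - 1)
N, E = 7, 2.0
T_G = 2 * E / (3 * N)
T_B = E / (3 * N / 2 - 1)
C = 3 * N / 2

print(abs(T_B - T_G / (1 - 1 / C)) < 1e-12)   # True: Eq. (37)
```

For the ideal gas the two temperatures differ only at order 1/N, which is why the discrepancy goes unnoticed for ordinary macroscopic systems; for the spin system above it does not vanish.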
The reason for the thermodynamic inconsistency of S_B (and the consistency of S_G) can also be understood in the following way. Consider the heat differential đQ = dE + M dB = dE − (E/B) dB. Clearly 1/E is an integrating factor: đQ/E = d ln(E/B). Hence ln(E/B) is a primitive. Accordingly the adiabats are determined by the equation
E/B = const    (38)
and the entropy must be some monotonic function of ln(E/B), that is, of E/B. By inspecting Eqs. (29) and (30) we see that the phase volume Ω is a monotonic function of E/B, while the density of states ω is not a function of E/B alone; hence S_B is thermodynamically inconsistent.
The inequivalence of S_B and S_G is most clearly seen by plotting the iso-S_B lines in the thermodynamic space (E, B); see Fig. 2. Note that the adiabats, Eq. (38), are straight lines passing through the origin. The iso-S_B lines instead predict a completely different structure of the adiabats. Note in particular that the iso-S_B lines are closed. This evidences their thermodynamic inconsistency.
Summing up: the Boltzmann entropy severely fails to accomplish one of its basic tasks, namely, reproducing the correct value of the thermodynamic forces and of heat.
V Concluding remarks
We have shown that, within the microcanonical formalism, there is only one possible choice of entropy that is consistent with the second law and the equation of state of an ideal gas, namely, the Gibbs entropy. Discarding the Gibbs entropy in favour of the Boltzmann entropy may accordingly result in inconsistency with either of those two pillars. For the great majority of large thermodynamic systems, the Gibbs and Boltzmann entropies practically coincide; hence there is no problem regarding which one we choose. However, there are cases when the two do not coincide: examples are spin systems Dunkel and Hilbert (2014) and point vortex gases Berdichevsky (1995), where the Boltzmann temperature, in disagreement with the Gibbs temperature, has no definite sign, and the Boltzmann entropy can largely fail to predict correct values of thermodynamic forces.
It must be stressed that the demonstrated failure of the Boltzmann entropy to reproduce the thermodynamic forces is not restricted to small systems, where the failure was already known to occur Dunkel and Hilbert (2014), but survives, and even becomes more prominent, in the thermodynamic limit, where the Boltzmann entropy predicts an unphysical and nonexistent phase transition in the magnetization of a system of non-interacting spins in a magnetic field.
In the light of the present results, together with the established fact that the Gibbs entropy conforms with all thermodynamic laws Hilbert et al. (2014), the issue of which entropy expression is correct is apparently now fully and ultimately settled.
Acknowledgements
The author is indebted to Jörn Dunkel, Stefan Hilbert, Peter Talkner and especially Peter Hänggi, for the many discussions we had on this topic for years. This research was supported by a Marie Curie Intra European Fellowship within the 7th European Community Framework Programme through the project NeQuFlux Grant No. 623085 and by the COST Action No. MP1209 “Thermodynamics in the quantum regime.”
References
- Dunkel and Hilbert (2014) J. Dunkel and S. Hilbert, Nat. Phys. 10, 67 (2014).
- Sokolov (2014) I. M. Sokolov, Nat. Phys. 10, 7 (2014).
- Vilar and Rubi (2014) J. M. G. Vilar and J. M. Rubi, J. Chem. Phys. 140, 201101 (2014).
- Frenkel and Warren (2015) D. Frenkel and P. B. Warren, Am. J. Phys. 83, 163 (2015).
- Dunkel and Hilbert (2014a) J. Dunkel and S. Hilbert, arXiv:1403.6058 (2014a).
- Schneider et al. (2014) U. Schneider, S. Mandt, A. Rapp, S. Braun, H. Weimer, I. Bloch, and A. Rosch, arXiv:1407.4127 (2014).
- Dunkel and Hilbert (2014b) J. Dunkel and S. Hilbert, arXiv:1408.5392 (2014b).
- Hilbert et al. (2014) S. Hilbert, P. Hänggi, and J. Dunkel, Phys. Rev. E 90, 062116 (2014).
- Swendsen and Wang (2014) R. H. Swendsen and J.-S. Wang, arXiv:1410.4619 (2014).
- Helmholtz (1895) H. Helmholtz, in Wissenschaftliche Abhandlungen, edited by G. Wiedemann (Johann Ambrosius Barth, Leipzig, 1895), vol. 3, pp. 142–162, 163–178, 179–202.
- Boltzmann (1909) L. Boltzmann, in Wissenschaftliche Abhandlungen, edited by F. Hasenöhrl (Johann Ambrosius Barth Verlag, Leipzig, 1909), vol. 3, pp. 122–152.
- Gibbs (1902) J. Gibbs, Elementary Principles in Statistical Mechanics (Yale University Press, New Haven, 1902).
- Hertz (1910) P. Hertz, Ann. Phys. (Leipzig) 338, 225 (1910).
- Einstein (1911) A. Einstein, Ann. Phys. (Leipzig) 34, 175 (1911).
- Schlüter (1948) A. Schlüter, Z. Naturforsch. A 3, 350 (1948).
- Münster (1969) A. Münster, Statistical Thermodynamics (Vol. 1) (Springer, Berlin, 1969).
- Berdichevsky et al. (1991) V. Berdichevsky, I. Kunin, and F. Hussain, Phys. Rev. A 43, 2050 (1991).
- Pearson et al. (1985) E. M. Pearson, T. Halicioglu, and W. A. Tiller, Phys. Rev. A 32, 3030 (1985).
- Adib (2004) A. Adib, J. Stat. Phys. 117, 581 (2004).
- Campisi (2005) M. Campisi, Stud. Hist. Phil. Mod. Phys. 36, 275 (2005).
- Dunkel and Hilbert (2006) J. Dunkel and S. Hilbert, Physica A 370, 390 (2006).
- Landau and Lifschitz (1969) L. Landau and E. Lifschitz, Statistical Physics (Pergamon, Oxford, 1969), 2nd ed.
- Campisi and Kobe (2010) M. Campisi and D. H. Kobe, Am. J. Phys. 78, 608 (2010).
- Gallavotti (1999) G. Gallavotti, Statistical Mechanics: A Short Treatise (Springer, Berlin, 1999).
- Fermi (1956) E. Fermi, Thermodynamics (Dover, New York, 1956).
- Khinchin (1949) A. Khinchin, Mathematical Foundations of Statistical Mechanics (Dover, New York, 1949).
- Weiss (2006) V. C. Weiss, Am. J. Phys. 74 (2006).
- Campisi (2007) M. Campisi, Physica A 385, 501 (2007).
- Campisi et al. (2009) M. Campisi, P. Talkner, and P. Hänggi, Phys. Rev. E 80, 031145 (2009).
- Berdichevsky (1997) V. L. Berdichevsky, Thermodynamics of Chaos and Order (Addison-Wesley/Longman, Harlow, Essex, 1997).
- Purcell and Pound (1951) E. M. Purcell and R. V. Pound, Phys. Rev. 81, 279 (1951).
- Ramsey (1956) N. F. Ramsey, Phys. Rev. 103, 20 (1956).
- Kubo et al. (1965) R. Kubo, H. Ichimura, T. Usui, and N. Hashitsume, Statistical Mechanics (North-Holland, Amsterdam, 1965), 6th ed.
- Berdichevsky (1995) V. L. Berdichevsky, Phys. Rev. E 51, 4432 (1995).