Appendix B
A Brief Introduction to Statistical Thermodynamics

Statistical mechanics describes the behavior of macroscopic systems in terms of microscopic properties, i.e., those of particles such as atoms, molecules, ions, etc. That part of statistical mechanics that deals with equilibrium states is called statistical thermodynamics; this appendix gives a short introduction whose primary purpose is to provide the working equations of statistical thermodynamics. More detailed discussions are available in many textbooks. Some of these are listed at the end of this appendix. The summary given here is similar to the discussions found in Reed and Gubbins (1973) and Everdell (1975).

Thermodynamic States and Quantum States of a System

Thermodynamic specification of a macroscopic system (e.g., the statement that a system has fixed values of energy, volume, and composition) provides only a partial, incomplete description from a molecular point of view. For example, in a pure crystal at fixed temperature and volume, many distinguishable arrangements of the atoms on the lattice may correspond to the same thermodynamic state.

According to wave mechanics, the most complete description possible for a system is a statement of its wave function, the quantity ψ that appears in Schrödinger’s equation. When ψ is known as a function of the coordinates of the elementary particles, we have a specification of the quantum state of the system. For a macroscopic system (≈ 10²⁴ electrons and nuclei), many of these quantum states, indeed astronomically many, may all be compatible with the same total energy, volume, and composition.

When measuring a macroscopic property X (e.g., pressure, density, etc.), the value obtained results from the chaotic motions and collisions of a large number of molecules; when viewed over a short time scale (e.g., 10⁻⁸ s), property X is a fluctuating quantity. In practice, however, the time required for a macroscopic measurement is usually so much longer than 10⁻⁸ s that fluctuations are not observed. In other words, the macroscopic properties are time averages over the very large number of possible quantum states that a system may assume, even though each quantum state is compatible with the macroscopically observed values.

The object of statistical thermodynamics is to calculate these time averages as a function of molecular properties.

Ensembles and Basic Postulates

To calculate time averages over all possible quantum states, some postulates are necessary. To give an exact formulation of the basic postulates, it is useful to define an ensemble, which is a large number of imagined systems. In the ensemble, each system has the same macroscopic properties as those chosen to describe the thermodynamic state of some real system in which we are interested. Although the single systems of the ensemble all have the same macroscopic properties, they may have different quantum states.

If, for example, the total energy of a real system is E, the volume is V, and the number of molecules is N, then every system in the ensemble also has energy E, volume V, and N molecules. Or, for another example, suppose that a real system, having N molecules in volume V with heat-conducting walls, is immersed in a large heat bath. In that event, each system of the ensemble, containing N molecules in volume V with heat-conducting walls, is also immersed in the same large heat bath.

These two examples provide the most frequently encountered types of systems in chemical thermodynamics. The first one is an isolated system (constant N, V, and E) and the second is a closed isothermal system (constant N, V, and T). The corresponding ensembles are called the microcanonical and canonical ensemble, respectively.

Having briefly explained what we mean by ensemble, it is now possible to formulate the first postulate of statistical mechanics:

The time average of a dynamic1 property of a real system is equal to the ensemble average of that property.

1 A dynamic property (e.g., pressure) is one that fluctuates in time, in contrast to a static property (e.g., mass), which is constant in time.

To calculate the ensemble average, it is necessary to know the probabilities of the different quantum states of the systems of the ensemble. These probabilities are given by the second postulate of statistical mechanics:

All accessible and distinguishable quantum states of a closed system of fixed energy (microcanonical ensemble) are equally probable.

These postulates are expressed by the following equations:

$$X = \langle X \rangle \tag{B-1}$$

and

$$\langle X \rangle = \sum_i P_i X_i \tag{B-2}$$

where X is the measured macroscopic dynamic property of the real system and Xi is the value of this property in that system of the ensemble that is in quantum state i. Pi is the probability of quantum state i among the systems of the ensemble, normalized such that Σi Pi = 1. The notation Σi indicates summation over all possible quantum states.

To facilitate understanding of the terms “ensemble” and “ensemble average,” consider the following example. In a box (which is the real system), there are six spheres of equal size; one is white, two are red, and three are blue. Without looking into the box, we take out one sphere, note its color, and put it back again. If we do that often enough, e.g., 1000 times, we find that the total numbers of white, red, and blue spheres taken out are in the ratio 1: 2: 3, respectively. Now let us imagine that we have 1000 of these boxes, each containing six spheres as described above; this set of 1000 boxes is the ensemble. We take one sphere out of each box and note the color of the sphere. Again, the ratio of total numbers of white, red, and blue spheres is 1: 2: 3.

This example illustrates that the result of an experiment often repeated with one box is identical to that obtained with only one experiment with an ensemble of many boxes (systems). In other words, we believe that the time average is equal to the ensemble average. This belief is called the ergodic hypothesis.
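The equality of time and ensemble averages is easy to demonstrate numerically. The following sketch (in Python; the sample size of 100 000 and the random seed are arbitrary choices, not part of the text) draws spheres repeatedly from one box and, alternatively, once from each box of a large imagined ensemble; both procedures recover the 1:2:3 ratio.

```python
# A minimal sketch (not from the text) illustrating the ergodic hypothesis with
# the box of six spheres: one white, two red, three blue.
import random
from collections import Counter

random.seed(0)
box = ["white"] * 1 + ["red"] * 2 + ["blue"] * 3

# "Time average": draw repeatedly from ONE box, replacing the sphere each time.
time_counts = Counter(random.choice(box) for _ in range(100_000))

# "Ensemble average": draw ONCE from each of many identical imagined boxes.
ensemble_counts = Counter(random.choice(box) for _ in range(100_000))

for color in ("white", "red", "blue"):
    print(color,
          time_counts[color] / 100_000,      # approx. 1/6, 2/6, 3/6
          ensemble_counts[color] / 100_000)  # approx. the same ratios
```

In this toy case the two procedures are computationally identical because every box is identical and every draw is independent; that is precisely what the equality of time average and ensemble average means here.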

The Canonical Ensemble

The canonical ensemble corresponds to a large number of closed systems, each of fixed volume and fixed number of molecules, immersed in a large heat bath. To calculate the ensemble average, we must know the distribution of quantum states, i.e., the probability that any one system of the canonical ensemble is in a particular quantum state. To perform the calculation, we visualize each of the K systems of the canonical ensemble as a cell of volume V that contains N molecules. All cells are in thermal contact but the ensemble itself is thermally isolated, as illustrated in Fig. B-1.

Figure B-1 Canonical ensemble. There are K systems; each is in thermal contact with the K − 1 other systems, permitting exchange of energy.

The canonical ensemble is an isolated system of volume KV with KN molecules and a total energy Et. The K systems may be in different energy states and can exchange energy with each other. Each system is in contact with a large heat bath formed by the (K − 1) other systems.

If n1, n2,…, ni,… systems are in quantum states 1, 2,…, i,… with energy eigenvalues E1, E2,…, Ei,…, respectively, then the values n1, n2,…, ni,… determine the distribution of quantum states in the canonical ensemble. There are many distributions that satisfy the relations:

$$\sum_i n_i = K \tag{B-3}$$

$$\sum_i n_i E_i = E_t \tag{B-4}$$

Let n stand for a particular distribution of quantum states, i.e., one having n1 systems in quantum state 1, n2 systems in quantum state 2, etc. A large number of states Ω(n) of the canonical ensemble is compatible with this particular distribution; Ω(n) is the number that we have to calculate.

We can reformulate the problem in combinatorial terms: we seek the number of different arrangements of the K systems, of which the n1, n2,… systems within each quantum state are mutually indistinguishable. The result is (see App. B.1)

$$\Omega(n) = \frac{K!}{n_1!\,n_2! \cdots n_i! \cdots} = \frac{K!}{\prod_i n_i!} \tag{B-5}$$

The probability that for a given distribution n, a particular system is in quantum state i, is given by

$$P_i(n) = \frac{n_i(n)}{K} \tag{B-6}$$

Index n indicates that Eq. (B-6) holds for a given distribution n. However, a large number of distributions is compatible with Eqs. (B-3) and (B-4), and therefore the probability given by Eq. (B-6) is not sufficient; we want a probability averaged over all possible distributions. Because the total ensemble is an isolated system of fixed energy, all accessible and distinguishable quantum states of the ensemble are equally probable. It follows that the statistical weight of a particular distribution n is proportional to Ω(n). The averaged probability ⟨Pi⟩ is then given by

$$\langle P_i \rangle = \frac{1}{K}\,\frac{\sum_n n_i(n)\,\Omega(n)}{\sum_n \Omega(n)} \tag{B-7}$$

where K is the total number of systems in the canonical ensemble; ni(n) is the number of systems of distribution n that are in quantum state i with energy eigenvalue Ei; Ω(n) is the number of states of the canonical ensemble for distribution n; and Σn indicates summation over all distributions compatible with the boundary conditions, Eqs. (B-3) and (B-4).

In principle, Eq. (B-7) allows calculation of any dynamic property of a closed system with fixed volume V and fixed number of molecules N. For practical purposes, however, we want to avoid carrying out the summations. Fortunately, as the number of systems in the ensemble becomes very large (K → ∞), the maximum-term method (see App. B.2) can be used: the summation can be replaced by the most probable distribution, i.e., that distribution for which the number of quantum states of the canonical ensemble has its maximum value. The average probability in Eq. (B-7) can then be replaced by its most probable value Pi*:

$$P_i^* = \frac{n_i(n^*)}{K} \tag{B-8}$$

where ni(n*) is the number of systems of distribution n* that are in quantum state i with energy eigenvalue Ei, and Ω(n*) is the number of states of the canonical ensemble in the most probable distribution.

To calculate Ω(n*), the maximum value of Ω(n), Eq. (B-5) is written as

$$\ln \Omega(n) = \ln K! - \sum_i \ln n_i! \tag{B-9}$$

For the maximum condition [note that ln(K!) = constant],

$$\delta \ln \Omega(n) = -\sum_i \delta\!\left(\ln n_i!\right) = 0 \tag{B-10}$$

Using Stirling’s formula (see App. B.3), we obtain

$$\sum_i \ln n_i\,\delta n_i = 0 \tag{B-11}$$

As K and Et are constants, Eqs. (B-3) and (B-4) can be written as

$$\sum_i \delta n_i = 0 \tag{B-12}$$

$$\sum_i E_i\,\delta n_i = 0 \tag{B-13}$$

We now apply Lagrange’s2 method of undetermined multipliers. Multiplying Eq. (B-12) by α and Eq. (B-13) by β, and adding both to Eq. (B-11), gives

2 Lagrange’s method is convenient for evaluating the extrema of a function of several variables where additional relations (restraints) are given between these variables.

$$\sum_i \left(\ln n_i + \alpha + \beta E_i\right)\delta n_i = 0 \tag{B-14}$$

As the variations δni are independent, Eq. (B-14) is satisfied for all i only when

$$\ln n_i + \alpha + \beta E_i = 0 \tag{B-15}$$

or

$$n_i = e^{-\alpha}\,e^{-\beta E_i} \tag{B-16}$$

The multiplier α can be eliminated using Eq. (B-3). It then follows that

$$\frac{n_i}{K} = \frac{e^{-\beta E_i}}{\sum_i e^{-\beta E_i}} \tag{B-17}$$

and, with Eq. (B-8),

$$P_i^* = \frac{e^{-\beta E_i}}{\sum_i e^{-\beta E_i}} \tag{B-18}$$

where Pi* is the probability that a given system of the canonical ensemble is in quantum state i with energy eigenvalue Ei. The summation Σi is over all possible quantum states.

Statistical Analogues of Thermodynamic Properties in the Canonical Ensemble. Using the first postulate of statistical thermodynamics [Eq. (B-1)], any dynamic property of a closed isothermal system can be calculated with Eq. (B-18). For example, the energy E, which in classical thermodynamics is the internal energy U, is given by

$$U = E = \sum_i P_i^* E_i = \frac{\sum_i E_i\,e^{-\beta E_i}}{\sum_i e^{-\beta E_i}} \tag{B-19}$$

The summation term in the denominator is called the canonical partition function, Q:

$$Q = \sum_i e^{-\beta E_i(V,\,N)} \tag{B-20}$$

Ei(V, N) is the energy of that system of the ensemble that is in quantum state i; the arguments (V, N) indicate that the energy eigenvalues depend on the volume and number of molecules that characterize the canonical ensemble. Partition function Q depends on β, V, and N. Partial differentiation of Eq. (B-20) shows that Eq. (B-19) can be written in the form

$$U = -\left(\frac{\partial \ln Q}{\partial \beta}\right)_{V,N} \tag{B-21}$$

We now identify the physical significance of the coefficient β introduced formally in Eq. (B-14); β is an intensive property whose value must be the same for all systems of the canonical ensemble because all of these systems are in thermal equilibrium with the same large heat bath. The only thermodynamic property that possesses this characteristic is the temperature. However, the dimensions of β are those of a reciprocal energy, different from those of the thermodynamic temperature. To establish the link between thermodynamic temperature T and the statistical-mechanical parameter β, we use a simple proportionality:

$$\beta = \frac{1}{kT} \tag{B-22}$$

where k is a universal constant known as Boltzmann’s constant; if T is expressed in kelvins, k = 1.38066×10⁻²³ J K⁻¹.
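As a numerical illustration of Eqs. (B-18) to (B-22), the following Python sketch uses a hypothetical three-level system (the energy eigenvalues and the temperature are arbitrary choices, not from the text) to evaluate Q, the probabilities Pi*, and the internal energy both directly from Eq. (B-19) and from the derivative form, Eq. (B-21).

```python
# A minimal sketch (toy energy levels chosen arbitrarily) of Eqs. (B-18)-(B-22):
# Boltzmann probabilities, the canonical partition function Q, and the check
# that U = sum_i P_i E_i equals -(d ln Q / d beta) at fixed V, N.
import math

k = 1.38066e-23          # J/K, Boltzmann's constant as quoted in the text
T = 300.0                # K, arbitrary temperature
beta = 1.0 / (k * T)

# Hypothetical energy eigenvalues E_i of a three-level system (joules).
E = [0.0, 1.0e-21, 3.0e-21]

def ln_Q(b):
    """ln of the canonical partition function, Eq. (B-20)."""
    return math.log(sum(math.exp(-b * Ei) for Ei in E))

Q = math.exp(ln_Q(beta))
P = [math.exp(-beta * Ei) / Q for Ei in E]      # Eq. (B-18)
U_direct = sum(p * Ei for p, Ei in zip(P, E))   # Eq. (B-19)

# Eq. (B-21): U = -(d ln Q / d beta), evaluated by a central difference.
db = beta * 1e-6
U_from_Q = -(ln_Q(beta + db) - ln_Q(beta - db)) / (2 * db)

print(sum(P))               # 1.0 (normalization)
print(U_direct, U_from_Q)   # the two values agree
```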

Other statistical analogues of thermodynamic functions are obtained by mathematical operations. First, differentiation of Eq. (B-19) yields

$$dE = \sum_i E_i\,dP_i^* + \sum_i P_i^*\,dE_i \tag{B-23}$$

For a constant number of molecules in the system, dEi is given by

$$dE_i = \left(\frac{\partial E_i}{\partial V}\right)_N dV \tag{B-24}$$

Further, it can be shown that

$$\sum_i E_i\,dP_i^* = -\frac{1}{\beta}\,d\!\left(\sum_i P_i^* \ln P_i^*\right) \tag{B-25}$$

Substitution of Eqs. (B-24) and (B-25) into Eq. (B-23) gives

$$dE = -kT\,d\!\left(\sum_i P_i^* \ln P_i^*\right) + \left[\sum_i P_i^* \left(\frac{\partial E_i}{\partial V}\right)_N\right] dV \tag{B-26}$$

From classical thermodynamics,

$$dU = T\,dS - P\,dV \tag{B-27}$$

Comparing Eqs. (B-26) and (B-27), the statistical analogue for the entropy S is

$$S = -k \sum_i P_i^* \ln P_i^* \tag{B-28}$$

which can be rewritten as

$$S = \frac{U}{T} + k \ln Q \tag{B-29}$$

Equation (B-29) is suitable for deriving the remaining analogues upon using the fundamental relations of classical thermodynamics (see Table 2-1). Table B-1 presents several statistical thermodynamic analogues in terms of the canonical partition function Q.

Table B-1 Statistical analogues of thermodynamic functions.

Canonical ensemble (independent variables T, V, N):

  A = −kT ln Q
  U = kT²(∂ ln Q/∂T)V,N
  S = k ln Q + kT(∂ ln Q/∂T)V,N
  P = kT(∂ ln Q/∂V)T,N
  μ = −kT(∂ ln Q/∂N)T,V

Grand canonical ensemble (independent variables T, V, μ):

  PV = kT ln Ξ
  N̄ = kT(∂ ln Ξ/∂μ)T,V
  U = kT²(∂ ln Ξ/∂T)V,μ + μN̄
  S = k ln Ξ + kT(∂ ln Ξ/∂T)V,μ
  A = μN̄ − kT ln Ξ
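A quick numerical check of the canonical-ensemble entries is possible. The sketch below (same hypothetical three-level system as above; all numbers are arbitrary) verifies that A = −kT ln Q and the entropy from Eq. (B-29) satisfy the classical relation S = −(∂A/∂T)V,N.

```python
# A minimal sketch checking two canonical-ensemble analogues for consistency:
# with A = -kT ln Q and S = U/T + k ln Q [Eq. (B-29)], the classical relation
# S = -(dA/dT)_{V,N} should hold. The three energy levels are hypothetical.
import math

k = 1.38066e-23
E = [0.0, 1.0e-21, 3.0e-21]     # hypothetical energy eigenvalues (J)

def ln_Q(T):
    return math.log(sum(math.exp(-Ei / (k * T)) for Ei in E))

def A(T):                        # Helmholtz energy analogue
    return -k * T * ln_Q(T)

def S(T):                        # entropy analogue, Eq. (B-29)
    Q = math.exp(ln_Q(T))
    P = [math.exp(-Ei / (k * T)) / Q for Ei in E]
    U = sum(p * Ei for p, Ei in zip(P, E))
    return U / T + k * ln_Q(T)

T = 300.0
dT = 1e-3
S_from_A = -(A(T + dT) - A(T - dT)) / (2 * dT)   # numerical -(dA/dT)_{V,N}
print(S(T), S_from_A)                            # the two values agree
```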

To illustrate the statistical meaning of entropy, we consider the special case of an ensemble where all quantum states are equally probable (microcanonical ensemble):

$$P_i^* = \frac{1}{W} \tag{B-30}$$

where W is the number of possible quantum states of the system. In this case, recalling that the probabilities are normalized (Σi Pi* = 1), Eq. (B-28) reduces to the Boltzmann relation

$$S = k \ln W \tag{B-31}$$

where W is the thermodynamic probability. Although Eq. (B-31) is only a special case of the more general Eqs. (B-28) and (B-29), it can be used to draw an important conclusion: if a system is in its only possible quantum state (e.g., a perfect crystal at zero absolute temperature), W = 1 and

$$S = 0 \tag{B-32}$$

Equation (B-32) is the statistical form of the third law of thermodynamics.

The Grand Canonical Ensemble

For some applications, it is convenient to use an ensemble whose systems can exchange matter as well as energy. Such an ensemble is the grand canonical ensemble, which corresponds to a large number of open systems, each having fixed volume V, that are in internal equilibrium and that are able to exchange matter (molecules) and energy with their surroundings. To describe the thermodynamic state of such an ensemble, it is convenient to use as independent variables the temperature T, the volume V, and the chemical potentials μ1, μ2,… of components 1, 2,… of the systems. For simplicity, we consider here only a system with one component. Generalization to a multicomponent system is straightforward, as discussed in many books on statistical mechanics.

For the calculation of the ensemble average, we must know the quantum-state distribution of the systems. Further, it is necessary to know the probabilities of the quantum states in the systems of the grand canonical ensemble.

The procedure is similar to that used for calculating the properties of the canonical ensemble. We suppose that each of the K systems of the grand canonical ensemble is a cell of volume V. Each cell can exchange matter and energy with the other cells but the total ensemble is completely isolated, as illustrated in Fig. B-2. The dashed lines indicate that, in contrast to Fig. B-1, exchange of matter as well as energy is permitted.

Figure B-2 Grand canonical ensemble. There are K systems; each can exchange energy and matter with the K − 1 other systems.

The ensemble is an isolated system of volume KV, total energy Et, and total number of molecules Nt. The K systems may be in different energy states, but they can exchange energy and matter with each other because each one is in contact with a large reservoir of heat and matter formed by the (K − 1) other systems.

The quantum state of the entire ensemble is determined if the number of molecules N and the energy eigenvalues Ej(N, V) of all its systems are known. If n1(N), n2(N),…, nj(N),… systems contain N molecules each, with energy eigenvalues E1(N, V), E2(N, V),…, Ej(N, V),…, respectively, then the values n1(N), n2(N),…, nj(N),… determine the distribution of the systems of the grand canonical ensemble with respect to the energy states for given N and V. In contrast to the canonical ensemble, N is now not fixed. The total number of quantum states Ω(n) for a given distribution n is

$$\Omega(n) = \frac{K!}{\prod_N \prod_j n_j(N)!} \tag{B-33}$$

There are many of these distributions, but all have to fulfill the conditions:

$$\sum_N \sum_j n_j(N) = K \tag{B-34}$$

$$\sum_N \sum_j n_j(N)\,E_j(N,V) = E_t \tag{B-35}$$

$$\sum_N \sum_j n_j(N)\,N = N_t \tag{B-36}$$

The probability that for a given distribution, a particular system contains N molecules and is in quantum state j, is given by

$$P_j(N;\,n) = \frac{n_j(N)}{K} \tag{B-37}$$

The probability has to be averaged over all possible distributions. As the number of systems in the ensemble becomes very large (K → ∞), the maximum-term method (see App. B.2) can be used and the average probability can be replaced by the most probable value Pj*(N):

$$P_j^*(N) = \frac{n_j(N;\,n^*)}{K} \tag{B-38}$$

where nj(N; n*) is the number of systems in the most probable distribution n* that contain N molecules and are in the quantum state with energy eigenvalue Ej(N, V).

To calculate Ω(n*), the maximum value of Ω(n), Eq. (B-33) is written as

$$\ln \Omega(n) = \ln K! - \sum_N \sum_j \ln n_j(N)! \tag{B-39}$$

The maximum condition is given by

$$\delta \ln \Omega(n) = -\sum_N \sum_j \delta\!\left(\ln n_j(N)!\right) = 0 \tag{B-40}$$

Using Stirling’s formula (see App. B.3) and remembering that K is a constant, Eq. (B-40) becomes

$$\sum_N \sum_j \ln n_j(N)\;\delta n_j(N) = 0 \tag{B-41}$$

Because K, Et, and Nt are constants, Eqs. (B-34), (B-35), and (B-36) can be written as

$$\sum_N \sum_j \delta n_j(N) = 0 \tag{B-42}$$

$$\sum_N \sum_j E_j(N,V)\,\delta n_j(N) = 0 \tag{B-43}$$

$$\sum_N \sum_j N\,\delta n_j(N) = 0 \tag{B-44}$$

Using Lagrange’s method, i.e., multiplying Eq. (B-42) by α, Eq. (B-43) by β, and Eq. (B-44) by γ, and adding all three to Eq. (B-41), gives

$$\sum_N \sum_j \left[\ln n_j(N) + \alpha + \beta E_j(N,V) + \gamma N\right] \delta n_j(N) = 0 \tag{B-45}$$

As the variations δnj(N) are independent, Eq. (B-45) is satisfied for all j when

$$\ln n_j(N) + \alpha + \beta E_j(N,V) + \gamma N = 0 \tag{B-46}$$

or

$$n_j(N) = e^{-\alpha}\,e^{-\beta E_j(N,V)}\,e^{-\gamma N} \tag{B-47}$$

Using Eq. (B-34), the multiplier α can be eliminated; it then follows that

$$\frac{n_j(N)}{K} = \frac{e^{-\beta E_j(N,V) - \gamma N}}{\sum_N \sum_j e^{-\beta E_j(N,V) - \gamma N}} \tag{B-48}$$

and, combining with Eq. (B-38),

$$P_j^*(N) = \frac{e^{-\beta E_j(N,V) - \gamma N}}{\sum_N \sum_j e^{-\beta E_j(N,V) - \gamma N}} \tag{B-49}$$

where Pj*(N) is the probability that a randomly chosen system of the grand canonical ensemble contains N molecules and is in the quantum state with energy eigenvalue Ej(N, V).

Statistical Analogues of Thermodynamic Properties in the Grand Canonical Ensemble. Using Eq. (B-49) with the first postulate, we can calculate any dynamic property of an open isothermal system. For example, the energy E, which in classical thermodynamics is the internal energy U, is

$$U = E = \sum_N \sum_j P_j^*(N)\,E_j(N,V) = \frac{\sum_N \sum_j E_j(N,V)\,e^{-\beta E_j(N,V) - \gamma N}}{\sum_N \sum_j e^{-\beta E_j(N,V) - \gamma N}} \tag{B-50}$$

The summation term in the denominator is called the grand canonical partition function, Ξ:

$$\Xi = \sum_N \sum_j e^{-\beta E_j(N,V) - \gamma N} \tag{B-51}$$

where Ξ depends on β, γ, and V.

We obtain the number of molecules of the open, isothermal system from

$$\bar N = \sum_N \sum_j N\,P_j^*(N) = \frac{\sum_N \sum_j N\,e^{-\beta E_j(N,V) - \gamma N}}{\sum_N \sum_j e^{-\beta E_j(N,V) - \gamma N}} = -\left(\frac{\partial \ln \Xi}{\partial \gamma}\right)_{\beta,V} \tag{B-52}$$

The double summation ΣN Σj refers to all values of N and, for each N, to all quantum states j. The physical significance of the multiplier β is the same as that given by Eq. (B-22).
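As an illustration of Eqs. (B-49) to (B-52), the sketch below uses a hypothetical toy model (three independent sites, each either empty or singly occupied; the site energy, the temperature, and the value of γ are arbitrary assumptions, not from the text) and shows that the average number of molecules computed directly from the probabilities of Eq. (B-49) agrees with the derivative form in Eq. (B-52).

```python
# A minimal sketch of Eqs. (B-49)-(B-52) for a hypothetical toy system: three
# independent sites, each empty or holding one molecule with energy -eps, so the
# N-molecule energy levels are E_j(N) = -N*eps with degeneracy C(3, N).
# (This model and all numbers are illustrative assumptions, not from the text.)
import math

k = 1.38066e-23
T = 300.0
beta = 1.0 / (k * T)
eps = 2.0e-21           # J, assumed site energy
gamma = -0.5            # dimensionless Lagrange multiplier of Eq. (B-44)
M = 3                   # number of sites -> N runs from 0 to 3

def states():
    """Yield (N, E_j(N), degeneracy) for every group of quantum states."""
    for N in range(M + 1):
        yield N, -N * eps, math.comb(M, N)

def ln_Xi(g):
    """ln of the grand canonical partition function, Eq. (B-51)."""
    return math.log(sum(w * math.exp(-beta * E - g * N) for N, E, w in states()))

Xi = math.exp(ln_Xi(gamma))
# Average number of molecules: direct sum with the probabilities of Eq. (B-49)...
N_direct = sum(N * w * math.exp(-beta * E - gamma * N) for N, E, w in states()) / Xi
# ...and from the derivative form of Eq. (B-52), N = -(d ln Xi / d gamma).
dg = 1e-6
N_from_Xi = -(ln_Xi(gamma + dg) - ln_Xi(gamma - dg)) / (2 * dg)

print(N_direct, N_from_Xi)   # the two values agree
```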

Differentiation of Eq. (B-50) yields

$$dE = \sum_N \sum_j E_j(N,V)\,dP_j^*(N) + \sum_N \sum_j P_j^*(N)\,dE_j(N,V) \tag{B-53}$$

The first term on the right-hand side of Eq. (B-53) can be written as

$$\sum_N \sum_j E_j(N,V)\,dP_j^*(N) = -\frac{1}{\beta}\,d\!\left(\sum_N \sum_j P_j^*(N) \ln P_j^*(N)\right) - \frac{\gamma}{\beta}\,d\bar N \tag{B-54}$$

Considering an open, isothermal system that can perform only mechanical (PV) work, the second term on the right-hand side of Eq. (B-53)3 is given by

3 In a group of systems of the ensemble with a given number N, Ej(N, V) can only vary with the volume.

$$\sum_N \sum_j P_j^*(N)\,dE_j(N,V) = \left[\sum_N \sum_j P_j^*(N) \left(\frac{\partial E_j(N,V)}{\partial V}\right)_N\right] dV = -P\,dV \tag{B-55}$$

Substitution of Eqs. (B-54) and (B-55) into Eq. (B-53), and using Eq. (B-22), gives

$$dE = T\,d\!\left(-k \sum_N \sum_j P_j^*(N) \ln P_j^*(N)\right) - \gamma k T\,d\bar N - P\,dV \tag{B-56}$$

The corresponding relation from classical thermodynamics is

$$dU = T\,dS - P\,dV + \mu\,d\bar N \tag{B-57}$$

where N is again the number of molecules (not moles) and μ is the chemical potential for one molecule. Comparison of Eqs. (B-56) and (B-57) yields the statistical analogues for the entropy and the chemical potential:

$$S = -k \sum_N \sum_j P_j^*(N) \ln P_j^*(N) \tag{B-58}$$

and

$$\mu = -\frac{\gamma}{\beta} = -\gamma k T \tag{B-59}$$

Using Eqs. (B-22) and (B-59), the grand canonical partition function [Eq. (B-51)] can be written

$$\Xi = \sum_N \sum_j e^{\left[-E_j(N,V) + \mu N\right]/kT} = \sum_N Q(T,V,N)\,e^{\mu N/kT} \tag{B-60}$$

Partial differentiation of Eq. (B-60) shows that Eq. (B-50) can be rewritten in the form

$$U = kT^2 \left(\frac{\partial \ln \Xi}{\partial T}\right)_{V,\mu} + \mu \bar N \tag{B-61}$$

and Eq. (B-58) becomes

$$S = \frac{U}{T} - \frac{\mu \bar N}{T} + k \ln \Xi \tag{B-62}$$

With Eqs. (B-61) and (B-62), the statistical analogues of other thermodynamic properties can be derived in terms of the grand canonical partition function using the relations of classical thermodynamics (see Table 2-1). The results are given in Table B-1.

The Semiclassical Partition Function

When calculating the canonical partition function, it is convenient to examine separately the energy contributions of the various molecular degrees of freedom. The most important factorization separates the translational contribution Qtrans (due to the positions and motions of the centers of mass of the molecules) from the contributions of all other degrees of freedom, such as rotation and vibration. These other degrees of freedom are often called internal degrees of freedom, although they are strictly internal only for small, nearly spherical molecules; in this context, internal means independent of density.

The factored partition function has the form

$$Q = Q_{\text{trans}}\,Q_{\text{int}} \tag{B-63}$$

where Qint is assumed to be independent of volume, i.e., Qint has the same value for a dense fluid or solid as for an ideal gas. Equation (B-63) is strictly valid only for monatomic fluids (e.g., argon), but it also provides a good approximation for molecules of nearly spherical symmetry like CCl4 and C(CH3)4. It is, however, not valid for systems containing large asymmetric molecules or molecules interacting through strong dipolar forces or hydrogen bonding, as found, for example, in alcohols. Because polar forces depend on the mutual orientation of the molecules, the rotation of a dipole is coupled to the positions of the centers of mass.

In the classical approximation, the translational partition function Qtrans is split into a product of two factors, one arising from the kinetic energy and the other from the potential energy. For a one-component system of N molecules, Qtrans is given by

$$Q_{\text{trans}} = Q_{\text{kin}}\,Z_N \tag{B-64}$$

4 For a derivation of Eqs. (B-65) and (B-66), see one of the references at the end of this appendix.

where

$$Q_{\text{kin}} = \frac{1}{N!}\left(\frac{2\pi m k T}{h^2}\right)^{3N/2} \tag{B-65}$$

and

$$Z_N = \int \cdots \int \exp\!\left[\frac{-\Gamma_t(\mathbf{r}_1, \ldots, \mathbf{r}_N)}{kT}\right] d\mathbf{r}_1 \cdots d\mathbf{r}_N \tag{B-66}$$

In these equations, m is the molecular mass, k is Boltzmann’s constant, and h is Planck’s constant; Γt(r1,…, rN) is the potential energy of the entire system of N molecules whose positions are described by r1,…, rN. For a given number of molecules, the first factor, Qkin, in Eq. (B-64) depends only on temperature. The second factor, ZN, the configurational partition function, depends on temperature and volume. Hence, the configurational part provides the only contribution that depends on intermolecular forces. However, it should be noted that ZN is not unity for an ideal gas (Γt = 0) because in that event,

$$Z_N = \int \cdots \int d\mathbf{r}_1 \cdots d\mathbf{r}_N = V^N \tag{B-67}$$

The complete canonical partition function may now be written

$$Q = Q_{\text{int}}\,Q_{\text{kin}}\,Z_N = \frac{Q_{\text{int}}}{N!}\left(\frac{2\pi m k T}{h^2}\right)^{3N/2} \int \cdots \int \exp\!\left(\frac{-\Gamma_t}{kT}\right) d\mathbf{r}_1 \cdots d\mathbf{r}_N \tag{B-68}$$

Equation (B-68) is the basic relation for the statistical thermodynamics of dense and dilute fluids whose molecules interact with central forces.
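To make Eq. (B-68) concrete, the following sketch evaluates it for an ideal monatomic gas (argon), taking Qint = 1 and Γt = 0 so that ZN = V^N; the state point chosen is an arbitrary illustration. Differentiating ln Q with respect to volume, as in Table B-1, recovers the ideal-gas pressure NkT/V.

```python
# A minimal sketch of Eqs. (B-65)-(B-68) for an ideal monatomic gas (argon),
# taking Q_int = 1 and Gamma_t = 0, so that Z_N = V**N [Eq. (B-67)].
# The state point (300 K, 1 litre, 10^22 atoms) is an arbitrary illustration.
import math

k = 1.38066e-23          # J/K
h = 6.62607e-34          # J s, Planck's constant
m = 39.948 * 1.66054e-27 # kg, mass of one argon atom
T = 300.0                # K
N = 1.0e22               # number of molecules
V = 1.0e-3               # m^3

def ln_Q(vol):
    """ln Q = ln Q_kin + ln Z_N for the ideal monatomic gas (Q_int = 1)."""
    ln_Q_kin = 1.5 * N * math.log(2 * math.pi * m * k * T / h**2) - math.lgamma(N + 1)
    ln_Z_N = N * math.log(vol)          # Eq. (B-67)
    return ln_Q_kin + ln_Z_N

# Pressure from Table B-1: P = kT (d ln Q / d V)_{T,N}; only Z_N depends on V.
dV = V * 1e-6
P = k * T * (ln_Q(V + dV) - ln_Q(V - dV)) / (2 * dV)
print(P, N * k * T / V)   # both give the ideal-gas pressure, about 4.1e4 Pa
```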

Configurational and Residual Properties. From Eq. (B-68), ln Q is the sum of internal, kinetic, and configurational parts,

$$\ln Q = \ln Q_{\text{int}} + \ln Q_{\text{kin}} + \ln Z_N \tag{B-69}$$

When Eq. (B-69) is substituted into any of the equations in Table B-1, we obtain separate contributions to the thermodynamic properties. Thus, for a property X,

$$X = X^{\text{int}} + X^{\text{kin}} + X^{\text{conf}} \tag{B-70}$$

where the superscripts indicate the portion of the partition function from which each term is derived. The internal and kinetic parts are identical to those for an ideal gas. The term Xconf, called the configurational property, arises from ln ZN, the configurational part of the partition function. Equations for Xconf are obtained by replacing Q with ZN in the equations of Table B-1. Thus, the configurational pressure, internal energy, and Helmholtz energy are

$$P^{\text{conf}} = kT \left(\frac{\partial \ln Z_N}{\partial V}\right)_{T,N} \tag{B-71}$$

$$U^{\text{conf}} = kT^2 \left(\frac{\partial \ln Z_N}{\partial T}\right)_{V,N} \tag{B-72}$$

$$A^{\text{conf}} = -kT \ln Z_N \tag{B-73}$$

The configurational properties are the only contributions that depend on intermolecular forces, as indicated in the previous section. Note that these contributions are not zero for ideal gases (no intermolecular forces) because, in that case, ZN is given by Eq. (B-67).

The dimensions of ZN are those of V^N. Therefore, the value calculated for a property such as Aconf or Sconf, which is proportional to ln ZN, depends on the units chosen for ZN. These units are not arbitrary but must be the same as those chosen for (Qkin)⁻¹ because the complete partition function Q is dimensionless. When reporting values of the configurational Helmholtz or Gibbs energy and entropy, it is therefore necessary to state the units of ZN. This difficulty does not arise for Pconf, Uconf, and Hconf because d ln ZN = dZN/ZN; in this case the dimensions of ZN cancel.
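The unit dependence just described is easy to see numerically. In the sketch below (ideal-gas ZN = V^N; the values of N, T, V, and the conversion factor are arbitrary), rescaling the volume unit shifts Aconf by a constant but leaves Pconf unchanged.

```python
# A minimal sketch of the unit-dependence discussed above: for the ideal-gas
# configurational integral Z_N = V**N [Eq. (B-67)], rescaling the unit of volume
# by a factor c (e.g. m^3 -> litres, c = 1000) shifts A_conf = -kT ln Z_N by a
# constant but leaves P_conf = kT (d ln Z_N / d V)_{T,N} unchanged.
# N, T, V and c are arbitrary illustrative numbers.
import math

k = 1.38066e-23
T = 300.0
N = 10
V = 1.0e-3        # m^3
c = 1000.0        # unit-conversion factor: 1 m^3 = 1000 litres

def A_conf(unit_factor):
    """-kT ln Z_N with the volume expressed in the chosen unit [Eq. (B-73)]."""
    return -k * T * N * math.log(V * unit_factor)

def P_conf(unit_factor):
    """kT (d ln Z_N / d V)_{T,N}, evaluated numerically [Eq. (B-71)]."""
    dV = V * 1e-6
    lnZ = lambda v: N * math.log(v * unit_factor)
    return k * T * (lnZ(V + dV) - lnZ(V - dV)) / (2 * dV)

print(A_conf(1.0), A_conf(c))   # the two values differ by -kT N ln c
print(P_conf(1.0), P_conf(c))   # equal (to rounding): N k T / V
```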

It is sometimes useful to introduce an alternative (but equivalent) property called residual property.5 Residual functions are defined such that they give a direct measure of the contribution to the property from intermolecular forces at the given state condition. For any property X of a substance at a particular temperature, volume, and number of molecules, the residual X is defined by 6

5 The residual properties discussed here are different from the residual mixing properties discussed in Sec. 8.2.

6 Some authors define an excess property as the value of X for the real substance at temperature T and pressure P, minus Xid for the ideal gas at the same temperature and pressure:
                       XE(T,P,N) = X(T,P,N) - Xid(T,P,N)
This XE differs from XR defined in Eq. (B-74).

$$X^{\text{R}}(T,V,N) = X(T,V,N) - X^{\text{id}}(T,V,N) \tag{B-74}$$

where superscript R means residual. Thus the residual X is obtained by subtracting the ideal-gas property from that for the real fluid, both evaluated at the same T, V, and N. Since Xid is the value for the fluid at T, V, and N in the absence of intermolecular forces, XR represents the contribution to X from “turning on” intermolecular forces. Equations for residual properties can be obtained from those given in Table B-1 together with Eq. (B-67). Thus,

$$P^{\text{R}} = kT \left(\frac{\partial \ln (Z_N/V^N)}{\partial V}\right)_{T,N} = P - \frac{NkT}{V} \tag{B-75}$$

$$U^{\text{R}} = kT^2 \left(\frac{\partial \ln Z_N}{\partial T}\right)_{V,N} = U^{\text{conf}} \tag{B-76}$$

$$A^{\text{R}} = -kT \ln \frac{Z_N}{V^N} \tag{B-77}$$

Residual-property values are independent of the units chosen for ZN, provided that these units are consistent with those of V^N.
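As an illustration of Eqs. (B-74) to (B-77), the following sketch assumes a crude excluded-volume form ZN = (V − Nb)^N; this model and all numbers are illustrative assumptions, not taken from the text.

```python
# A minimal sketch of Eqs. (B-74)-(B-77) for an assumed configurational integral
# Z_N = (V - N*b)**N, a crude excluded-volume model used here purely for
# illustration (it is not a model given in the text). All numbers are arbitrary.
import math

k = 1.38066e-23
T = 300.0
N = 1.0e22
V = 1.0e-3                    # m^3
b = 5.0e-29                   # m^3 per molecule, assumed co-volume

def ln_Z(vol):
    return N * math.log(vol - N * b)

# Pressure from Eq. (B-71) (only Z_N depends on V), by central difference:
dV = V * 1e-6
P = k * T * (ln_Z(V + dV) - ln_Z(V - dV)) / (2 * dV)

P_ideal = N * k * T / V
P_res = P - P_ideal                                   # Eq. (B-75)
A_res = -k * T * (ln_Z(V) - N * math.log(V))          # Eq. (B-77): -kT ln(Z_N/V^N)

print(P, P_ideal, P_res)      # P > P_ideal because of the excluded volume
print(A_res)                  # depends only on the ratio Z_N/V^N, not on units
```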

Appendix B.1: Two Basic Combinatorial Relations

Combinatorics deals with the arrangement of elements according to specified restrictions. An element designates a particle (sphere, molecule, etc.) that is the object of the arrangement.

1. The number of different arrangements W(n) of n distinguishable elements of an assembly is

$$W(n) = n! \tag{B-78}$$

2. The number of different arrangements of an assembly of n = n1 + n2 + … + nk elements, of which the n1, n2,…, nk elements within each group are indistinguishable, is

$$W = \frac{n!}{n_1!\,n_2! \cdots n_k!} \tag{B-79}$$
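As a small check of Eq. (B-79), the sketch below reproduces the number of distinguishable arrangements of the six spheres (one white, two red, three blue) used earlier in this appendix.

```python
# A small check of Eq. (B-79) against the earlier box of six spheres
# (1 white, 2 red, 3 blue): the number of distinguishable arrangements is
# 6! / (1! * 2! * 3!) = 60.
from math import factorial

def arrangements(*group_sizes):
    """Eq. (B-79): n! / (n1! n2! ... nk!) with n = sum of the group sizes."""
    n = sum(group_sizes)
    w = factorial(n)
    for size in group_sizes:
        w //= factorial(size)
    return w

print(arrangements(1, 2, 3))   # 60
```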

Appendix B.2: Maximum-Term Method

In statistical mechanics, the logarithm of a summation is often replaced by the logarithm of the maximum term of that summation. This approximation is very good because, for a large number of elements, the most probable distribution outweighs all other distributions to such an extent that it is practically identical with the average distribution.

To fix ideas, consider the following example: an assembly of 2×10⁷ elements is divided into 2×10⁴ groups. We compare the number of possible arrangements for two cases:

(a) Each group contains 1000 elements.

(b) Half of the groups contain 1001 elements and the other half 999 elements per group.

The numbers of possible arrangements are (see App. B.1)

$$W_{(a)} = \frac{(2\times 10^7)!}{\left[(1000)!\right]^{2\times 10^4}}\,, \qquad W_{(b)} = \frac{(2\times 10^7)!}{\left[(1001)!\right]^{10^4}\left[(999)!\right]^{10^4}} \tag{B-80}$$

The ratio of the two distributions is

$$\frac{W_{(a)}}{W_{(b)}} = \left\{\frac{(1001)!\,(999)!}{\left[(1000)!\right]^2}\right\}^{10^4} = (1.001)^{10^4} \approx 2.2\times 10^4 \tag{B-81}$$

Equation (B-81) tells us that distribution (a) is 22,000 times more probable than distribution (b), although the latter is very close to the former.

In this example, we used only 2×10⁷ elements and 2×10⁴ groups. For systems encountered in statistical mechanics, the number of elements (molecules) and groups (energies) is vastly larger (10²⁰ and more). The maximum-term method is therefore justified for realistic problems encountered in practical thermodynamics.
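The ratio in Eq. (B-81) can be verified with logarithms of factorials, since the factorials themselves are far too large to evaluate directly; the sketch below uses math.lgamma for ln n!.

```python
# A check of the ratio in Eq. (B-81) using logarithms of factorials
# (math.lgamma(n + 1) = ln n!).
import math

def ln_factorial(n):
    return math.lgamma(n + 1)

groups = 2 * 10**4
# ln W for case (a): all 2*10^4 groups hold 1000 elements each.
ln_Wa = ln_factorial(2 * 10**7) - groups * ln_factorial(1000)
# ln W for case (b): half the groups hold 1001 elements, half hold 999.
ln_Wb = (ln_factorial(2 * 10**7)
         - (groups // 2) * ln_factorial(1001)
         - (groups // 2) * ln_factorial(999))

print(math.exp(ln_Wa - ln_Wb))   # about 2.2e4, i.e. roughly 22,000
```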

Appendix B.3: Stirling’s Formula

Stirling’s formula provides an excellent approximation for factorials of large numbers. By definition,

$$n! = 1 \cdot 2 \cdot 3 \cdots (n-1)\,n \tag{B-82}$$

Taking logarithms gives

$$\ln n! = \sum_{m=1}^{n} \ln m \tag{B-83}$$

For large values of n, the right-hand side of Eq. (B-83) can be replaced by an integral:

$$\ln n! \approx \int_1^n \ln x\,dx \tag{B-84}$$

This integral can be evaluated:

$$\int_1^n \ln x\,dx = n \ln n - n + 1 \tag{B-85}$$

As n >> 1, Eq. (B-85) can be approximated by

$$\ln n! \approx n \ln n - n \tag{B-86}$$

which is Stirling’s formula. Figure B-3 shows the percentage error of the formula for values of n that are not sufficiently large. Note that the absolute error ln n! − (n ln n − n) increases monotonically with n, although the percent error falls. For n ≥ 100, the formula provides an excellent approximation (error less than 1%).

Figure B-3 Percent error in Stirling’s formula for low n.

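The behavior shown in Figure B-3 can be reproduced with a few lines of code; the sketch below compares Stirling's formula with the exact ln n! (via math.lgamma) for several values of n.

```python
# Percent error of Stirling's formula, ln n! ~ n ln n - n [Eq. (B-86)],
# using math.lgamma(n + 1) for the exact ln n!. This reproduces the trend of
# Figure B-3: the percent error falls below 1% near n = 100.
import math

for n in (10, 50, 100, 1000):
    exact = math.lgamma(n + 1)          # ln n!
    approx = n * math.log(n) - n        # Stirling's formula
    print(n, 100 * (exact - approx) / exact)   # percent error
```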

References

Andrews, F. C., 1975, Equilibrium Statistical Mechanics, 2nd Ed. New York: John Wiley & Sons.

Chandler, D., 1987, Introduction to Modern Statistical Mechanics. New York: Oxford University Press.

Chao, K. C. and R. A. Greenkorn, 1975, Thermodynamics of Fluids: An Introduction to Equilibrium Theory, Chap. 2. New York: Marcel Dekker.

Denbigh, K., 1981, The Principles of Chemical Equilibrium, 4th Ed., Chaps. 11-14. Cambridge: Cambridge University Press.

Everdell, M. H., 1975, Statistical Mechanics and Its Chemical Applications. New York: Academic Press.

Garrod, C., 1995, Statistical Mechanics and Thermodynamics. Oxford: Oxford University Press.

Gupta, M. C., 1991, Statistical Thermodynamics. New York: John Wiley & Sons.

Hill, T. L., 1986, An Introduction to Statistical Thermodynamics. Reading: Addison-Wesley.

Maczek, A., 1998, Statistical Thermodynamics. Oxford: Oxford University Press.

McQuarrie, D. A., 1976, Statistical Mechanics. New York: Harper & Row.

McQuarrie, D. A., 1985, Statistical Thermodynamics. Mill Valley: University Science Books.

Reed, T. M. and K. E. Gubbins, 1973, Applied Statistical Mechanics. New York: McGraw-Hill (Reprinted by Butterworth-Heinemann, 1991).

Tien, C. L. and J. H. Lienhard, 1988, Statistical Thermodynamics. Washington: Hemisphere.
