Introduction to Statistical Thermodynamics

1. The Boltzmann distribution

Consider three distinguishable particles, A, B and C.[1] Each of these particles is allowed to have the energy 0, x, 2x, 3x, … etc. Suppose that the total energy (internal energy) of the system consisting of these three particles is constant and equals 3x. The question now is how this energy can be distributed over the three particles; or, rephrased, what the various possibilities are of distributing the particles over the energy levels so that the total energy is 3x. As the following diagram shows, there are 10 possible ways to achieve the required state. These possible arrangements are also called complexions. The required state is thus represented by ten complexions. It is important to notice that the energy of any given particle is, as a result of the continuous collisions between the particles and the accompanying energy exchange, not constant in time; the total energy, however, is constant (isolated system).
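As a quick check of this counting, the following short Python sketch (added here as an illustration; it is not part of the original notes) enumerates every assignment of the energies 0, x, 2x and 3x to the three distinguishable particles and keeps those whose total is 3x. It finds exactly 10 complexions.

from itertools import product

# Count all ways to assign the energies 0, x, 2x, 3x (in units of x) to the three
# distinguishable particles A, B and C so that the total energy equals 3x.
complexions = [c for c in product(range(4), repeat=3) if sum(c) == 3]

print(len(complexions))                  # -> 10
for ea, eb, ec in complexions:
    print(f"A={ea}x  B={eb}x  C={ec}x")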

The fundamental assumption of statistical thermodynamics is that all possible complexions are equally probable. In other words, there is no preference of one arrangement over another: any particle is as likely to have one energy value as another.

The 10 complexions shown above represent three micro-states X, Y and Z. Each micro-state is defined by its own population distribution (i.e. the ni profile, where ni is the number of particles occupying the ith energy level). Consequently, one complexion corresponds to the micro-state X, three complexions correspond to Y and six complexions correspond to Z. We conclude therefore that it is more probable to find the system in the micro-state Z than in the micro-states X or Y. In general, the larger the number of complexions corresponding to a state, the more probable that state is.

Our next task is to derive a general formula for the number of complexions corresponding to a specific state, i.e. that with n0 particles in the energy level 0, n1 particles in the energy level 1, n2 particles in the energy level 2, and so on. The problem is identical to having a sack containing N distinguishable balls (e.g. of different colors). The balls are to be distributed over a certain number of boxes so that n balls go into the first box, n’ into the second box, n’’ into the third box, and so on. The number of different possibilities of achieving that state can be shown to be given by

W = N!/(n0!·n1!·n2!·…)      (1.1)

Notice that the number of possibilities has been divided by the number of permutations of the particles within a given energy level among themselves, because the order of pulling out the balls is not important. To elucidate this point further, consider the state X in the system described above. The first energy level contains the three particles A, B and C. It is not important whether A then B then C were pulled out of the sack, or A then C then B, or B then C then A … etc.[2] The six possibilities (see footnote 1) thus reduce to a single complexion! Applying the above equation to the states X, Y and Z gives[3]

W(X) = 3!/3! = 1,   W(Y) = 3!/(2!·1!) = 3,   W(Z) = 3!/(1!·1!·1!) = 6
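The same counting can be reproduced with equation (1.1). The sketch below (an added illustration, not part of the original text) evaluates W for the population profiles of X, Y and Z and recovers the values 1, 3 and 6.

from math import factorial

def complexions(*n):
    # Number of complexions W = N!/(n0!*n1!*n2!*...) for a population profile n0, n1, ...
    W = factorial(sum(n))
    for ni in n:
        W //= factorial(ni)
    return W

# Population profiles over the levels 0, x, 2x, 3x:
print(complexions(0, 3, 0, 0))   # state X: all three particles at x      -> 1
print(complexions(2, 0, 0, 1))   # state Y: two at 0, one at 3x           -> 3
print(complexions(1, 1, 1, 0))   # state Z: one each at 0, x and 2x       -> 6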

Equation (1.1) has very important consequences as the number of particles N grows very large. Suppose the system contains one mole of particles. The number of possible arrangements (complexions) of having these Nav particles distributed over Nav energy levels (i.e. a single particle per energy level, as in the Z state) is Nav!. The number of possible arrangements of having all these Nav particles in the same energy level (as in the X state) is 1. The chance of finding the system in the X state is now practically zero (compared to 10% when the total number of particles was just 3). In other words, the system is expected to be found entirely in the state with the maximum number of complexions (the Z state in this case). This state is not just the most probable; for all practical purposes it is the only state that is formed. It is the state of equilibrium. Small fluctuations from this state may occur, however with much lower probability. Suppose that one of the Nav particles drops from its energy level to the one below. As a result another particle must jump from its energy level to the one above in order to keep the energy constant. The number of complexions of this state is W = N!/4. If two particles drop to a lower energy level and two jump to a higher one, then W = N!/8.
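To illustrate how quickly the ordered arrangement loses out as N grows, the following sketch (an added illustration, not part of the original notes) compares the weight of the X-like state (all N particles in one level, W = 1) with that of the state with one particle per level (W = N!).

import math

# log10 of the ratio W_X / W_Z = 1/N! for increasing particle numbers N.
for N in (3, 10, 100, 1000):
    log10_ratio = -math.lgamma(N + 1) / math.log(10)
    print(f"N = {N:>4}:  W_X/W_Z = 10^({log10_ratio:.1f})")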

The question now arises whether it is possible to determine the most probable state (the ni profile over the energy levels). Mathematics delivers the answer. We are looking for the maximum value of W, which coincides with the maximum value of lnW.[4] The first derivative must thus be zero. We also make use of the fact that W is a state function having an exact differential:

dlnW = Σi (dlnW/dni)·dni = 0

Two constraints now apply: the total number of particles is constant, Σi dni = 0, and the total energy is constant, Σi εi·dni = 0.

According to Lagrange, we introduce the undetermined multipliers α and −β. Multiplying the two constraints by α and −β, respectively, and adding them to dlnW leads to equation 1.2.

dlnW + α·Σi dni − β·Σi εi·dni = 0      (1.2)

Equation 1.2 is actually an equation system:

(dlnW/dn1 + α − β·ε1)·dn1 + (dlnW/dn2 + α − β·ε2)·dn2 + … = 0

which can be summarized as

Σi (dlnW/dni + α − β·εi)·dni = 0      (1.3)

In the above equation system, there are i variables with two conditions (constant N and constant E). That means there are i−2 independent variables. Now let us choose α and β so that the terms for two of the levels (say levels 1 and 2) vanish:

dlnW/dn1 + α − β·ε1 = 0   and   dlnW/dn2 + α − β·ε2 = 0

The equation system 1.3 thus reduces to

Σi (dlnW/dni + α − β·εi)·dni = 0,   where the sum now runs only over the i−2 remaining (independent) levels.

Since these i−2 variables are independent, changing one of them does not affect the others. This means that the coefficient of each dni must, independently of the others, be zero, i.e.:

dlnW/dni + α − β·εi = 0      (1.4)

We will now evaluate dlnW/dni and thereby make use of Stirling’s theorem:

ln N! ≈ N·lnN − N + ½·ln(2πN)

If N is very large, the above formula reduces to

ln N! ≈ N·lnN − N
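The quality of this approximation can be checked numerically. The sketch below (added for illustration) compares ln N! (computed via the log-gamma function to avoid overflow) with N·lnN − N; the relative error shrinks rapidly with N.

import math

# Relative error of Stirling's approximation ln N! ~ N ln N - N.
for N in (10, 100, 1000, 10_000):
    exact = math.lgamma(N + 1)            # ln(N!)
    approx = N * math.log(N) - N
    print(f"N = {N:>6}:  exact = {exact:14.2f}  Stirling = {approx:14.2f}  "
          f"rel. error = {(exact - approx) / exact:.2e}")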

Applying Stirling’s theorem to equation 1.1 yields

lnW = N·lnN − Σi ni·ln ni

and

dlnW/dni = −ln ni      (1.5)

Substituting equation 1.5 in equation 1.4 gives

−ln ni + α − β·εi = 0,   i.e.   ni = e^α·e^(−β·εi)

Since α is constant, e^α is constant and is set equal to A:

ni = A·e^(−β·εi)      (1.6)

Equation 1.6 is the so-called Boltzmann distribution. This is the distribution of the most probable state, which represents the equilibrium distribution. Fluctuations from this state are vanishingly small.

If the ith energy level is degenerate with degeneracy gi, equation 1.6 can be shown to become

ni = A·gi·e^(−β·εi)      (1.7)

Equation 1.7 can also be applied even if the energy states are not exactly degenerate but are very close to each other and fall within an energy range dε. In such a case, one speaks of energy bundles of width dε, and gi is the number of energy states within this bundle. This is very useful when treating continuous energies, as will be shown in the case of translational energy.

With the help of equation 1.6 or equation 1.7, the fraction of particles with the energy εi can be determined (this is the probability of finding the particle in the ith energy level):

ni/N = gi·e^(−β·εi) / Σj gj·e^(−β·εj)      (1.8)

and the ratio of particles in the ith energy level to those in the jth energy level is given by

ni/nj = (gi/gj)·e^(−β·(εi−εj))      (1.9)

The summation Σj gj·e^(−β·εj) in the above expressions is called the partition function and is given the symbol q. This function is very important in statistical thermodynamics: once it is known, all thermodynamic properties of the system can be calculated. More on this issue and on the physical meaning of the partition function is found in section 3.

The constant A in the Boltzmann distribution (eq. 1.6 or 1.7) is determined as follows: since the populations must add up to the total number of particles, N = Σi ni = A·Σi gi·e^(−β·εi) = A·q, and therefore A = N/q.
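The relations A = N/q and equation 1.8 are easy to verify numerically. The sketch below (an added illustration; it writes the Boltzmann factor as e^(−εi/kT), anticipating the identification β = 1/kT derived in section 2, and assumes non-degenerate levels) computes the populations of a hypothetical ladder of equally spaced levels and shows that they add up to N.

import math

def populations(energies, N, kT):
    # Level populations ni = N * exp(-ei/kT) / q  (eq. 1.8 with A = N/q, gi = 1).
    factors = [math.exp(-e / kT) for e in energies]
    q = sum(factors)                      # the partition function
    return [N * f / q for f in factors]

kT = 1.0
levels = [i * kT for i in range(10)]      # ten equally spaced levels, spacing kT
n = populations(levels, N=1000, kT=kT)
print([round(ni, 1) for ni in n])         # populations fall off exponentially
print(round(sum(n), 6))                   # and add up to N = 1000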

2. Entropy and disorder

As explained in section 1, the most stable state is that with the maximum number of complexions. On the other hand, and according to the laws of thermodynamics, the equilibrium state is characterized by maximum entropy. It seems therefore plausible to assume that entropy is associated with the number of complexions. In the language of mathematics,

S = f(W)

Let us now consider two subsystems A and B: the entropies add, S = SA + SB, while the numbers of complexions multiply, W = WA·WB, so that

f(WA·WB) = f(WA) + f(WB)

Differentiating with respect to WB gives:

WA·f′(WA·WB) = f′(WB)

Differentiating with respect to WA gives:

f′(WA·WB) + WA·WB·f″(WA·WB) = 0,   whose general solution is   f(W) = k·lnW + W0

For convenience, the integration constant W0 is set to zero, and we are left with the famous equation for statistical entropy engraved on Boltzmann’s tombstone in Vienna:

S = k·lnW      (2.1)

The above equation relates entropy to disorder. The larger the number of complexions corresponding to a given state, the larger is the disorder in this state. Disordered states are more stable because they can be achieved in more ways than ordered ones. The constant k in equation 2.1 is the so-called Boltzmann constant. Interestingly, Boltzmann himself never determined its value; it was Planck who, in 1900, first estimated it from his solution of the black-body radiation problem.

By correlating statistical mechanics with thermodynamics, the value of β can be determined:

dlnW = Σi (dlnW/dni)·dni = −Σi ln ni·dni = −Σi (lnA − β·εi)·dni

Suppose now that the system is supplied slowly with heat. The total number of particles does not thereby change (Σi dni = 0) and the energy levels themselves are unaffected, but the distribution of the particles over the various energy levels εi does change. The heat supplied to the system (dQ) is then equal to the change in the system energy (Σi εi·dni). Substituting in the above expression thus yields:

dlnW = β·Σi εi·dni = β·dQ,   i.e.   dlnW/dQ = β

The definition of entropy in thermodynamics is given by

dS = dQ/T

Then, by applying equation 2.1,

dS = k·dlnW   and therefore   dlnW/dQ = 1/(kT)

Comparing the two expressions for dlnW/dQ gives

β = 1/(kT)

and the Boltzmann distribution now reads

ni = A·gi·e^(−εi/kT)      (2.2)

We will now evaluate the Boltzmann constant k. Consider a particle in a three-dimensional box with the dimensions a, b and c. Quantum mechanics gives the energy of such a particle as

ε = (h²/8m)·(p²/a² + q²/b² + r²/c²)

p, q and r are thereby positive integers (quantum numbers). Applying the above expression for the energy in the Boltzmann distribution gives

ni = A·e^(−(h²/8mkT)·(p²/a² + q²/b² + r²/c²))

All constants in the exponent can be reduced to a single constant per dimension, e.g. c = h²/(8ma²kT) for the x-direction. Thus,

ni = A·e^(−c·p²)·e^(−c′·q²)·e^(−c″·r²)

Because the energy levels are very close to each other, the summation over p can be replaced by an integration. The resulting integral is a standard one:

Σp e^(−c·p²) ≈ ∫0∞ e^(−c·p²)·dp = ½·(π/c)^(1/2) = (2πmkT)^(1/2)·a/h

The same procedure is applied to the other two summations over q and r. The total number of particles is then given by

N = Σ ni = A·(2πmkT)^(3/2)·abc/h³ = A·(2πmkT)^(3/2)·V/h³

From the Boltzmann distribution, the average translational energy per particle follows as

⟨ε⟩ = Σ ni·εi / N = (3/2)·kT

From the kinetic gas theory,

pV = (2/3)·N·⟨ε⟩ = N·kT

Now comparing with the ideal gas law (pV = nRT) yields

N·k = n·R,   i.e.   k = R/Nav   (since N = n·Nav)

The Boltzmann constant is thus the general gas constant per particle.

3. The partition function

3.1 The molecular partition function and its interpretation

The molecular partition function is a sum of Boltzmann factors, e^(−εi/kT), that specifies how the particles are partitioned over the accessible states. The numerical value of the partition function represents the effective number of energy levels thermally accessible to the particle. To elucidate this point, consider a two-level system[5]; the lower energy level has the energy zero and the second energy level has the energy ε1. The partition function reads

q = 1 + e^(−ε1/kT)

At very low temperatures (T → 0), the second term approaches zero (e^(−∞) = 0) and q equals 1. This means that the particle only exists in the first energy level. As the temperature is increased, the second term increases and the value of q increases. A q-value higher than 1 means that the particle now also has access to the second energy level, but with a lower probability than for the first energy level. As T approaches ∞, e^(−ε1/kT) goes to unity and q = 2. This means that the particle can occupy the two levels with equal probability.
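This behavior is easy to reproduce numerically. The sketch below (an added illustration; the level spacing ε1 = 1.0e-21 J is an arbitrary assumed value) evaluates q = 1 + e^(−ε1/kT) over a range of temperatures and shows q running from 1 to 2.

import math

k = 1.380649e-23   # Boltzmann constant, J/K

def q_two_level(eps1, T):
    # Partition function of a two-level system with the levels 0 and eps1 (in J).
    return 1.0 + math.exp(-eps1 / (k * T))

eps1 = 1.0e-21     # assumed level spacing in J (illustrative value only)
for T in (1, 10, 50, 100, 300, 1000, 1e6):
    print(f"T = {T:>9.0f} K   q = {q_two_level(eps1, T):.4f}")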

Another important example is a system with an infinite number of equidistant energy levels at 0, ε, 2ε, 3ε, … etc., where ε is the energy difference between two adjacent energy levels. This resembles the vibrational energy levels of a harmonic oscillator. The corresponding partition function q reads

q = Σn e^(−n·ε/kT) = 1/(1 − e^(−ε/kT))

The number of thermally accessible levels clearly depends on the ratio kT/ε, where kT represents the thermal energy supplied by the surroundings. The higher the thermal energy (the higher the temperature), the more levels the molecule can exist in. Note that as the energy of a level increases, the probability of existing at that level decreases (Boltzmann distribution).

The vibrational energy of the harmonic oscillator differs slightly from the ladder system described above, since the energy of the zeroth level is not zero. According to the laws of quantum mechanics, the energy of the zeroth vibrational level is the zero-point energy, which is equal to ½hν0, where ν0 is the fundamental frequency of oscillation. The Boltzmann distribution, however, requires that the energy of the zeroth level is zero. In order to apply the Boltzmann distribution to the vibrational energy of the harmonic oscillator, the vibrational energies must be shifted downwards so that ε0 becomes zero. This is achieved by subtracting the zero-point energy from the actual vibrational energies. Taking into consideration that the energy difference between any two adjacent energy levels of the harmonic oscillator is hν0,

q_vib = 1/(1 − e^(−hν0/kT))      (3.1)

Exercise: Calculate the fraction of 1H35Cl (ν0 = 2886 cm-1) molecules present in the zeroth vibrational energy level at room temperature and at 1000ºC. Do the same calculation for the 127I-35Cl molecules (search for the fundamental frequency!). Compare the results and explain the observed variation.
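A minimal Python sketch for the 1H35Cl part of this exercise is given below (added here as an illustration, under the harmonic-oscillator assumption of equation 3.1). The fraction in the zeroth level is n0/N = 1/q_vib = 1 − e^(−hcν0/kT); the fundamental frequency of 127I-35Cl is deliberately not filled in, since the exercise asks you to look it up.

import math

h = 6.62607015e-34   # J s
c = 2.99792458e10    # speed of light in cm/s, so wavenumbers in cm^-1 give energies in J
k = 1.380649e-23     # J/K

def fraction_v0(nu_cm, T):
    # Fraction of harmonic oscillators in the zeroth vibrational level: 1/q = 1 - exp(-h c nu / kT).
    x = h * c * nu_cm / (k * T)
    return 1.0 - math.exp(-x)

for T in (298.15, 1273.15):                     # room temperature and 1000 C
    print(f"T = {T:7.2f} K   n0/N = {fraction_v0(2886.0, T):.4f}")   # 1H35Cl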

The partition function of rotation is more difficult to compute. For a linear molecule, the energy of rotation is given by

εJ = hcB·J(J+1),   with   B = h/(8π²cI)

where I is the moment of inertia, B the rotational constant and J the rotational quantum number with the values 0, 1, 2, … etc. The rotational levels show a (2J+1)-fold degeneracy. The partition function thus reads

q_rot = ΣJ (2J+1)·e^(−hcB·J(J+1)/kT)      (3.2)

Given the value of B, the partition function can be evaluated numerically (i.e. for each value of J, the term is calculated, the terms are then summed up; you will see that the series above converges to a certain value).

Exercise: Use Excel or Origin software to evaluate the rotational partition function of 1H35Cl at 25ºC. B = 10.591 cm-1.

When the thermal energy kT is much larger than the energy difference between two neighboring rotational levels, the sum in equation 3.2 can be approximated by an integral. Equation 3.2 becomes

q_rot = kT/(hcB)      (3.3)

Exercise: Evaluate the rotational partition function of 1H35Cl at 25ºC using equation 3.3.
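Both exercises can also be done with a few lines of Python instead of Excel or Origin; the sketch below (added for illustration) evaluates the direct sum of equation 3.2 and compares it with the high-temperature approximation of equation 3.3 for 1H35Cl at 25ºC.

import math

h = 6.62607015e-34   # J s
c = 2.99792458e10    # cm/s
k = 1.380649e-23     # J/K

def q_rot_sum(B_cm, T, jmax=200):
    # Rotational partition function of a linear molecule by direct summation (eq. 3.2).
    return sum((2*J + 1) * math.exp(-h*c*B_cm*J*(J + 1) / (k*T)) for J in range(jmax + 1))

def q_rot_highT(B_cm, T):
    # High-temperature (integral) approximation, eq. 3.3.
    return k*T / (h*c*B_cm)

B, T = 10.591, 298.15            # 1H35Cl at 25 C
print(q_rot_sum(B, T))           # direct sum, about 19.9
print(q_rot_highT(B, T))         # approximation, about 19.6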

To derive an expression for the partition function of translation per degree of freedom, we consider the particle to behave as a particle in a box. Its energy is thus given by

εp = p²·h²/(8ma²)

where a is the box length and p is a positive integer. The partition function can be approximated by an integral because the energy levels are very close to each other (continuum):

q_trans = Σp e^(−p²h²/(8ma²kT)) ≈ ∫0∞ e^(−p²h²/(8ma²kT))·dp

With b = h²/(8ma²kT), this is the standard Gaussian integral ∫0∞ e^(−b·p²)·dp = ½·(π/b)^(1/2), so that

q_trans = (2πmkT)^(1/2)·a/h      (3.4)

The quantity h/(2πmkT)^(1/2) is given the symbol Λ and is called the thermal wavelength. For three dimensions:

q_trans(3D) = abc/Λ³ = V/Λ³

Notice that the partition function of translation depends on the size of the container.

Exercise: Calculate the translational partition function of an H2 molecule confined to 100 cm3 at 25ºC.
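A possible solution sketch in Python (added here for illustration; the molar mass of H2 is taken as 2.016 g/mol) evaluates q = V/Λ³ directly from equation 3.4 extended to three dimensions.

import math

h = 6.62607015e-34    # J s
k = 1.380649e-23      # J/K
u = 1.66053907e-27    # atomic mass unit, kg

def q_trans_3d(m, V, T):
    # Three-dimensional translational partition function q = V / Lambda^3.
    Lam = h / math.sqrt(2.0 * math.pi * m * k * T)   # thermal wavelength
    return V / Lam**3

m_H2 = 2.016 * u      # mass of one H2 molecule
V = 100e-6            # 100 cm^3 in m^3
print(q_trans_3d(m_H2, V, 298.15))   # on the order of 10^26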

In the following it is shown that the total partition function of a molecule is the product of the individual partition functions. Assuming that the various energy contributions of a molecule are independent (Born-Oppenheimer approximation), the total energy is ε = ε_trans + ε_rot + ε_vib + ε_elec, and the sum over states factorizes:

q = q_trans·q_rot·q_vib·q_elec      (3.5)

The same procedure can be used to show that the total partition function of any energy contribution is the product of the individual partition functions per degree of freedom:

q_trans = q_trans,x·q_trans,y·q_trans,z      (3.6)

q_vib = q_vib,1·q_vib,2·…   (one factor per vibrational normal mode)      (3.7)

3.2 Significance of the partition function

The importance of the molecular partition function lies in the fact that it contains all the information needed to calculate all macroscopic thermodynamic properties of systems of independent particles[6],[7]. Since the molecular partition function refers to a single molecule, so-called system partition functions Q are defined for systems containing N particles (as is the case in real systems). It can be shown that for a system containing N distinguishable independent particles Q = q^N. For N indistinguishable independent particles, Q = q^N/N!, because permutations of the particles among themselves must not be counted as distinct states.

a) Internal Energy

Taking into consideration that the internal energy of the zeroth level is not necessarily zero:

U − U(0) = Σi ni·εi = NkT²·(∂lnq/∂T)_V = kT²·(∂lnQ/∂T)_V

b) Entropy

S = [U − U(0)]/T + k·lnQ

c) Helmholtz free energy

A − A(0) = −kT·lnQ

d) Pressure

p = kT·(∂lnQ/∂V)_T

e) Gibbs free energy

G − G(0) = A − A(0) + pV = −kT·lnQ + pV

The Gibbs free energy of indistinguishable ideal gas particles requires special attention. With Q = q^N/N!, pV = NkT and Stirling’s theorem (ln N! ≈ N·lnN − N):

G − G(0) = −NkT·lnq + kT·lnN! + NkT ≈ −NkT·ln(q/N)

The last equation gives a new interpretation of the Gibbs free energy: it is proportional to the logarithm of the number of thermally accessible states per molecule. Introducing the molar partition function qm (for N = Nav) yields

G − G(0) = −nRT·ln(qm/Nav)      (3.8)

Exercise: Calculate the entropy of a collection of N independent I2 molecules (ν0 = 214.6 cm-1) at 25ºC, assuming harmonic oscillator behavior.

For I2 at 25ºC:

From the graph, S = 8.4 J mol-1 K-1.
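The quoted value can be cross-checked with a short Python sketch (added here for illustration). For N independent harmonic oscillators (Q = q^N with q from equation 3.1), the molar vibrational entropy is S = R·[x/(e^x − 1) − ln(1 − e^(−x))] with x = hcν0/kT; the same function also covers the Br2 exercise further below.

import math

h = 6.62607015e-34   # J s
c = 2.99792458e10    # cm/s
k = 1.380649e-23     # J/K
R = 8.314462618      # J mol^-1 K^-1

def S_vib_molar(nu_cm, T):
    # Molar vibrational entropy of independent harmonic oscillators.
    x = h * c * nu_cm / (k * T)
    return R * (x / math.expm1(x) - math.log(1.0 - math.exp(-x)))

print(S_vib_molar(214.6, 298.15))   # I2 at 25 C   -> about 8.4 J mol^-1 K^-1
print(S_vib_molar(321.0, 600.0))    # Br2 at 600 K (exercise below)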

Exercise: Evaluate the molar entropy of N two-level systems and plot the resulting expression.

Exercise: Calculate the vibrational contribution to entropy of Br2 at 600 K given that the wavenumber of the vibration is 321 cm-1. Calculate the vibrational energy of the system.

4. Collision Theory

4.1 Translational Energy Distribution

In this section, the Maxwell-Boltzmann distribution is to be derived. But first of all, some thoughts about Planck's constant h have to be considered. The unit of h is J s. In basic SI units,

J·s = kg·m²·s⁻¹,

which is equal to the unit of linear momentum p = m·v (kg·m/s) multiplied by the unit of position x (m). h thus has the dimension of momentum × distance (p·x).

Let us now consider the so-called phase plane for one-dimensional motion[8]. This is the plane constructed by plotting the linear momentum of the motion versus the position.

The smallest unit in such a plane is h (the cyan rectangle). Each discrete state (defined by the values of p and x) is represented by such a rectangle in the phase plane, corresponding to an energy εi = px²/(2m).

The next step is to determine the degeneracy gi of gaseous molecules. These are the energy states that are so close to each other that they lie within the energy interval dε. Obviously, this is the number of states (rectangles) belonging to the bundle represented by the rectangle dpx×dx. Thus,

gi = dpx·dx / h

For three-dimensional motion:

gi = dpx·dpy·dpz·dx·dy·dz / h³

Applying equations 1.8 and 3.4, and taking into consideration that the separations between the energy levels of translational motion are so small that the summation can be replaced by an integration,

dN/N = gi·e^(−εi/kT)/q_trans = e^(−px²/2mkT)·dpx·dx / (h·q_trans),   which after integration over x from 0 to a becomes   dN/N = (2πmkT)^(−1/2)·e^(−px²/2mkT)·dpx

The last equation above gives the fraction of particles with the kinetic energy εi (momentum between px and px + dpx) located anywhere between 0 and a, irrespective of position. With px = m·vx, it is usually written in the form

f(vx)·dvx = (m/(2πkT))^(1/2)·e^(−m·vx²/(2kT))·dvx      (4.1)

The velocity distribution according to equation (4.1) is shown below for N2 molecules at 298 K and 1500 K. The y-axis thereby represents the fraction of particles with the velocity vx. Note that the curve is symmetrical and extends to infinity in both directions (the direction of the one-dimensional motion is thereby taken into consideration). The average velocity is zero because motions in opposite directions cancel each other. As the temperature is increased, the curve broadens, since a higher temperature means higher kinetic energies, which in turn means higher velocities. The center of the curve remains at zero because no direction is preferred over the other. Notice, however, the absolute values of the speeds with which the molecules move.
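The curves described above can be generated with the following sketch (added for illustration; the molar mass of N2 is taken as 28.014 g/mol), which evaluates equation 4.1 at a few velocities for the two temperatures.

import math

k = 1.380649e-23     # J/K
u = 1.66053907e-27   # atomic mass unit, kg

def f_vx(vx, m, T):
    # One-dimensional Maxwell-Boltzmann velocity distribution, eq. 4.1 (units of s/m).
    a = m / (2.0 * math.pi * k * T)
    return math.sqrt(a) * math.exp(-m * vx**2 / (2.0 * k * T))

m_N2 = 28.014 * u
for T in (298.0, 1500.0):
    for vx in (0.0, 250.0, 500.0, 1000.0, 2000.0):
        print(f"T = {T:6.0f} K   vx = {vx:7.1f} m/s   f(vx) = {f_vx(vx, m_N2, T):.3e}")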