| Values of k[1] | Units |
|---|---|
| 1.380 6488(13)×10−23 | J K−1 |
| 8.617 3324(78)×10−5 | eV K−1 |
| 1.380 6488(13)×10−16 | erg K−1 |

For details, see Value in different units below.

The Boltzmann constant (k or kB), named after Ludwig Boltzmann, is a physical constant relating energy at the individual particle level with temperature. It is the gas constant R divided by the Avogadro constant NA:

 $k = \frac{R}{N_{\rm A}}.\,$

It has the same dimension (energy divided by temperature) as entropy.

## Bridge from macroscopic to microscopic physics

Boltzmann's constant, k, is a bridge between macroscopic and microscopic physics. Macroscopically, the ideal gas law states that, for an ideal gas, the product of pressure p and volume V is proportional to the product of amount of substance n (in moles) and absolute temperature T:

 $pV = nRT \,$

where R is the gas constant (8.314 4621(75) J K−1 mol−1[1]). Introducing the Boltzmann constant transforms the ideal gas law into an alternative form:

 $p V = N k T \,$

where N is the number of molecules of gas. For n = 1 mol, N is equal to the number of particles in one mole (Avogadro's number).

The left-hand side of the equation is a macroscopic amount of pressure–volume energy representing the state of the bulk gas. The right-hand side divides this energy into N units of kT, one for each gas particle; kT thus sets the characteristic scale of each particle's thermal energy (the average translational kinetic energy per particle is in fact (3/2)kT, as derived below).
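Since k = R/NA, the two forms of the ideal gas law above are numerically identical. A minimal sketch (the mole count, volume, and temperature are my own toy values):

```python
# Illustrative check: the Boltzmann constant as the gas constant per
# particle, k = R / N_A, and the two equivalent forms of the ideal gas
# law, pV = nRT and pV = NkT.
R = 8.3144621        # gas constant, J K^-1 mol^-1 (2010 CODATA)
N_A = 6.02214129e23  # Avogadro constant, mol^-1 (2010 CODATA)

k = R / N_A          # Boltzmann constant, J/K

n = 1.0              # amount of substance, mol (assumed)
T = 273.15           # temperature, K (assumed)
V = 0.0224           # volume, m^3 (roughly one mole at STP; assumed)

N = n * N_A              # number of molecules
p_macro = n * R * T / V  # macroscopic form, pV = nRT
p_micro = N * k * T / V  # per-particle form, pV = NkT

assert abs(p_macro - p_micro) < 1e-9 * p_macro
```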

## Role in the equipartition of energy

Given a thermodynamic system at an absolute temperature T, the thermal energy carried by each microscopic "degree of freedom" in the system is of the order of kT/2 (i.e., about 2.07×10−21 J, or 0.013 eV, at room temperature).
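The quoted figures can be checked directly; a quick sketch, assuming "room temperature" means 300 K:

```python
# Sanity check of the numbers quoted above, using the 2010 CODATA
# value of k: the thermal energy kT/2 per degree of freedom.
k = 1.3806488e-23     # Boltzmann constant, J/K
eV = 1.602176565e-19  # one electronvolt in joules
T_room = 300.0        # "room temperature", K (an assumption)

E_half = 0.5 * k * T_room  # thermal energy per degree of freedom, J
E_half_eV = E_half / eV    # the same energy in electronvolts
```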

### Application to simple gas thermodynamics

In classical statistical mechanics, this average is predicted to hold exactly for homogeneous ideal gases. Monatomic ideal gases possess three degrees of freedom per atom, corresponding to the three spatial directions, which means a thermal energy of 1.5kT per atom (in the general case, DkT/2, where D is the number of spatial dimensions). This corresponds very well with experimental data. The thermal energy can be used to calculate the root-mean-square speed of the atoms, which turns out to be inversely proportional to the square root of the atomic mass. The root-mean-square speeds found at room temperature accurately reflect this, ranging from 1370 m/s for helium down to 240 m/s for xenon.
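The root-mean-square speeds quoted above follow from v_rms = √(3kT/m); a short sketch, assuming T = 300 K and standard atomic masses:

```python
# Root-mean-square speed from equipartition: (1/2) m <v^2> = (3/2) k T,
# so v_rms = sqrt(3kT/m). Temperature and masses are my assumptions.
from math import sqrt

k = 1.3806488e-23    # Boltzmann constant, J/K
u = 1.660538921e-27  # atomic mass unit, kg
T = 300.0            # room temperature, K (assumed)

speeds = {}
for name, mass_u in [("helium", 4.0026), ("xenon", 131.293)]:
    m = mass_u * u                 # atomic mass, kg
    speeds[name] = sqrt(3 * k * T / m)
```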

Kinetic theory gives the average pressure p for an ideal gas as

 $p = \frac{1}{3}\frac{N}{V} m \overline{v^2}.$

Substituting that the average translational kinetic energy is

 $\tfrac{1}{2}m \overline{v^2} = \tfrac{3}{2} k T$

gives

 $p = \frac{N k T}{V}$

so the ideal gas equation is regained.
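The substitution above can be checked numerically; a sketch with toy values for N, V, T, and the molecular mass (all my own assumptions):

```python
# Numerical check that p = (1/3)(N/V) m <v^2>, with <v^2> = 3kT/m from
# equipartition, reduces to the ideal gas form p = NkT/V.
k = 1.3806488e-23  # Boltzmann constant, J/K
N = 1e22           # number of molecules (assumed)
V = 1e-3           # volume, m^3 (assumed)
T = 300.0          # temperature, K (assumed)
m = 4.65e-26       # molecular mass, kg (roughly N2; assumed)

v2_mean = 3 * k * T / m               # mean square speed
p_kinetic = (N / V) * m * v2_mean / 3 # kinetic-theory pressure
p_ideal = N * k * T / V               # ideal gas pressure

assert abs(p_kinetic - p_ideal) < 1e-9 * p_ideal
```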

The ideal gas equation is also obeyed closely by molecular gases, but the form of the heat capacity is more complicated, because the molecules possess additional internal degrees of freedom as well as the three degrees of freedom for movement of the molecule as a whole. Diatomic gases, for example, possess a total of seven simple degrees of freedom per molecule that are related to atomic motion (three translational, two rotational, and two vibrational). At lower temperatures, not all of these degrees of freedom may fully participate in the gas heat capacity, owing to quantum mechanical limits on the availability of excited states at the thermal energy available.
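Per quadratic degree of freedom, equipartition contributes R/2 to the molar heat capacity at constant volume; a brief sketch of the classical predictions discussed above (the helper `molar_cv` is my own naming):

```python
# Classical equipartition heat capacities: each fully active quadratic
# degree of freedom contributes R/2 per mole.
R = 8.3144621  # gas constant, J K^-1 mol^-1

def molar_cv(dof):
    """Molar heat capacity at constant volume, J K^-1 mol^-1, for
    `dof` quadratic degrees of freedom per molecule."""
    return dof * R / 2

cv_monatomic = molar_cv(3)      # translation only
cv_diatomic_room = molar_cv(5)  # translation + rotation (vibration frozen out)
cv_diatomic_full = molar_cv(7)  # all seven degrees of freedom active
```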

## Role in Boltzmann factors

More generally, systems in equilibrium at temperature T have probability pi of occupying a state i with energy Ei weighted by the corresponding Boltzmann factor:

 $p_i = \frac{\exp\left(-\frac{E_i}{kT}\right)}{Z},$

where Z is the partition function. Again, it is the energy-like quantity kT that takes central importance.

Consequences of this include (in addition to the results for ideal gases above) the Arrhenius equation in chemical kinetics.
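A minimal sketch of Boltzmann weighting over discrete states, with assumed level energies (not taken from the text above):

```python
# Occupation probabilities of discrete energy levels at temperature T,
# each weighted by exp(-E_i/kT) and normalised by the partition
# function Z. The level energies are my own assumptions.
from math import exp

k = 1.3806488e-23     # Boltzmann constant, J/K
T = 300.0             # temperature, K (assumed)
eV = 1.602176565e-19  # electronvolt, J

energies = [0.0, 0.025 * eV, 0.1 * eV]  # assumed level energies, J

weights = [exp(-E / (k * T)) for E in energies]  # Boltzmann factors
Z = sum(weights)                                 # partition function
probs = [w / Z for w in weights]                 # normalised probabilities
```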

## Role in the statistical definition of entropy

*Boltzmann's grave in the Zentralfriedhof, Vienna, with bust and entropy formula.*

In statistical mechanics, the entropy S of an isolated system at thermodynamic equilibrium is defined as the Boltzmann constant k times the natural logarithm of W, the number of distinct microscopic states available to the system given the macroscopic constraints (such as a fixed total energy E):

 $S = k\,\ln W.$

This equation, which relates the microscopic details, or microstates, of the system (via W) to its macroscopic state (via the entropy S), is the central idea of statistical mechanics. Such is its importance that it is inscribed on Boltzmann's tombstone.

The constant of proportionality k serves to make the statistical mechanical entropy equal to the classical thermodynamic entropy of Clausius:

 $\Delta S = \int \frac{{\rm d}Q}{T}.$

One could choose instead a rescaled dimensionless entropy in microscopic terms such that

 ${S' = \ln W} \; ; \; \; \; \Delta S' = \int \frac{\mathrm{d}Q}{kT}.$

This is a rather more natural form, and this rescaled entropy corresponds exactly to Shannon's subsequent information entropy.

The characteristic energy kT is thus the heat required to increase the rescaled entropy by one nat.
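A toy illustration of S = k ln W and the rescaled entropy S′ = ln W, assuming a system of N independent two-state particles (my own example):

```python
# For N independent two-state particles with all configurations
# equally likely, the number of microstates is W = 2^N. The entropy is
# then S = k ln W, and the rescaled entropy ln W can be expressed in
# nats (base e) or bits (base 2).
from math import log

k = 1.3806488e-23  # Boltzmann constant, J/K
N = 100            # number of two-state particles (assumed)
W = 2 ** N         # number of microstates

S = k * log(W)       # thermodynamic entropy, J/K
S_nats = log(W)      # rescaled (dimensionless) entropy, nats
S_bits = log(W, 2)   # the same information content, bits
```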

## Role in semiconductor physics: the thermal voltage

In semiconductors, the relationship between the flow of electrical current and the electrostatic potential across a p-n junction depends on a characteristic voltage called the thermal voltage, denoted VT. The thermal voltage depends on absolute temperature T as

 $V_T = { k_BT \over q },$

where q is the magnitude of the electrical charge on the electron, with a value of 1.602 176 565(35)×10−19 C,[1] and k is the Boltzmann constant in joules per kelvin. In electronvolts, the Boltzmann constant is 8.617 3324(78)×10−5 eV/K,[1] making it easy to calculate that at room temperature (≈ 300 K) the thermal voltage is approximately 25.85 millivolts ≈ 26 mV.[1] The thermal voltage is also important in plasmas and electrolyte solutions; in both cases it provides a measure of how much the spatial distribution of electrons or ions is affected by a boundary held at a fixed voltage.[2][3]
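The 26 mV figure can be reproduced directly; a quick check assuming T = 300 K:

```python
# Thermal voltage V_T = kT/q at (approximately) room temperature.
k = 1.3806488e-23    # Boltzmann constant, J/K
q = 1.602176565e-19  # magnitude of the electron charge, C
T = 300.0            # temperature, K (assumed "room temperature")

V_T = k * T / q      # thermal voltage, volts (about 0.026 V)
```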

## History

Although Boltzmann first linked entropy and probability in 1877, it seems the relation was never expressed with a specific constant until Max Planck first introduced k, and gave an accurate value for it (1.346×10−23 J/K, about 2.5% lower than today's figure), in his derivation of the law of black body radiation in 1900–1901.[4] Before 1900, equations involving Boltzmann factors were not written using the energies per molecule and the Boltzmann constant, but rather using a form of the gas constant R, and macroscopic energies for macroscopic quantities of the substance. The iconic terse form of the equation S = k log W on Boltzmann's tombstone is in fact due to Planck, not Boltzmann. Planck actually introduced it in the same work as his h.[5]

As Planck wrote in his Nobel Prize lecture in 1920,[6]

> This constant is often referred to as Boltzmann's constant, although, to my knowledge, Boltzmann himself never introduced it — a peculiar state of affairs, which can be explained by the fact that Boltzmann, as appears from his occasional utterances, never gave thought to the possibility of carrying out an exact measurement of the constant.

This "peculiar state of affairs" can be understood by reference to one of the great scientific debates of the time. There was considerable disagreement in the second half of the nineteenth century as to whether atoms and molecules were "real" or whether they were simply a heuristic, a useful tool for solving problems. Nor was there agreement as to whether "chemical molecules" (as measured by atomic weights) were the same as "physical molecules" (as measured by kinetic theory). To continue the quotation from Planck's 1920 lecture:[6]

> Nothing can better illustrate the positive and hectic pace of progress which the art of experimenters has made over the past twenty years, than the fact that since that time, not only one, but a great number of methods have been discovered for measuring the mass of a molecule with practically the same accuracy as that attained for a planet.

## Value in different units

| Values of k | Units | Comments |
|---|---|---|
| 1.380 6488(13)×10−23 | J/K | SI units, 2010 CODATA value;[1] J/K = m2·kg/(s2·K) in SI base units |
| 8.617 3324(78)×10−5 | eV/K | 2010 CODATA value;[1] 1 electronvolt = 1.602 176 565(35)×10−19 J;[1] 1/kB = 11 604.519(11) K/eV |
| 2.083 6618(19)×1010 | Hz/K | 2010 CODATA value;[1] 1 Hz·h = 6.626 069 57(29)×10−34 J[1] |
| 3.166 8114(29)×10−6 | EH/K | EH = 2R∞hc = 4.359 744 34(19)×10−18 J[1] = 6.579 683 920 729(33)×1015 Hz·h[1] |
| 1.380 6488(13)×10−16 | erg/K | CGS system; 1 erg = 1×10−7 J |
| 3.297 6230(30)×10−24 | cal/K | steam table calorie = 4.1868 J |
| 1.832 0128(17)×10−24 | cal/°R | degree Rankine = 5/9 K |
| 5.657 3016(51)×10−24 | ft lb/°R | foot-pound force = 1.355 817 948 331 4004 J |
| 0.695 034 76(63) | cm−1/K | 2010 CODATA value;[1] 1 cm−1·hc = 1.986 445 683(87)×10−23 J |
| 0.001 987 2041(18) | kcal/(mol·K) | per-mole form often used in statistical mechanics; thermochemical calorie = 4.184 J |
| 0.008 314 4621(75) | kJ/(mol·K) | per-mole form often used in statistical mechanics |
| 4.10 | pN·nm | kT in piconewton-nanometres at 24 °C, used in biophysics |
| −228.599 1678(40) | dBW/(K·Hz) | in decibel watts, used in telecommunications |
| 1.442 695 041… | bit | in bits (logarithm base 2), used in information entropy |
| 1 | nat | in nats (logarithm base e), used in information entropy (see Planck units, below) |

Since k is a physical constant of proportionality between temperature and energy, its numerical value depends on the choice of units for energy and temperature. The Kelvin temperature scale inherits the size of its degree from the Celsius scale, which divides the interval between the freezing and boiling points of water into one hundred increments. The small numerical value of the constant in SI units reflects the small energy, in joules, by which a particle's characteristic thermal energy kT rises when the temperature is raised by 1 K. The characteristic energy kT is a term encountered in many physical relationships.
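Several of the table's rows are simple unit conversions of the SI value; a sketch using the electronvolt and erg definitions given above:

```python
# Converting the SI value of k into other units from the table, using
# the standard definitions 1 eV = 1.602176565e-19 J and 1 erg = 1e-7 J.
k_J = 1.3806488e-23   # k in J/K (SI, 2010 CODATA)
eV = 1.602176565e-19  # electronvolt, J
erg = 1e-7            # erg, J

k_eV = k_J / eV       # k in eV/K
k_erg = k_J / erg     # k in erg/K
inv_k = 1.0 / k_eV    # 1/k in K/eV
```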

### Planck units

The Boltzmann constant provides a mapping from this characteristic microscopic energy E to the macroscopic temperature scale T = E/k. In physics research another convention is often encountered, in which k is set to unity, resulting in the Planck units or natural units for temperature and energy. In this context temperature is measured effectively in units of energy and the Boltzmann constant is not explicitly needed.[7] The equipartition energy for each classical degree of freedom, for example, then becomes

 $E = \tfrac{1}{2} T .$

This simplifies many physical relationships and makes the definition of thermodynamic entropy coincide with that of information entropy:

 $S = - \sum_i P_i \ln P_i,$

where Pi is the probability of each microstate.
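A small numerical illustration of the dimensionless entropy formula (the distributions are my own examples): for a uniform distribution over W microstates it reduces to ln W.

```python
# Dimensionless (information) entropy S = -sum_i P_i ln P_i, in nats.
from math import log

W = 8
P_uniform = [1.0 / W] * W                     # equally likely microstates
S_uniform = -sum(p * log(p) for p in P_uniform)

P_skewed = [0.5, 0.25, 0.125, 0.125]          # a non-uniform example
S_skewed = -sum(p * log(p) for p in P_skewed)

# For the uniform case, S = ln W, recovering S = k ln W when the
# probabilities are equal and entropy is measured in units of k.
assert abs(S_uniform - log(W)) < 1e-12
```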

The value chosen for a unit of the Planck temperature is that corresponding to the energy of the Planck mass or 1.416 833(85)×1032 K.[1]

## References

1. P.J. Mohr, B.N. Taylor, and D.B. Newell (2011), "The 2010 CODATA Recommended Values of the Fundamental Physical Constants" (Web Version 6.0). This database was developed by J. Baker, M. Douma, and S. Kotochigova. Available: http://physics.nist.gov/constants [Thursday, 02-Jun-2011 21:00:12 EDT]. National Institute of Standards and Technology, Gaithersburg, MD 20899.
2.
3. Tabeling (2006), Introduction to Microfluidics.
4. English translation: "On the Law of Distribution of Energy in the Normal Spectrum".
5. Duplantier, Bertrand (2005). "Le mouvement brownien, 'divers et ondoyant'" [Brownian motion, 'diverse and undulating'] (PDF). Séminaire Poincaré 1 (in French): 155–212.
6.
7. ^ Kalinin, M; Kononogov, S (2005), "Boltzmann's Constant, the Energy Meaning of Temperature, and Thermodynamic Irreversibility", Measurement Techniques 48 (7): 632–36, doi:10.1007/s11018-005-0195-9