Version: November 7, 2011.
MIT 3.20, Rickard Armiento, Lecture 1
Lecture 1: Review of Quantum Mechanics, Introduction to Statistical Mechanics, Time Averages vs. State Averages, First Postulate • October 14, 2011, McQuarrie 1.1 - 2.1
Formalities
Lectures by: Rickard Armiento, armiento@mit.edu, (617) 715-4275, room 13-5065.
• Next 4 weeks: fundamentals of statistical mechanics. This ends with a partial exam (25% of grade).
• Book: McQuarrie "Statistical Mechanics" (alt. "Statistical Thermodynamics"; both versions work). Chapters 1-6 + 11.
Introduction to Statistical Mechanics
Thermodynamics = very general macroscopic theory. Statistical mechanics connects microscopic models with thermodynamics. Statistical Mechanics = study of macroscopic systems from an atomistic point of view. "Many, many particles + statistics → the macroscopic world".
Example: heat capacity. Why is the temperature dependence of the heat capacity of Ag shaped like this? Solid of N vibrating ions ⇒? thermodynamical quantities, C_v, etc. C_v grows with T and approaches 3Nk_B (the Dulong-Petit limit).
Will find that entropy is connected to the "uncertainty" of microscopic states. Counting of states → we will frequently enter into combinatorics.
Goal: understand and predict macroscopic phenomena starting from models of microscopic interactions between the atoms/particles in the system + statistics.
Statistical Mechanics
Statistical Mechanics connects microscopic models with thermodynamics.
• What is entropy and internal energy in "reality"?
• Can we understand the second + third law?
Many, many particles + statistics → the macroscopic world.
[Figure: Quantum Mechanics → Statistical Mechanics → Thermodynamics]
[Figure: Heat capacity, experimental vs. model]
Short Review of Quantum Mechanics
Review of Quantum Mechanics, 1
• The state vector: |Ψ⟩ represents the state of the system.
• Wave-function formulation: Ψ(q̄, t) ← wave function, q̄ = {set of coordinates necessary}, e.g. q̄ = (r̄, σ) × N.
• Probability interpretation: probability that the system is between (q̄, q̄ + dq̄): Ψ*(q̄, t)Ψ(q̄, t) dq̄.
• Probability that the system is in any state:
∫_{all q̄} Ψ*(q̄, t)Ψ(q̄, t) dq̄ = 1 (normalized wavefunctions).
Review of Quantum Mechanics, 2
• Time dependent Schrödinger eq.:
ĤΨ(q̄, t) = iħ ∂Ψ(q̄, t)/∂t
• Stationary state: Ψ(q̄) represents the system if the probability density does not change with t.
• Ψ(q̄) is a solution to the time-independent Schrödinger eq. (an eigenvalue eq.):
ĤΨ(q̄) = EΨ(q̄).
E.g., one particle, 1D:
−(ħ²/2m) ∂²ψ(x)/∂x² + v(x)ψ(x) = εψ(x)
Equation + boundary conditions ⇒ set of solutions:
Ψ_1, Ψ_2, ... and eigenvalues E_1, E_2, ...
ĤΨ(q̄) = EΨ(q̄) and boundary conditions ⇒ Ψ_1, Ψ_2, ... and eigenvalues E_1, E_2, ...
Degenerate states:
Ω(E) = # states with energy E (discrete) = degeneracy
Ω(E)dE = # states between E and E + dE (continuous) = density of states
Simple 1D Examples:
1. "Particle in a box": single particle in a 1D infinite well:
Ĥ = −(ħ²/2m) ∂²/∂x² + v(x),   v(x) = 0 if 0 < x < a, ∞ otherwise;   Ĥψ = εψ
⇒ ψ_n = √(2/a) sin(nπx/a),   ε_n = (h²/8ma²) n²,   n = 1, 2, 3, ...
2. Harmonic oscillator:
Ĥ = −(ħ²/2m) ∂²/∂x² + (1/2)mω²x²,   Ĥψ = εψ
⇒ ψ_n = (mω/ħπ)^(1/4) (1/(2^n n!))^(1/2) H_n(√(mω/ħ) x) e^(−mωx²/(2ħ)),
ε_n = (n + 1/2)ħω,   n = 0, 1, 2, ...
Other examples: Finite well, delta-function potential, linear potential.
A First Meeting with Degenerate Microstates
Degeneracy in a typical 3D system: how degenerate are the states? Start with one particle in a box. 3D infinite well, Ĥψ = εψ:
Ĥ = −(ħ²/2m)(∂²/∂x² + ∂²/∂y² + ∂²/∂z²) + v(x, y, z),
v(x, y, z) = 0 if 0 < x, y, z < a, ∞ otherwise
⇒ ε_{nx,ny,nz} = (h²/8ma²)(n_x² + n_y² + n_z²) = (1/M)(n_x² + n_y² + n_z²),   n_x, n_y, n_z = 1, 2, 3, ...
Density of energy levels: e.g., m = 10^-22 g, a = 10 cm ⇒ 1/M = 10^-41 J = 10^-22 eV. Very, very dense.
Count the ways n_x² + n_y² + n_z² give the same number. Degeneracy: count states in one octant of a sphere of radius R, R² = n_x² + n_y² + n_z²:
ω(ε) = # states as a function of the energy,
φ(ε) = # states with energy ≤ ε.
[Figure: lattice of points (n_x, n_y) with the spherical shell R² = Mε.]
[Figure: Degeneracy in the 3D well — the staircase count of states ω(ε) and the cumulative count φ(ε), plotted vs. Mε.]
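The counting of which integer triples (n_x, n_y, n_z) give the same n_x² + n_y² + n_z² can be checked by brute force. A minimal sketch of my own (counting energies in units of 1/M, so the box size and mass drop out):

```python
from collections import Counter

# Count degeneracies of the 3D infinite well: the energy in units of
# 1/M = h^2/(8 m a^2) is n_x^2 + n_y^2 + n_z^2 with n_x, n_y, n_z = 1, 2, 3, ...
nmax = 20
degeneracy = Counter(
    nx * nx + ny * ny + nz * nz
    for nx in range(1, nmax + 1)
    for ny in range(1, nmax + 1)
    for nz in range(1, nmax + 1)
)

print(degeneracy[3])   # (1,1,1): 1 state
print(degeneracy[6])   # (2,1,1) and permutations: 3 states
print(degeneracy[14])  # (1,2,3) and permutations: 6 states
```

The degeneracy jumps irregularly from level to level, which is why the smooth sphere-volume estimate below is the useful description at macroscopic energies.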
Idea: φ ≈ volume of sphere octant / (volume of one state):
φ(ε) = (1/8)(4πR³/3) = (π/6)(Mε)^(3/2) = (π/6)(8ma²ε/h²)^(3/2).
# states between ε and ε + Δε:
ω(ε, Δε) = φ(ε + Δε) − φ(ε) = (π/6) M^(3/2) [(ε + Δε)^(3/2) − ε^(3/2)]
= {f(ε + Δε) ≈ f(ε) + f'(ε)Δε + O(Δε²)} = (π/4) M^(3/2) ε^(1/2) Δε + O(Δε²).
Same numbers as before, one gas particle: m = 10^-22 g, a = 10 cm, at room temperature: ε ∼ k_B T with T = 300 K. The number of states in an energy interval of 1% (Δε = 0.01ε) is ω(ε, Δε) ∼ 10^28. Many indistinguishable non-interacting particles (see McQuarrie 1.3): Ω(E, ΔE) ∼ 10^N, N ∼ 10^23.
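The quoted magnitudes follow from plugging the numbers into the formulas above; a small sketch (the constants and the 1% energy window are the lecture's values):

```python
import math

# Plug the lecture's numbers into phi(e) = (pi/6) (M e)^(3/2) and
# omega(e, de) ~ (pi/4) M^(3/2) e^(1/2) de, with M = 8 m a^2 / h^2.
h = 6.626e-34      # J s
kB = 1.381e-23     # J/K
m = 1e-25          # kg  (10^-22 g)
a = 0.1            # m   (10 cm)

M = 8 * m * a**2 / h**2            # 1/M = level-spacing scale, ~ 10^-41 J
e = kB * 300                       # energy ~ kB T at room temperature
omega = (math.pi / 4) * M**1.5 * e**0.5 * (0.01 * e)   # states in a 1% window

print(f"1/M = {1/M:.1e} J")        # of order 10^-41 J
print(f"omega = {omega:.1e}")      # of order 10^28 states
```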
Many Noninteracting Indistinguishable Particles
Ω(E, ΔE) = [1/(Γ(N+1) Γ(3N/2))] (2πma²/h²)^(3N/2) E^(3N/2 − 1) ΔE
⇒ Ω(E, ΔE) ∼ 10^N, N ∼ 10^23
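The N-particle count is far too large to evaluate directly, but its logarithm is easy with log-gamma. A sketch of the formula above for a modest, arbitrarily chosen N = 1000 (real gases have N ∼ 10^23; the energy guess E = (3/2)N k_B T is my own assumption):

```python
import math

# log10 of Omega(E, dE) = (2 pi m a^2 / h^2)^(3N/2) E^(3N/2 - 1) dE
#                         / (Gamma(N+1) Gamma(3N/2)),
# evaluated via lgamma so the astronomically large number never overflows.
h, kB = 6.626e-34, 1.381e-23
m, a, T, N = 1e-25, 0.1, 300.0, 1000
E = 1.5 * N * kB * T          # rough equipartition guess for the total energy
dE = 0.01 * E                 # 1% energy window

ln_omega = (
    (1.5 * N) * math.log(2 * math.pi * m * a**2 / h**2)
    + (1.5 * N - 1) * math.log(E) + math.log(dE)
    - math.lgamma(N + 1) - math.lgamma(1.5 * N)
)
log10_omega = ln_omega / math.log(10)
print(f"Omega ~ 10^{log10_omega:.0f}")   # the exponent itself grows linearly with N
```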
Time Averages and State Averages
Many microscopic states ⇔ one macroscopic / thermodynamic state (e.g., fixed E, V, N). Our particles in a box can be in many, many degenerate states, ∼ 10^N.
[Figure: microstates vs. macroscopic state — many microstates map to one macroscopic "thermo state" with E, V, N.]
Starting point of statistical mechanics: many macroscopic quantities we observe (e.g., p) are changing microscopic quantities that have been time averaged. The idea is valid both for classical physics and quantum mechanics. In classical physics each particle is described by position and momentum, q̄ = {r̄_i, p̄_i}, i = 1, ..., N.
Time average over classical microstates:
Ē = ⟨E(t)⟩ = (1/Δt) ∫_t^(t+Δt) E(r̄, p̄) dt.
Time average over quantum mechanical states:
Ē = ⟨E(t)⟩ = (1/Δt) ∫_t^(t+Δt) ⟨Ψ(q̄, t)|Ĥ|Ψ(q̄, t)⟩ dt.
[Figure: a fluctuating microscopic p(t), time averaged into one macroscopic "thermo state" with E, V, N.]
Calculating these time averages is hard!
Ensemble of microstates: 𝒜 copies of the system, all in the same thermodynamical state, with N, V and E fixed. All the degenerate Ω(E) states are present an equal number of times = an "ensemble of points in the phase space".
1st postulate: The ensemble average of a thermodynamic property is equivalent to the time-averaged macroscopic value of the property measured for the real system.
State average over classical microstates:
Ē = ⟨E(t)⟩ = ∫ E(r̄, p̄) P(r̄, p̄) dr̄ dp̄.
P(r̄, p̄) dr̄ dp̄ ∼ measure of time that the system spends in a state between [r̄, p̄; r̄ + dr̄, p̄ + dp̄].
(The ensemble = a collection of points (volumes) in phase space.)
State average over quantum mechanical states:
Ē = ⟨E(t)⟩ = Σ_i E_i P_i
Formally: Pi = probability to find the system in state i. (Informally: “probability of state i”.)
Example: sleepy students. N = 40 students falling asleep / waking up independently, probability of being asleep = P_sleep. How many are asleep?
n̄_sleep = ⟨n(t)⟩ = (1/Δt) ∫ n_sleep(t) dt
= {1st postulate; the ensemble = many copies of the lecture hall, all possible combinations of sleeping students: 2^N}
= Σ_{i=1}^{2^N} n_sleep,i P_i
= {group all states with the same n; probability that exactly n students are sleeping = P_n}
= Σ_{n=0}^{N} n P_n = Σ_{n=0}^{N} n [N!/(n!(N−n)!)] P_sleep^n (1 − P_sleep)^(N−n) = ... = N P_sleep.
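The closing identity Σ_n n P_n = N P_sleep can be verified numerically; a minimal sketch (the value P_sleep = 0.25 is an arbitrary choice of mine):

```python
from math import comb

# Sleepy students: N independent sleepers, each asleep with probability p.
# Average over the 2^N-state ensemble, grouped by how many students sleep:
#   n_bar = sum_n n * C(N, n) p^n (1-p)^(N-n)  -- should equal N p.
N, p = 40, 0.25
n_bar = sum(n * comb(N, n) * p**n * (1 - p) ** (N - n) for n in range(N + 1))

print(round(n_bar, 6))   # 10.0 = N * p
```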
Given Pi → we can calculate everything! This will be a central concept of statistical mechanics. Examples (glance ahead): Constant E, V, N → Pi constant. Constant N, V, T → Pi ∼ e−Ei /kT Different boundary conditions → different Pi .
Homework suggestions
1. Think about the sleepy students example: why is the count of states with the same n equal to N!/(n!(N−n)!)?
2. Exam problem 2001E2:3(a).
3. Exam problem 2004E2:1(a).
4. Exam problem 2007E2:3(a).
MIT 3.20, Rickard Armiento, Lecture 2
Lecture 2: Microcanonical and Canonical Ensemble • October 17, 2011, McQuarrie 2.1 - 2.2
Previous lecture
Ensemble of microstates: a collection of microstates that all are in the same thermodynamical state.
Postulate: The ensemble average of a thermodynamic property is equivalent to the time-averaged macroscopic value of the property measured for the real system.
State average over quantum mechanical states:
Ē = ⟨E(t)⟩ = Σ_i E_i P_i
[Figure: time averages over microstates ⇔ macroscopic "thermo state" (E, V, N); the ensemble = a collection of points (volumes) in phase space.]
Formally: P_i = probability to find the system in microstate i at a specific instance in time. (Informally: "probability of state i".)
The Microcanonical Ensemble
Isolated system, constant volume ⇒ E, N, and V fixed (both microscopically, i.e., in each microstate, and macroscopically). Degeneracy: Ω(E). The microcanonical ensemble contains all these degenerate states. Ensemble average?
M̄ = ⟨M(t)⟩ = Σ_i M_i P_i.
Where do we get P_i?
Equal a Priori Probability: Principle of equal a priori probability: Given an isolated system in equilibrium, it is found with equal probability in each of its accessible microstates.
Assume that all microstates are equally likely ⇒
State probability in the microcanonical ensemble: P_i = 1/Ω(E) = constant.
The ergodic hypothesis was one (historical) motivation for this postulate.
The Ergodic Hypothesis (isolated, noninteracting particles): for states degenerate in energy, over long time we pass through them equally much ⇒ equal a priori probability, P_A = P_B. But of all possible degenerate states, how many "look like" A? Like B? E.g., P_spread out >> P_on left side.
[Figure: poker analogy — any one specific hand has odds 2,598,960 : 1, yet "one pair" (1.36 : 1) is far more common than a royal straight flush (649,739 : 1), because many more hands look like a pair.]
Microcanonical ensemble averages:
Ē = Σ_i E_i P_i = (1/Ω(E)) Σ_i E_i = EΩ/Ω = E
M̄ = Σ_i M_i P_i = (1/Ω(E)) Σ_i M_i
p̄ = Σ_i p_i P_i = (1/Ω(E)) Σ_i p_i
Canonical Ensemble
System with constant N, V, but not thermally isolated. It is in thermal equilibrium with the environment at temperature T (= "macroscopic temperature"; we do not yet know what the temperature is in terms of microstates, i.e. "in the microscopic world" where all microscopic degrees of freedom are specified). Canonical ensemble C = all microstates corresponding to fixed N, V, T.
Ensemble average: sum the microcanonical way?
M̄ =? Σ_i M_i P_i
But, what are the {Pi } now? System is not isolated ⇒ all accessible microstates generally not equally likely. Cunning plan: construct a microcanonical ensemble that simulates a system in the canonical ensemble + heat bath and use principle of equal a priori probabilities in the microcanonical ensemble to derive probabilities Pi .
Step 1
[Figure: the abstract canonical ensemble: systems 1, 2, 3, ..., 𝒜, all sharing the same N, V, T; all states NOT equally likely. Build a simulated heat bath out of the 𝒜 systems: 1. bring it to thermal equilibrium at T, 2. isolate it ⇒ total E = 𝓔, N = 𝒩, V = 𝒱. All possible states of the heat bath = a microcanonical ensemble; every entry has the same total E = 𝓔, N = 𝒩, V = 𝒱, and all its states are equally likely.]
Step 2
[Figure: all possible states? Every microstate of the heat bath has the same total E = 𝓔, N = 𝒩, V = 𝒱. The abstract canonical ensemble = all possible subsystem states: lay out the 𝒜 subsystems in all ways that give E = 𝓔, N = 𝒩, V = 𝒱; a combinatorial problem! Each microstate has the same probability → subsystem probability ∼ the number of times it occurs.]
Create a very large "heat bath" system out of 𝒜 systems from the canonical ensemble, labeled j = 1, 2, ..., 𝒜. Bring the system to thermal equilibrium at T (using, e.g., an external heat bath), and then isolate it. All possible microstates of this heat bath system = a microcanonical ensemble.
Call this microcanonical ensemble D. ⇒ 1) All microstates in D have the same energy value = 𝓔. 2) Principle of equal a priori probabilities ⇒ finding the system in any of the possible microstates in D is equally likely.
Main idea: determine the probabilities by counting how all the subsystems within the equally likely microstates in D distribute over the possible states:
P_i = (Number of subsystems in state i within all the microstates in D) / (Total number of subsystems in all the microstates in D)
= (Average number of subsystems in state i over the microstates in D) / (Number of subsystems in one microstate of D)
The canonical ensemble C contains every state a subsystem in D can be in. All possible microstates of D = every way systems in C can be mapped onto the labeled (thus distinguishable) subsystems of D under the constraint that the total energy sums to 𝓔 ⇒ a straightforward combinatorial problem!
Categorize entries in D according to the distribution of subsystems over quantum states: there are a_i subsystems in state i, a = {a_i}. (i labels quantum states, j labels subsystems.)
Number of subsystems: 𝒜 = Σ_j 1 = Σ_i a_i,   total energy: 𝓔 = Σ_j E_j = Σ_i a_i E_i
Total number of subsystems in state i = Σ_{all distributions a} (Microstates in D with this distribution) × (a_i in the distribution) = Σ_a W(a) a_i(a).
What is W(a)? The number of ways 𝒜 labeled subsystems can be divided with a_1 in one group, a_2 in a second group, etc.:
W(a) = 𝒜!/(a_1! a_2! ...) = 𝒜!/Π_i a_i!
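A quick sanity check of W(a), of my own: summing the multinomial coefficient over all occupation vectors with Σ_i a_i = 𝒜 must count every assignment of labeled subsystems exactly once, i.e. give (number of states)^𝒜:

```python
from math import factorial, prod
from itertools import product

# W(a) = A! / (a_1! a_2! ...): ways to put A labeled subsystems into states
# with the occupation vector a. Summed over all occupation vectors, W must
# count every one of the (n_states)^A assignments exactly once.
def W(a):
    return factorial(sum(a)) // prod(factorial(ai) for ai in a)

A, n_states = 5, 3
total = sum(
    W(a)
    for a in product(range(A + 1), repeat=n_states)
    if sum(a) == A
)
print(total)   # 3^5 = 243
```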
Total number of subsystems in all systems in D:
Σ_i (Number of subsystems in state i) = Σ_i Σ_a W(a) a_i(a) = Σ_a W(a) Σ_i a_i(a) = 𝒜 Σ_a W(a).
⇒ P_i = [Σ_a W(a) a_i(a)] / [𝒜 Σ_a W(a)] = probability to find the system in quantum state i.
Finished? In principle we could calculate averages using this expression:
M̄ = Σ_{i∈All states} M_i P_i
But dealing with Pi in this form is still hard.
Most probable distribution
W({a_i}) is very peaked around a small set of {a_i}. Larger and larger heat bath ⇒ 𝒜 → ∞ ⇒ more and more peaked (see McQuarrie 1.5). [Figure: binomial distribution, sharply peaked at a*.]
Idea: find the most probable distribution a* that maximizes W({a_i}) over all {a_i}. Assume it is enough to consider this distribution in the limit 𝒜 → ∞, a_i → ∞:
P_i = [Σ_a W(a) a_i(a)] / [𝒜 Σ_a W(a)] = {𝒜 → ∞} = (1/𝒜) [W(a*) a_i*] / W(a*) = a_i*/𝒜
Purely mathematics: find the {a_i} that maximizes W(a) = 𝒜!/Π_i a_i!, with constraints 𝒜 = Σ_i a_i and 𝓔 = Σ_i a_i E_i.
W(a) contains large factorials. Maximize ln W(a) instead, so we can use Stirling's approximation, ln n! ≈ n ln n − n. Lagrange multipliers (see McQuarrie 1.5) ⇒
(∂/∂a_i) { ln W(a) − α Σ_k a_k − β Σ_k a_k E_k } = 0,   i = 1, 2, ...
α and β are undetermined Lagrange multipliers. Insert Stirling's approximation ⇒
−ln a_i* − α − 1 − βE_i = 0 ⇒ a_i* = e^(−α'−βE_i),   α' = α + 1.
P_i = a_i*/𝒜 = (1/𝒜) e^(−α−βE_i)   (absorbing the constant back into α).
For a system in temperature equilibrium with its environment at thermodynamic temperature T, the probability to find it in a state with energy E_i falls off exponentially with that energy.
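The most-probable-distribution idea can be seen directly in a toy example of my own (𝒜 = 12 subsystems on levels E = 0, 1, 2 with total energy 𝓔 = 8): enumerating all allowed occupation vectors, the W-maximizing one already falls off with energy, Boltzmann-like:

```python
from math import factorial

# Enumerate all occupation vectors (a0, a1, a2) with a0 + a1 + a2 = 12 and
# total energy 0*a0 + 1*a1 + 2*a2 = 8, and find the W-maximizing one,
# W(a) = A!/(a0! a1! a2!).
A, E_total = 12, 8
best, W_best = None, 0
for a2 in range(E_total // 2 + 1):
    a1 = E_total - 2 * a2
    a0 = A - a1 - a2
    if a0 < 0:
        continue
    W = factorial(A) // (factorial(a0) * factorial(a1) * factorial(a2))
    if W > W_best:
        best, W_best = (a0, a1, a2), W

print(best, W_best)   # (6, 4, 2) with W = 13860: occupations fall off with energy
```

Even at this tiny 𝒜 the winner dominates its rivals (13860 vs. 7920 and below); in the limit 𝒜 → ∞ the peak becomes overwhelmingly sharp.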
Note: this is per individual energy. But a typical system has more energy levels at higher energies ⇒ likely to find the system in some intermediate state.
State averages:
M̄ = Σ_{i∈All states} P_i M_i = Σ_i P(E_i) M_i = Σ_i e^(−α−βE_i) M_i
But, what are α and β? → story for next time.
Suggested homework
• McQuarrie 2.2
• McQuarrie 2.1
• McQuarrie 2.8
• Exam problem: 2009E2 3(i).
MIT 3.20, Rickard Armiento, Lecture 3
Lecture 3: Identification of Beta and Entropy in the Canonical Ensemble • October 19, 2011, McQuarrie 2.2 - 2.3
The Story So Far
Ensemble: collection of microstates that appear the same macroscopically / thermodynamically.
Ensemble averages: thermodynamical quantities are given as M̄ = Σ_i M_i P_i.
P_i = probability of finding a system in microstate i of the ensemble.
Microcanonical Ensemble: E, N, and V fixed ⇒ P_i = 1/Degeneracy = 1/Ω(E) = constant.
Canonical Ensemble: N, V, and T fixed ⇒ P_i = (1/𝒜)e^(−α−βE_i); E_i = the energy of state i; α, β are so far unknown Lagrange multipliers that may depend on N, V, T.
Identifying α
Just part of the normalization:
P_i = (1/𝒜) e^(−α−βE_i) = (e^(−α)/𝒜) e^(−βE_i)
Normalize: the probability for the system to be in any state = 1; solve for α:
1 = (e^(−α)/𝒜) Σ_i e^(−βE_i) ⇒ e^α = (1/𝒜) Σ_i e^(−βE_i) ⇒ P_i = e^(−βE_i) / Σ_i e^(−βE_i).
This denominator will occur very frequently in our expressions, so define:
The canonical partition function:
Q = Σ_{i∈All states} e^(−βE_i) = Σ_{i∈All energies} Ω(E_i) e^(−βE_i) = Σ_E Ω(E) e^(−βE)
Q is a “weighted count of all microstates” (more on this later.)
Check: a canonical ensemble with just one energy E_i = E is a microcanonical ensemble:
P_i = e^(−βE_i)/Q = e^(−βE)/(Ω(E) e^(−βE)) = 1/Ω(E).
The meaning and source of β
Take a system with a specific set of energy states {E_i}. The average energy is set by β:
Ē = Σ_i E_i P_i = [Σ_i E_i e^(−βE_i)] / [Σ_{i'} e^(−βE_{i'})]
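How β sets the average energy can be seen in a small sketch, using a hypothetical two-level spectrum E = 0, ε of my own choosing:

```python
import math

# Average energy as a function of beta: E_bar(beta) = sum_i E_i e^(-beta E_i) / Q.
# For two levels E = 0, eps this reduces to eps / (1 + e^(beta*eps)), so a
# larger beta (a colder bath) gives a lower average energy.
def E_bar(beta, levels):
    weights = [math.exp(-beta * E) for E in levels]
    return sum(E * w for E, w in zip(levels, weights)) / sum(weights)

eps = 1.0
for beta in (0.1, 1.0, 10.0):
    print(beta, E_bar(beta, [0.0, eps]))   # decreases toward 0 as beta grows
```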
Why is there a free parameter in the canonical ensemble probabilities that lets us choose the average energy?
Recall the derivation: the simulated heat bath was first in thermal equilibrium with its environment at T. Then the system was isolated, which resulted in a fixed total energy = 𝓔. This is the source of the freedom. 𝓔 is directly related to the average energy (and thus β):
𝓔 = Σ_i a_i E_i = 𝒜 Σ_i P_i E_i = 𝒜 Ē
[Figure: Step 1 from Lecture 2: the abstract canonical ensemble (N, V, T) turned into a simulated, isolated heat bath with E = 𝓔, N = 𝒩, V = 𝒱.]
Identifying β
Recall the "thermodynamic equation of state": dE = T dS − p dV ⇒
(∂E/∂V)_{N,T} = T (∂S/∂V)_{N,T} − p = {Maxwell relation: (∂S/∂V)_{N,T} = (∂p/∂T)_{V,N}} ⇒
(∂E/∂V)_{N,T} − T (∂p/∂T)_{V,N} = −p
Idea: compose the same relation from ensemble averages, and identify β. But what should be held constant? Try β. I.e., seek out the relation between: 1) p̄, 2) (∂Ē/∂V)_{N,β}, and 3) (∂p̄/∂β)_{V,N}.
Energy from ensemble average (derived in the previous section):
{Measured energy} = Ē(N, V, β) = [Σ_i E_i(N, V) e^(−βE_i(N,V))] / Q(N, V, β).
(1) Pressure from ensemble average: for a system in state i, dE_i = −p_i dV is the work done on the system for a volume change dV; thus we understand what pressure is for a single microstate:
p_i = −(∂E_i/∂V)_N
Macroscopic pressure:
p̄ = Σ_i p_i P_i = −Σ_i (∂E_i/∂V)_N e^(−βE_i(N,V)) / Q(N, V, β)
At this point it may seem logical to solve for β(𝓔) and then seek 𝓔(T). However, it turns out to be much better to directly seek β(T).
(2) The energy derivative:
{Calculus: f(x) = g(x)/h(x) ⇒ f'(x) = [h(x)g'(x) − h'(x)g(x)]/h²(x)}
(∂Ē/∂V)_{N,β} = (1/Q²) [ Q ( Σ_i (∂E_i/∂V)_N e^(−βE_i) − β Σ_i E_i (∂E_i/∂V)_N e^(−βE_i) ) + β ( Σ_i (∂E_i/∂V)_N e^(−βE_i) )( Σ_i E_i e^(−βE_i) ) ]
= −p̄ + β (E p)avg − β Ē p̄,   where (E p)avg = Σ_i E_i p_i P_i.
(3) The pressure derivative:
(∂p̄/∂β)_{N,V} = {similar to the energy derivative} = Ē p̄ − (E p)avg,   where (E p)avg = Σ_i E_i p_i P_i.
Assembling (1)-(3):
(∂Ē/∂V)_{N,β} + β (∂p̄/∂β)_{N,V} = −p̄
Compare ensemble averages vs. thermodynamics:
(∂Ē/∂V)_{N,β} + β (∂p̄/∂β)_{N,V} = −p̄   ⟺   (∂E/∂V)_{N,T} − T (∂p/∂T)_{N,V} = −p
Wrong sign? How do we fix this? Is there more than one solution?
β (∂p̄/∂β)_{N,V} = −T (∂p/∂T)_{V,N} ⇒ −dT/T = dβ/β ⇒
−ln T = ln β + C ⇒ −ln T − ln k = ln β ⇒ ln[1/(kT)] = ln β ⇒
β = 1/(kT),   k = constant,
P_i = e^(−E_i/(kT)) / Σ_i e^(−E_i/(kT))
What did we learn? In the derivation of the canonical ensemble, β appeared due to the freedom created by the total energy of the simulated heat bath being left undetermined, 𝓔. We now have the exact connection between the variable β and the thermodynamic temperature T. This defines temperature in the "microstate picture".
What is k in β = 1/(kT)?
Canonical ensemble with pairs of systems: like the canonical ensemble, but every subsystem = two separate systems, A and B, in thermal equilibrium; energy state occupations {a_i}, {b_i} ⇒
W(a, b) = [𝒜!/Π_i a_i!] · [ℬ!/Π_i b_i!]
Most probable distribution:
P_ij = [e^(−βE_i^A) / Σ_{i'} e^(−βE_{i'}^A)] · [e^(−βE_j^B) / Σ_{j'} e^(−βE_{j'}^B)] = P_i^A P_j^B
From thermodynamics ⇒ the systems have the same T. From above ⇒ the same β = 1/(kT). ⇒ Two arbitrary systems in thermal equilibrium have the same k ⇒ k is a general numerical constant = k_B. Take the numerical value from any system, e.g., the ideal gas: k_B = 1.38·10^-23 J/K = 8.6·10^-5 eV/K
Identifying the Entropy
Similar strategy as for β: use the 2nd + 1st laws to express thermodynamical entropy in internal energy change + work:
{1st law: δQ_rev = T dS, dE = δQ_rev + δW_rev} ⇒ dS = (1/T) δQ_rev = (1/T)(dE − δW_rev)
Corresponding quantities from the canonical ensemble: let the systems in the ensemble do some reversible work: an infinitesimal change of state (all systems in the ensemble change by dV) along a nice reversible path, in contact with a heat bath so that T = const:
δW̄_rev = −p̄ dV = −Σ_i P_i p_i dV = Σ_i P_i (∂E_i/∂V)_N dV = Σ_i P_i dE_i
Recall the partition function: Q = Σ_i e^(−βE_i) = Q(N, V, T) = "weighted sum over all states the system can occupy". Microcanonical case, all E_i = E_0 = 0 ⇒ Q = Ω(E_0).
What can we construct that looks like dS? Q is a weighted count of the microstates. Entropy is somehow connected to the number of microstates. But entropy is additive and Q is multiplicative. Try: ln Q.
Define f(β, {E_i}) = ln Q = ln(Σ_i e^(−βE_i)). Total differential:
df = (∂f/∂β)_{E_i} dβ + Σ_i (∂f/∂E_i)_{β,E_{j≠i}} dE_i
Note:
1. (∂f/∂β)_{E_i} = (∂f/∂Q)(∂Q/∂β) = [−Σ_i E_i e^(−βE_i)]/Q = −Ē
2. (∂f/∂E_i)_{β,E_{j≠i}} = (∂f/∂Q)(∂Q/∂E_i) = −β e^(−βE_i)/Q = −βP_i
This gives
df = −Ē dβ − β Σ_i P_i dE_i ⇒
d(f + βĒ) = df + Ē dβ + β dĒ = β dĒ − β Σ_i P_i dE_i ⇒
d(f + βĒ) = β (dĒ − δW̄_rev)
Compare state averages and thermodynamics:
⇐⇒
dS =
1 (dE − δWrev ) T
We have found a way to define entropy on the microstate level of the theory, which is consistent with thermodynamics! ¯ = d(kB (f + β E)) ¯ ⇒ dS = kB d(f + β E) ¯ S = kB ln Q + E/T + constant Set constant = 0. (Only ∆S is relevant in thermodynamics, constant related to third law.) Entropy in canonical ensemble: S = kB ln Q +
¯ E T
What did we learn? We can understand what entropy is on the microstate level! Its main component is: ln of a weighted count of the number of microstates. S is not a mechanistic entity, there is no Si to be summed up for all states. S is a property of the whole probability distribution. Its meaning is roughly “how large is the space of microstates this system may be in, given its current thermodynamic properties”. (More on this next lecture.)
Entropy in the Microcanonical Ensemble
Q = Σ_i e^(−βE_i) = {all E_i = E} = Ω(E) e^(−βE) ⇒
S = k_B ln(Ω(E) e^(−βE)) + Ē/T = k_B ln Ω − k_B βE + E/T = k_B ln Ω ⇒
Entropy in the microcanonical ensemble: S = k_B ln Ω
(This is written on Boltzmann's grave.)
A Very General Definition of Entropy: Gibbs' Entropy Formula
S = −k_B Σ_i P_i ln P_i
(See McQuarrie problem 2.5; hint: just plug in P_i = e^(−βE_i)/Q.)
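The agreement between the Gibbs formula and the canonical expression S = k_B ln Q + Ē/T can also be checked numerically; a sketch with a made-up set of energy levels:

```python
import math

# Check that S = -kB sum_i P_i ln P_i agrees with S = kB ln Q + E_bar / T
# for the canonical distribution over an arbitrary, made-up spectrum.
kB, T = 1.381e-23, 300.0
beta = 1.0 / (kB * T)
levels = [i * 1e-21 for i in range(6)]          # hypothetical energy levels, in J

Q = sum(math.exp(-beta * E) for E in levels)
P = [math.exp(-beta * E) / Q for E in levels]
E_bar = sum(p * E for p, E in zip(P, levels))

S_gibbs = -kB * sum(p * math.log(p) for p in P)
S_canon = kB * math.log(Q) + E_bar / T
print(S_gibbs, S_canon)   # identical up to rounding
```

The check works for any spectrum because ln P_i = −βE_i − ln Q, so −Σ P_i ln P_i = βĒ + ln Q term by term.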
Suggested Homework
• McQuarrie 2.5
• Exam problem: 2006E2-1(a)
• Exam problem: 2001E2-3(b)
• Exam problem: 2001E2-5(a-d) (we have not talked about alloys yet, but just use the entropy expression for the microcanonical ensemble and combinatorics.)
MIT 3.20, Rickard Armiento, Lecture 4
Lecture 4: Entropy Discussion. Physical Insights from Statistical Mechanics. • October 21, 2011, parts from McQuarrie 1.1 - 2.4
Past Lectures
Probabilities in the microcanonical ensemble: P_i = 1/Ω(E) = constant
Canonical partition function: Q = Σ_i e^(−E_i/(k_B T)) = Σ_E Ω(E) e^(−E/(k_B T))
Probabilities in the canonical ensemble: P_i = e^(−E_i/(k_B T))/Q
Entropy:
S = k_B ln Ω (microcanonical) = k_B ln Q + Ē/T (canonical) = −k_B Σ_i P_i ln P_i
S is a property of the probability distribution of states, not a regular variable.
Interpretation of the Partition Function
Partition function = "how the probabilities are partitioned among the microstates". Typical for probabilities:
P = (weight of one thing)/(summed weight of all things)   ⟺   P_i = e^(−E_i/(k_B T))/Q,   Q = Σ_i e^(−E_i/(k_B T)).
This interpretation fits well, since we got the sum in Q from normalizing the probabilities. Hence, Q is the "sum of all states multiplied by their weight" = "the size of the state space". Also note: the "size of the state space" is directly controlled by T.
Solving for Microstates
Solve ĤΨ = EΨ ⇒ energy levels with degeneracy. Count like this: Ψ_{j,k}, where j = energy state (0, 1, ...) and k counts the degenerate states for that energy, k = 1, 2, ..., Ω(E_j):
E_0: {Ψ_{0,1}, Ψ_{0,2}, Ψ_{0,3}, ..., Ψ_{0,Ω(E_0)}}
E_1: {Ψ_{1,1}, Ψ_{1,2}, Ψ_{1,3}, ..., Ψ_{1,Ω(E_1)}}
E_2: {Ψ_{2,1}, Ψ_{2,2}, Ψ_{2,3}, ..., Ψ_{2,Ω(E_2)}}
...
Probabilities: P_i = P_{j,k}. Example from lecture 1, one independent particle: ω ∼ 10^28. Many non-interacting particles: Ω ∼ 10^(10^23).
X
e−βEi = e−βE0
X
i
e−β(Ei −E0 ) = e−βE0
e−β∆Ei ;
(∆Ei ≥ 0); ⇒
i
i
Pi = Pj,k =
X
e−βE0 e−β∆Ej e−β∆Ej P P = −β∆Ej e−βE0 j,k e−β∆Ej j,k e
This shows that the probabilities are independent of E0 , you can set your energy scale so that E0 = 0. A partition function with only one energy E = 0: X Q= e−β0 = Ω j=[0]
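A quick numerical check that the probabilities do not change when every level is shifted by the same E_0 (the levels and the shift are arbitrary choices of mine):

```python
import math

# Shifting every level by the same E0 multiplies both the numerator and Q by
# the same factor e^(-beta*E0), so the probabilities P_i cannot depend on
# where the energy zero is set.
def probs(levels, beta=1.0):
    w = [math.exp(-beta * E) for E in levels]
    Z = sum(w)
    return [x / Z for x in w]

levels = [0.0, 0.5, 1.3, 1.3, 2.0]
shifted = [E + 7.0 for E in levels]            # arbitrary choice of energy zero
for p, q in zip(probs(levels), probs(shifted)):
    print(f"{p:.6f} {q:.6f}")                  # columns agree
```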
3rd Law in Statistical Mechanics: S → 0 as T → 0 (β → ∞)
P_i = P_{j,k} = e^(−βΔE_j) / Σ_{j,k} e^(−βΔE_j)
ΔE_0 = 0 (ground state): lim_{β→∞} e^(−βΔE_0) = 1.
ΔE_j > 0 (excited state): lim_{β→∞} e^(−βΔE_j) = 0.
⇒ lim_{β→∞} P_0 = lim_{β→∞} e^(−βΔE_0) / Σ_{j,k} e^(−βΔE_j) = 1/Ω(E_0)
lim_{β→∞} P_{i≠0} = lim_{β→∞} e^(−βΔE_j) / Σ_{j,k} e^(−βΔE_j) = 0
No ground state degeneracy:
P_0 = 1, P_{i≠0} = 0 ⇒ S = −k_B Σ_i P_i ln P_i = 0. Consistent with the 3rd law.
Physics does not depend on absolute S, but the 3rd law sets S = 0 for T → 0, which for no ground state degeneracy is consistent with our choice of 'constant = 0' when we derived entropy, since ln 1 = 0.
With ground state degeneracy:
P_0 = 1/Ω(E_0), P_{i≠0} = 0 ⇒ S = −k_B Σ_i P_i ln P_i = k_B ln Ω(E_0)
Inconsistent with the 3rd law? But: the ground state degeneracy is generally small compared to the huge numbers involved for larger T ⇒ the 3rd law is valid in the macroscopic / thermodynamic limit. Typical example: Ω(E_0) ∼ N, say 10^23 ⇒ k_B ln N = 1.38·10^-23 · ln(10^23) ∼ 10^-21 J/K ≈ 0.
Infinite Temperature Limit, T → ∞ (β → 0)
e^(−βΔE_i) → 1 for all states: all states are completely accessible. P_i → constant, the same for all states. S = maximal; the state space is as large as it can be. This is a general observation: if all states are accessible and equally probable, S is maximal.
Entropy for Intermediate Temperatures, 0 < T < ∞
Non-uniform distribution P_i. Intermediate entropy between S = 0 and S = S_max (possibly ∞).
Example: a solid, with E_ordered < E_disordered < E_liquid.
[Figure: probability distributions over ordered, disordered and liquid states.]
T low: P_ordered > P_disordered, P_liquid. The system resides in "few" states; disorder is energy-wise too expensive.
T intermediate: many more energy levels become available. The disordered phase has higher energy but higher degeneracy ⇒ P_disordered more significant. Possibly other ordered states.
T high: P_liquid more significant. Ω(E_liq) >> Ω(E_ordered). (More about this later: 2nd law, spontaneous processes.)
Interpretation of Entropy
The partition function ∼ size of the state space. Temperature acts as a tuning knob that limits the state space. ΔS_world increases ⇒ "nature wants to increase its state space" = heat transfers from one system to another if this increases the total state space. Why? Purely because of the statistics involved. All states are "possible", but a larger state space = more likely to find the system there. Huge numbers involved ⇒ laws of thermodynamics. The balance is controlled by the response in state-space size to a change in internal energy: (∂S/∂U)_{V,N} = 1/T.
[Figure: same illustration as in Lecture 2: equal a priori probability, P_A = P_B, but far more degenerate states "look like" spread out than all-on-the-left-side; poker-hand odds analogy.]
Connection to disorder: there are almost always more disordered states than ordered ones ⇒ disorder = high entropy, order = low entropy; i.e., the increase of state space usually means the world evolves from order → disorder.
Arrow of time, direction of memory: with reversible equations of motion, why is time moving forward? ΔS > 0 and the idea that disorder increases breaks time symmetry.
Entropy in Information Theory Claude Shannon discussed the nature of “lost information” in communication signals.
Claude Shannon: "My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"
M. Tribus, E.C. McIrvine, "Energy and information", Scientific American, 224 (1971).
Entropy in Information Theory
Entropy: a measure of uncertainty. Proportional to the number of optimal binary questions needed to exactly determine the state of a system. For the complete system:
• Only one state: S = 0, complete certainty, no questions needed.
• All probabilities equal: S = maximal uncertainty, maximal number of questions needed.
• Not all probabilities equal: S = intermediate, some questions needed to determine the state.
Entropy = "uncertainty": proportional to the number of optimal binary questions one needs to ask to determine exactly which state a system is in.
Eight-sided dice: which side of the dice faces up? Ask binary questions = questions that can be answered with a yes or no. Asking one such question = one bit of information.
[Figure: uniform probability distribution, P_i = 1/8 for faces 1-8.]
"Dumb" questions: Is it face 1? (Probability 1/8 that the answer is yes, 7/8 that it is no.) Is it face 2? ... Not optimal; average number of questions needed = (1/8)(1 + 2 + 3 + 4 + 5 + 6 + 7 + 7) = 4.375.
"Clever" questions: split the remaining possibilities in half with each question. For the fair dice: Question 1: {1,2,3,4} or {5,6,7,8}? Question 2: halve again. Question 3: pick the face. Number of optimal binary questions: (1/8)(3 × 8) = 3. Can get this as log2(8) = 3.
Unequal probabilities P_i: e.g., six possible states with P_1 = P_2 = 2/8 and P_3 = P_4 = P_5 = P_6 = 1/8. Cleverly chosen questions resolve the likely states with fewer questions:
[Figure: question tree over {1, ..., 6}: first {1, 2} vs {3, 4, 5, 6}, then halve.]
Average number of questions = 2 × 2/8 + 2 × 2/8 + 3 × 1/8 + 3 × 1/8 + 3 × 1/8 + 3 × 1/8 = 2.5
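Both dice results equal the Shannon entropy −Σ_i P_i log2 P_i; a minimal check:

```python
import math

# Average number of optimal binary questions = -sum_i P_i log2 P_i:
# 3 bits for the fair 8-sided dice, 2.5 bits for the unequal distribution
# (2/8, 2/8, 1/8, 1/8, 1/8, 1/8) from the example above.
def I(P):
    return -sum(p * math.log2(p) for p in P if p > 0)

fair = [1 / 8] * 8
unequal = [2 / 8, 2 / 8, 1 / 8, 1 / 8, 1 / 8, 1 / 8]
print(I(fair), I(unequal))   # 3.0 2.5
```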
Communication
Instead of a state, consider a communication stream of bits. What is the probability that the next bit is 0, 1?
• Only 0:s in the stream: S = 0, complete certainty.
• Completely random: S = maximal uncertainty.
• P_0 = 0.6, P_1 = 0.4: S = intermediate.
• Some complicated function of previous bits (think: a message in English): S = intermediate.
I = −Σ_i P_i log2 P_i = bits of "information" in the message.
General Case
Given a set of mutually exclusive possible states, i = 1, 2, ..., each with probability P_i known beforehand, the average number of optimal binary questions needed to find the state is
I = −Σ_i P_i log2 P_i
Suggested homework
• 2002E2-2
• 2003E2-1a
• 2007E2-5(a-c)
MIT 3.20, Rickard Armiento, Lecture 5
Lecture 5: Relations with Thermo. General Boundary Conditions (the box) • October 24, 2011, McQuarrie 2.4 - 3.2
What We Know
Ensemble averages: M_measured = M̄ = Σ_i M_i P_i.
P_i = probability of finding a system in microstate i of the ensemble.
Probabilities in the microcanonical ensemble: P_i = 1/Ω(E) = constant
Entropy in the microcanonical ensemble: S = k_B ln Ω(E)
Canonical partition function: Q = Σ_i e^(−E_i/(k_B T)) = Σ_E Ω(E) e^(−E/(k_B T))
Probabilities in the canonical ensemble: P_i = e^(−E_i/(k_B T))/Q
Entropy in the canonical ensemble: S = k_B ln Q + Ē/T
Completing the Connection to Thermodynamics

Let T, V, N be constant: canonical ensemble. The Helmholtz free energy is the thermodynamic potential with the same natural variables as the basis for the canonical ensemble:

A = Ē − TS = {S = k_B ln Q + Ē/T} = Ē − T(k_B ln Q + Ē/T) = −k_B T ln Q ⇒

Fundamental relation in natural variables: A(T, V, N) = −k_B T ln Q(T, V, N)

This connects the "microstate world" (Q) and the macroscopic thermodynamical world (A). All other relations follow from this one + thermodynamics. Deriving equations of state:

S = −(∂A/∂T)_{V,N}
p = −(∂A/∂V)_{T,N}
μ = +(∂A/∂N)_{T,V}
Equations of state in terms of Q:

S = k_B ln Q + k_B T (∂ ln Q/∂T)_{V,N}
p = k_B T (∂ ln Q/∂V)_{T,N}
μ = −k_B T (∂ ln Q/∂N)_{T,V}

It works equally well to just solve for the quantities:

A = Ē − TS ⇒ S = (Ē − A)/T = {A = −k_B T ln Q} = Ē/T + k_B ln Q

The energy:

Ē = A + TS = −k_B T ln Q + T(k_B ln Q + k_B T (∂ ln Q/∂T)_{V,N}) = k_B T² (∂ ln Q/∂T)_{V,N}

Another useful formula for the energy:

Ē = Σ_i E_i e^{-βE_i}/Q = {(∂Q/∂β)_{V,N} = −Σ_i E_i e^{-βE_i}; d ln f(x)/dx = f'(x)/f(x)} ⇒ Ē = −(∂ ln Q/∂β)_{V,N}

The partition function Q ⇒ everything (in the thermodynamical description of the system).
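The two expressions for Ē agree for any finite level spectrum. A small numerical sketch with an arbitrary, made-up three-level spectrum (the levels are illustrative only); the β-derivative is taken by central finite difference:

```python
import math

levels = [0.0, 1.0, 2.5]   # arbitrary illustrative energy levels
beta = 0.7                 # arbitrary inverse temperature

Q = sum(math.exp(-beta * E) for E in levels)
# Direct ensemble average: E_bar = sum_i E_i e^{-beta E_i} / Q
E_direct = sum(E * math.exp(-beta * E) for E in levels) / Q

# Same quantity from E_bar = -d(ln Q)/d(beta), via central finite difference
h = 1e-6
lnQ = lambda b: math.log(sum(math.exp(-b * E) for E in levels))
E_from_lnQ = -(lnQ(beta + h) - lnQ(beta - h)) / (2 * h)
```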
Example

N distinguishable systems that can each be in 2 energy states, ε_j = 0 or ε, i.e., ε_j = ε n_j with n_j ∈ {0, 1}:

E = ε Σ_{j=1}^N n_j

Q = Σ_{{n_j=0,1}} e^{-βε Σ_j n_j} = Σ_{{n_j=0,1}} Π_j e^{-βε n_j} = Π_j Σ_{n_j=0,1} e^{-βε n_j} = (Σ_{n=0,1} e^{-βεn})^N = (1 + e^{-βε})^N

(General result we will see later, Q = q^N for identical independent distinguishable systems.)

A = −k_B T ln Q = −k_B T ln(1 + e^{-βε})^N = −N k_B T ln(1 + e^{-βε}).
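The factorized closed form can be verified by brute-force enumeration of all 2^N microstates for a small N (a sketch; the values of ε and β are arbitrary):

```python
import math
from itertools import product

N = 6       # small enough to enumerate all 2^N microstates
eps = 1.0   # arbitrary level spacing
beta = 0.8  # arbitrary inverse temperature

# Brute force: sum e^{-beta eps sum_j n_j} over all occupation patterns (n_1,...,n_N)
Q_brute = sum(math.exp(-beta * eps * sum(ns)) for ns in product((0, 1), repeat=N))

# Closed form from factorizing the sum over each n_j independently
Q_closed = (1 + math.exp(-beta * eps)) ** N
```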
For T >> ε/k_B (βε << 1): A/N = −k_B T ln(2) ≈ −0.69 k_B T. For T << ε/k_B (βε >> 1): A/N → 0.

The monoatomic ideal gas: the number of available states >> the number of particles, so we can use the Boltzmann distribution! The degrees of freedom decouple (independent sum of Hamiltonians, same proof as for independent particles): q_atom = q_trans · q_elec · q_nucl. Nuclear energy levels are far apart (~1 MeV); atoms stay in the nuclear ground state at normal temperatures, so we can set q_nucl = 1. Commonly, electronic states have Δε ~ 1 eV and thus e^{-βΔε} is small (~10^{-17}), so we can set q_elec = 1. This is not true for all atoms (e.g., halogen atoms); McQuarrie discusses some cases.

Partition function for a particle in a box:

q_atom ≈ q_trans = Σ_l e^{-βε_l};  ε_l = ε_{n_x,n_y,n_z} = (h²/(8ma²))(n_x² + n_y² + n_z²) ≡ (n_x² + n_y² + n_z²)/M,  where 1/M = h²/(8ma²).
MIT 3.20, Rickard Armiento, Lecture 7
q_trans = Σ_{n_x=1}^∞ Σ_{n_y=1}^∞ Σ_{n_z=1}^∞ e^{-β(n_x² + n_y² + n_z²)/M} = {x, y, z independent} = (Σ_{n=1}^∞ e^{-βn²/M})³

β/M ~ 10^{-21} ⇒ MANY states, terms slowly changing with n ⇒ take the continuous limit:

q_trans = (∫_0^∞ e^{-βn²/M} dn)³ = {∫_0^∞ e^{-cx²} dx = (1/2)√(π/c)} = (πM/(4β))^{3/2} = {V = a³} = (2πm k_B T/h²)^{3/2} V
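How good is the sum → integral step? For realistic β/M the terms change so slowly that the two agree essentially exactly; a sketch with a greatly exaggerated β/M = 10^{-6} (instead of ~10^{-21}) so the direct sum stays short enough to evaluate:

```python
import math

beta_over_M = 1e-6   # exaggerated; realistic values are ~1e-21

# Direct sum over the quantum number n until the terms are negligible
s, n = 0.0, 1
while True:
    term = math.exp(-beta_over_M * n * n)
    if term < 1e-18:
        break
    s += term
    n += 1

# Continuous limit: integral_0^inf e^{-c n^2} dn = (1/2) sqrt(pi/c)
integral = 0.5 * math.sqrt(math.pi / beta_over_M)

rel_err = abs(s - integral) / integral
```

Even with this exaggerated β/M the relative error is below 10^{-3}; for β/M ~ 10^{-21} it is utterly negligible.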
Alternative way, using the density of states ω(ε) in the continuous limit from lecture 1:

q_trans = Σ ω(ε) e^{-βε} = ∫_0^∞ ω(ε) e^{-βε} dε = {ω(ε) from 1st lecture = (π/4) M^{3/2} ε^{1/2}} = ... = same thing

Canonical partition function for the monoatomic ideal gas (with no electronic excitations):

q_atom ≈ q_trans = (2πm k_B T/h²)^{3/2} V = V/Λ³

Q = q_trans^N/N! = (1/N!)(2πm k_B T/h²)^{3N/2} V^N = V^N/(Λ^{3N} N!),  Λ = (h²/(2πm k_B T))^{1/2}

The length Λ is a "thermal De Broglie wavelength". The criterion for Boltzmann statistics becomes: Λ³/V is small, i.e., Λ << the dimensions of the container. (See McQuarrie, end of section 5-1.)

Thermodynamic connection:

A = −k_B T ln Q = −k_B T ln(q^N/N!) = −N k_B T ln q + k_B T ln N! = {Stirling's approximation: ln N! ≈ N ln N − N} = −N k_B T ln q + N k_B T ln N − N k_B T ⇒

A = −N k_B T ln[(2πm k_B T/h²)^{3/2} V/N] − N k_B T
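Plugging in numbers makes the Boltzmann criterion concrete. A sketch for argon at room temperature and 1 atm (the gas choice is arbitrary; SI constants):

```python
import math

h  = 6.62607015e-34   # Planck constant, J s
kB = 1.380649e-23     # Boltzmann constant, J/K
u  = 1.66053907e-27   # atomic mass unit, kg

m = 39.948 * u        # one argon atom
T = 300.0             # K

# Thermal de Broglie wavelength, Lambda = (h^2 / (2 pi m kB T))^{1/2}
Lam = math.sqrt(h**2 / (2 * math.pi * m * kB * T))

# Compare with the interparticle spacing of an ideal gas at p = 1 atm
p = 101325.0
v_per_atom = kB * T / p        # V/N from p = N kB T / V
spacing = v_per_atom ** (1/3)
ratio = Lam / spacing          # << 1 means Boltzmann statistics is fine
```

For argon this gives Λ of order 10^{-11} m versus a spacing of a few nm, so the criterion is satisfied by a wide margin.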
Step 1, the pressure:

dA = −S dT − p dV + μ dN ⇒ p = −(∂A/∂V)_{N,T} = {only the term N k_B T ln V depends on V} = (∂/∂V)(N k_B T ln V)_{N,T} ⇒

Pressure for the ideal gas (equation of state 1):

p = N k_B T/V
Step 2, the energy:

A = Ē − TS, dA = −S dT − p dV + μ dN ⇒ S = −(∂A/∂T)_{V,N} = {A = −k_B T ln Q} = k_B ln Q + k_B T (∂ ln Q/∂T)_{V,N} ⇒

Ē = A + TS = k_B T² (∂ ln Q/∂T)_{N,V} = k_B T² (∂ ln T^{3N/2}/∂T)_{N,V}

Energy in the ideal gas (2nd equation of state):

Ē = (3/2) N k_B T

The other equation of state valid for a monoatomic ideal gas!
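Both equations of state follow mechanically from ln Q; a sketch checking them by finite differences, in dimensionless units with k_B = 1 (the additive constants in ln Q drop out of the derivatives, so they are omitted):

```python
import math

N = 100.0   # particle number (appears only as a prefactor here)
kB = 1.0    # dimensionless units

def lnQ(V, T):
    # ln Q = N [ ln V + (3/2) ln T + const ]; the constant is irrelevant for derivatives
    return N * (math.log(V) + 1.5 * math.log(T))

V, T = 2.0, 1.5
h = 1e-6

# p = kB T (d lnQ / dV)_T   and   E = kB T^2 (d lnQ / dT)_V
p = kB * T * (lnQ(V + h, T) - lnQ(V - h, T)) / (2 * h)
E = kB * T**2 * (lnQ(V, T + h) - lnQ(V, T - h)) / (2 * h)
```

The numerical derivatives reproduce p = N k_B T/V and Ē = (3/2) N k_B T.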
Step 3, the chemical potential:

μ = (∂A/∂N)_{T,V} = −k_B T ln[(2πm k_B T/h²)^{3/2} V/N] + k_B T − k_B T = −k_B T ln[(2πm k_B T/h²)^{3/2} V/N] = {V/N = k_B T/p} =

= −k_B T ln[(2πm k_B T/h²)^{3/2} k_B T] + k_B T ln p = μ_0 + k_B T ln p,

where μ_0 = μ_0(T) ≡ −k_B T ln[(2πm k_B T/h²)^{3/2} k_B T].
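The cancellation of the +k_B T and −k_B T terms can be checked by differentiating A(N) numerically. A sketch in dimensionless units, with the prefactor (2πm k_B T/h²)^{3/2} replaced by an arbitrary constant c (its value does not affect the cancellation):

```python
import math

kB, T, V = 1.0, 1.2, 5.0   # dimensionless units
c = 3.7                    # stands in for (2 pi m kB T / h^2)^{3/2}; arbitrary here

def A(N):
    # A = -N kB T ln(c V / N) - N kB T   (Stirling already applied)
    return -N * kB * T * math.log(c * V / N) - N * kB * T

N, h = 50.0, 1e-4
mu_numeric = (A(N + h) - A(N - h)) / (2 * h)        # (dA/dN)_{T,V}
mu_closed = -kB * T * math.log(c * V / N)           # closed form from the lecture
```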
Suggested homework
• McQuarrie problem 4-12
• Exam problem 2003E2-5
• Exam problem 2001E2-4(a)
• Go through the derivation for an ideal gas in the lecture, and check what changes if we had set q_elec = ω_e1 + ω_e2 e^{-βΔε_12}. (Check McQuarrie 5-3 if you need help.)
MIT 3.20, Rickard Armiento, Lecture 8
Lecture 8: Ideal diatomic gas. Rotational and vibrational DOF • October 31, 2011, McQuarrie 6
Useful Formulas
• Averages: M_measured = {1st postulate} = M̄ = Σ_i P_i M_i
• Microcanonical ensemble: P_i = 1/Ω; Ω(N, V, E) = degeneracy for energy E. Easiest connection to thermodynamics: S = k_B ln Ω(N, V, E)
• Canonical ensemble: P_i = e^{-E_i/(k_B T)}/Q; Q(N, V, T) = Σ_E Ω(E) e^{-E/(k_B T)} = Σ_i e^{-E_i(V)/(k_B T)}. Easiest connection to thermodynamics: A = −k_B T ln Q(N, V, T)
• Grand canonical ensemble: P_i = e^{(-E_i + μN)/(k_B T)}/Ξ; Ξ(V, T, μ) = Σ_N Q(N, V, T) e^{μN/(k_B T)} = Σ_N Σ_i e^{-E_{N,i}(V)/(k_B T) + μN/(k_B T)}. Easiest connection to thermodynamics: Φ = −k_B T ln Ξ(V, T, μ)
• In large "thermodynamical" systems → all ensembles give the same result.
• Independent distinguishable particles: Q = q_A · q_B · q_C · ... = {if identical} = q^N
• Independent indistinguishable particles in the high-temperature, low-density limit: Q = q^N/N!
Diatomic Molecule

Assume the degrees of freedom are independent. This requires the Born-Oppenheimer approximation: electrons reach equilibrium on a much shorter timescale than the ionic motion. Also assume the amplitude and timescale of vibrations are so small that they do not interfere with rotations (the "rigid-rotor approximation").

Ĥ_single particle = Ĥ_trans + Ĥ_rot + Ĥ_vib + Ĥ_elec + Ĥ_nucl

(Rigid-rotor approximation: for now, assume fixed r. For Ĥ_elec and Ĥ_nucl: assume big energy spacing.)

[Figure: diatomic molecule with its degrees of freedom — 1) translation, 2) rotation, 3) vibration.]

Q = q_single particle^N/N!  ← assume normal conditions, Boltzmann statistics OK

q_single particle = q_trans · q_rot · q_vib · q_elec · q_nucl,  where we can set q_elec · q_nucl = 1.
The Two-Body Problem

Two bodies in a potential that only depends on their distance. Energy in classical physics:

E = (1/2) m_1 ṙ_1² + (1/2) m_2 ṙ_2² + v(|r_2 − r_1|)

Reorganize to get:

E = (1/2)(m_1 + m_2) Ṙ² + (1/2)(m_1 m_2/(m_1 + m_2)) ṙ² + v(|r|),  R = (m_1 r_1 + m_2 r_2)/(m_1 + m_2),  r = r_2 − r_1

Hence, the problem is equivalent to free translational motion of the center of mass, R, using the sum of the masses, m = m_1 + m_2, plus the independent movement, r, of one "virtual" particle with reduced mass μ = m_1 m_2/(m_1 + m_2) in the potential v. In our quantum mechanical formulation:

Ĥ = Ĥ_trans^(m_1+m_2) + Ĥ_rot^(μ) + Ĥ_vib^(μ)
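The kinetic-energy split is easy to verify numerically with arbitrary masses and velocities (a sketch; the numbers mean nothing physically):

```python
m1, m2 = 2.0, 3.0
v1 = (0.4, -1.1, 0.3)   # velocity of body 1 (arbitrary)
v2 = (-0.2, 0.5, 1.0)   # velocity of body 2 (arbitrary)

dot = lambda a, b: sum(x * y for x, y in zip(a, b))

# Direct kinetic energy of the two bodies
E_direct = 0.5 * m1 * dot(v1, v1) + 0.5 * m2 * dot(v2, v2)

# Center-of-mass + relative-motion split
M = m1 + m2
mu = m1 * m2 / M                                      # reduced mass
V_cm = tuple((m1 * a + m2 * b) / M for a, b in zip(v1, v2))
v_rel = tuple(b - a for a, b in zip(v1, v2))
E_split = 0.5 * M * dot(V_cm, V_cm) + 0.5 * mu * dot(v_rel, v_rel)
```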
Translations

Easy! Same as we derived for the monoatomic ideal gas, but with m = m_1 + m_2:

q_trans = (2π(m_1 + m_2) k_B T/h²)^{3/2} V = V/Λ³  ⇒  E_trans = (3/2) N k_B T
Vibrations

Harmonic approximation (small vibrations) = harmonic oscillator. Taylor-expand the potential around the equilibrium distance r_e:

v(r) = v(r_e) + (∂v/∂r)|_{r=r_e} (r − r_e) + (1/2)(∂²v/∂r²)|_{r=r_e} (r − r_e)² + ...

Set the energy scale so that v(r_e) = 0; the first derivative = 0 at equilibrium; define F ≡ (∂²v/∂r²)|_{r=r_e}. Harmonic oscillator with the reduced mass (r measured from r_e):

Ĥ_vib = −(ħ²/(2μ)) ∂²/∂r² + (1/2) F r²;  reduced mass: μ = m_1 m_2/(m_1 + m_2)

From: Wikipedia (Mark Somoza, 2006).
Harmonic oscillator solution (see lecture 1):

ε_n = (n + 1/2) h ν;  n = 0, 1, 2, ...;  ν = (1/(2π)) √(F/μ)  (note: h, not ħ)

q_vib = Σ_n e^{-βε_n} = Σ_{n=0}^∞ e^{-β(n+1/2)hν} = e^{-βhν/2} Σ_{n=0}^∞ (e^{-βhν})^n = {geometric series: Σ_{n=0}^∞ x^n = 1/(1 − x)} = e^{-βhν/2}/(1 − e^{-βhν})

Introduce the "vibrational temperature" Θ_v = hν/k_B ⇒

Vibrational partition function:

q_vib = e^{-Θ_v/(2T)}/(1 − e^{-Θ_v/T})
"Number of thermally available vibrational states." (Θ_v values: see McQuarrie, Statistical Mechanics, Table 6-1.)

Probability of finding a molecule in vibrational state n:

P_n = e^{-βε_n}/q_vib;  P_{n>0} = 1 − P_0 = e^{-Θ_v/T}

N_2: Θ_v = 3374 K, T = 300 K ⇒ P_{n>0} ≈ 1.3 · 10^{-5}
K_2: Θ_v = 133 K, T = 300 K ⇒ P_{n>0} ≈ 0.64

Only the first, or the first few, states are excited at room temperature. The vibrational energy:

E_vib = k_B T² (∂ ln(q_vib^N/N!)/∂T)_{V,N} = {N! independent of T} = N k_B T² (∂/∂T)[−Θ_v/(2T) − ln(1 − e^{-Θ_v/T})] ⇒

E_vib = N k_B (Θ_v/2 + Θ_v/(e^{Θ_v/T} − 1))

N_2, T = 300 K ⇒ E_vib/N ≈ 0.15 eV/molecule
K_2, T = 300 K ⇒ E_vib/N ≈ 0.03 eV/molecule
Compare: chemical bonds ~ 1 eV.
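The N_2 and K_2 numbers above are easy to reproduce; a sketch using the Θ_v values quoted in the lecture:

```python
import math

kB_eV = 8.617333262e-5   # Boltzmann constant in eV/K
T = 300.0                # room temperature, K

def p_excited(theta_v):
    # P_{n>0} = 1 - P_0 = e^{-Theta_v / T}
    return math.exp(-theta_v / T)

def E_vib_per_molecule(theta_v):
    # E_vib/N = kB (Theta_v/2 + Theta_v / (e^{Theta_v/T} - 1)), in eV
    return kB_eV * (theta_v / 2 + theta_v / (math.exp(theta_v / T) - 1))

p_N2, p_K2 = p_excited(3374.0), p_excited(133.0)
E_N2, E_K2 = E_vib_per_molecule(3374.0), E_vib_per_molecule(133.0)
```

Note that almost all of the N_2 value comes from the zero-point term Θ_v/2.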
Rotations

Hamiltonian for the rigid rotor (in spherical coordinates):

Ĥ = −(ħ²/(2I)) [(1/sin θ) ∂/∂θ (sin θ ∂/∂θ) + (1/sin²θ) ∂²/∂φ²]

where I = moment of inertia = μR_0². Solution:

ε_J = J(J + 1) h²/(8π²I);  J = 0, 1, 2, ...;  degeneracy ω(ε_J) = 2J + 1

The energy scale is set by h²/(8π²I). Typical values → 10^{-4} eV = microwave region. (Which is how a microwave oven works.) Note: ~100 times as tight energy spacing as vibrational spectra.
Rotations in an Unsymmetrical Molecule (e.g., NO)

Usual way of forming the partition function:

q_rot = Σ ω(ε) e^{-βε} = Σ_J (2J + 1) e^{-βJ(J+1)h²/(8π²I)}

As for vibrations, define a characteristic rotational temperature Θ_r = h²/(8π²I k_B) ⇒

q_rot = Σ_J (2J + 1) e^{-J(J+1)Θ_r/T}

We will look at T >> Θ_r and T << Θ_r separately.

1) If T >> Θ_r, we can handle this sum similarly to how we handled the translational freedom of the monoatomic gas in lecture 7 = many, many terms which are very similar → continuous limit:

q_rot = ∫_0^∞ (2J + 1) e^{-J(J+1)Θ_r/T} dJ = {J' = J(J + 1), dJ' = (2J + 1) dJ} = ∫_0^∞ e^{-J'Θ_r/T} dJ' = T/Θ_r

Sum → integral can be seen as the first term in an Euler-Maclaurin expansion (see McQuarrie 6-3), which gives:

Rotational partition function for an unsymmetrical diatomic molecule and T >> Θ_r:

q_rot = (T/Θ_r)(1 + (1/3)(Θ_r/T) + (1/15)(Θ_r/T)² + ...)
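The Euler-Maclaurin result can be checked against the direct sum; a sketch for T/Θ_r = 20, well into the high-temperature regime:

```python
import math

ratio = 20.0   # T / Theta_r (arbitrary, but >> 1)

# Direct sum over J; terms die off quickly once J(J+1) >> T/Theta_r
q_sum = sum((2 * J + 1) * math.exp(-J * (J + 1) / ratio) for J in range(200))

# Euler-Maclaurin: q_rot ≈ (T/Theta_r)(1 + (1/3)(Theta_r/T) + (1/15)(Theta_r/T)^2)
x = 1.0 / ratio
q_em = ratio * (1 + x / 3 + x * x / 15)
```

Already at T = 20 Θ_r the truncated expansion agrees with the exact sum to better than one part in 10^4.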
E_rot = k_B T² (∂ ln q_rot^N/∂T)_{V,N} = N k_B T + ...

2) On the other hand, if T is small or Θ_r is large, the terms drop off quickly → just cut the sum after enough terms. (Mostly relevant at very low temperature, and for hydrogen at low temperatures.)

Rotational partition function for an unsymmetrical diatomic molecule and T < Θ_r:

q_rot = Σ_J (2J + 1) e^{-J(J+1)Θ_r/T} = 1 + 3e^{-2Θ_r/T} + 5e^{-6Θ_r/T} + ...
Rotations in a Symmetric Molecule (e.g., H_2)

Indistinguishability again: exchanging the two nuclei requires a symmetric or antisymmetric wavefunction. This requires a careful QM treatment (see McQuarrie 6-4, 6-5). However, for high temperatures it works out to just a factor 1/2, roughly meaning that there are half as many states due to inverted states being identical.

Rotational partition function for a symmetric diatomic molecule and T >> Θ_r:

q_rot = (T/(2Θ_r))(1 + (1/3)(Θ_r/T) + (1/15)(Θ_r/T)² + ...)

Probability to find the system in a state with a given J (for both the symmetric and unsymmetrical case):

P_J = (2J + 1) e^{-J(J+1)Θ_r/T}/q_rot → has an intermediate maximum, J_max
Assembling the Full Partition Function

(Characteristic temperatures: see McQuarrie, Statistical Mechanics, Table 6-1.)

Q = (q_trans q_rot q_vib)^N/N!  (Boltzmann statistics → decoupled degrees of freedom)

Thermodynamic connection:

A = −k_B T ln Q = −N k_B T (ln q_trans + ln q_rot + ln q_vib) + k_B T ln N! = A_trans + A_rot + A_vib + k_B T ln N!

Equation of state from the pressure:

p = −(∂A/∂V)_{N,T} = {only q_trans depends on volume} = N k_B T/V

Same as monoatomic!
Energy:

E = k_B T² (∂ ln Q/∂T)_{N,V} = E_trans + E_rot + E_vib

E_trans = (3/2) N k_B T
E_rot = {T >> Θ_r: N k_B T; otherwise: more complicated}
E_vib = N k_B (Θ_v/2 + Θ_v/(e^{Θ_v/T} − 1))

Heat capacity:

C_V = (∂E/∂T)_{V,N} = (∂(E_trans + E_rot + E_vib)/∂T)_{V,N} = C_V,trans + C_V,rot + C_V,vib

C_V,trans = (3/2) N k_B
C_V,rot = {T >> Θ_r: N k_B; otherwise: more complicated}
C_V,vib = N k_B (Θ_v/T)² e^{Θ_v/T}/(e^{Θ_v/T} − 1)² = {T >> Θ_v: → N k_B, from the Taylor expansion e^{Θ_v/T} = 1 + Θ_v/T + ...}
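Putting the three contributions together reproduces the characteristic plateaus C_V/(N k_B) = 3/2 → 5/2 → 7/2 as rotations and then vibrations unfreeze. A sketch for the regime T >> Θ_r (so C_V,rot = N k_B), using the N_2 value Θ_v = 3374 K from the lecture:

```python
import math

def cv_vib(T, theta_v):
    # C_V,vib / (N kB) = (Theta_v/T)^2 e^{Theta_v/T} / (e^{Theta_v/T} - 1)^2
    x = theta_v / T
    return x * x * math.exp(x) / (math.exp(x) - 1) ** 2

theta_v = 3374.0   # N2, from the lecture

# C_V / (N kB) = trans (3/2) + rot (1, valid for T >> Theta_r) + vib
cv_room = 1.5 + 1.0 + cv_vib(300.0, theta_v)     # vibrations frozen out
cv_hot  = 1.5 + 1.0 + cv_vib(30000.0, theta_v)   # T >> Theta_v
```

At room temperature the vibrational contribution is negligible (C_V ≈ 5/2 N k_B); at T >> Θ_v it saturates at N k_B, giving C_V ≈ 7/2 N k_B.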