This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

From the lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you will have the opportunity to catch up. We examine the concept of the standard entropy made possible by the Third Law of Thermodynamics. The measurement of Third Law entropies from constant-pressure heat capacities is explained and, for gases, compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you with the opportunity to demonstrate mastery in applying these concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

We've already worked extensively with entropy and the partition function, but

now I want to continue our exploration by looking at what happens near 0 Kelvin.

And so I'll recall for you that the entropy can be expressed as Boltzmann's constant times the log of the partition function, plus kT times the partial derivative of the log of the partition function with respect to temperature, holding the number of particles and the volume constant.
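Written out, the expression just recalled is:

```latex
S = k \ln Q + k T \left( \frac{\partial \ln Q}{\partial T} \right)_{N,V}
```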

And now, let's actually expand Q, and remember what the partition function is.

It's the sum over all states of the exponential of minus the energy of that state divided by Boltzmann's constant times the temperature.

So, I've swapped that in for Q here in the first term, and I've swapped it in for Q inside the partial derivative of log Q, and I've actually carried out the differentiation. We've taken this derivative multiple times before: because T appears in the exponent, differentiating pulls down a factor of E sub j over kT squared. The k cancels the k in the kT out front, so I'm left with a 1 over T, this E sub j term, and the partition function appearing again in the denominator. So, I'll let you revisit that derivative if you'd like to, but it's one we've taken before.
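With Q written as a sum over states and the derivative carried out, the two terms become:

```latex
S = k \ln \sum_j e^{-E_j/kT}
  + \frac{1}{T}\,
    \frac{\sum_j E_j\, e^{-E_j/kT}}{\sum_j e^{-E_j/kT}}
```

The factor of 1/T is what survives after the k pulled down from the exponent cancels against the k in kT, and the denominator of the second term is the partition function appearing again.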

What I want to do here is explore the question of how this expression behaves as the temperature goes to 0. Which is to say, is the statistical thermodynamic definition of entropy consistent with the third law of thermodynamics? Well, let's look at the partition function as temperature goes to 0, the low-temperature partition function.

And, to make things a little more convenient, let me work with levels,

instead of states. Just to remember what that means: I'm now going to switch the index over which my sum runs. I'm going to sum over energy levels, grouping together all states that share the same energy. Each level is weighted by its degeneracy g, which counts how many states have that energy, times e to the minus the energy of that level over kT. And let me in fact then pull out from

this expression e to the minus ground state energy, E0 over kT.

So, I'll take that out as a constant; it's just some number multiplying this sum. And so inside the sum I now have the degeneracy of a given level times e to the minus the difference between that level's energy and the ground state energy.

Right? And so here I have e to the minus E0 over kT, and here I have e to the minus a minus, which is plus E0 over kT. So I've effectively multiplied by 1, and split that 1 into a factor outside the summation and a factor inside it.

But the utility of that is, as the temperature goes to 0, the argument of this exponential is divided by something going to 0. So each term becomes e to the minus a very, very large number, which itself goes to 0, and so the term for every level other than the ground level vanishes.

And I will be left, in that case, with simply this: the limit as T goes to 0 of the partition function is the degeneracy of the ground state times e to the minus E0 over kT. So, given that that's the low-temperature

behavior of the partition function, let me then return to my general expression,

here, and apply that same sort of analysis to all these sums over exponentials. I'll end up with this sum replaced by g0 times e to the minus E0 over kT. This sum is also replaced by that; of course, it was being multiplied by the state energy, so the only term that survives is the ground-state one, and that E0 sticks around.

And this is, again, just Q re-expressed. And so now, if I look at the first term, I have a log of a product. So I will get k log of the degeneracy, plus k times the log of an exponential, which just becomes the argument of the exponential. So I'll get negative E0 over kT multiplied by Boltzmann's constant; the k cancels, and that's just minus E0 over T.

And now if I go and look at what survived over here.

g0 over g0, that cancels. E to the minus E0 over kT, in numerator and denominator, that cancels. The only thing left is E0 over T. So, these two terms are equal and opposite to one another, and finally all that is left is k log of the degeneracy of the ground state. Alright, and therefore, as the temperature goes to 0, the entropy determined from the partition function also goes to 0, to within a term set by the ground-state degeneracy.
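Carrying out that cancellation explicitly:

```latex
\lim_{T\to 0} S
= k \ln\!\left(g_0\, e^{-E_0/kT}\right)
  + \frac{1}{T}\,\frac{E_0\, g_0\, e^{-E_0/kT}}{g_0\, e^{-E_0/kT}}
= k \ln g_0 - \frac{E_0}{T} + \frac{E_0}{T}
= k \ln g_0
```

so for a non-degenerate ground state (g0 = 1), the statistical entropy goes to 0 at 0 K, exactly as the third law requires.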

So, let's really do this. Let's use all these tools to do a first

principles computation of entropy, a third law entropy.

All right, and we're going to do it for, well I'll say what we'll do it for in a

moment. First, let's just recapitulate the

relevant equations. Here is the entropy expressed in terms of

the partition function. And lets use an ideal gas partition

function which allows us to express the total partition function as a molecular

partition function to the nth power over n factorial.

In particular, let's do a diatomic ideal gas; you can go look this equation up again in video 4.6 if you want to. It has a translational component to the partition function, a rotational, a vibrational, and an electronic.
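The structure being recalled from video 4.6 is:

```latex
Q = \frac{q^N}{N!},
\qquad
q = q_{\mathrm{trans}}\, q_{\mathrm{rot}}\, q_{\mathrm{vib}}\, q_{\mathrm{elec}}
```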

And what we need to do, and this is a wonderful math exercise, is take this little q, plug it in here to the Nth power over N factorial in order to get capital Q, take the partial derivative of its log with respect to T, and then also multiply the log of it by Boltzmann's constant. So I have here a great math exercise. I'm not actually going to do it term by

term here. It's a lovely thing to sit down at a

table with and try to verify what I'm about to show you, but it's just

straightforward differentiation. Here's the final expression, just barely

fits on the slide. So, I'm going to use N equal to Avogadro's number, so that I have the molar entropy.

And I'm going to divide by R, because all these terms multiply R.

So, just for simplicity's sake, I'll put the R over here on the left hand side.

And so I get this term log of things that look like they might be associated with

the translational partition function. Something that involves the rotational

temperature, the vibrational temperature, another term in vibrational temperature.

The ground state degeneracy of the electronic state. And you also see lurking in here some e's; that's really the number e, the base of the natural logarithm.

And if you think about it, if I take the log of e to the five halves, that just gives five halves, and this is in units of R. So, somewhere in here is a five-halves R contribution; you might remember that the entropy picks up half an R from translation in each direction, and you get another R out of Stirling's approximation. So, as I said, I'll let you verify this expression as a good side exercise if you'd like.
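For reference, the expression on the slide has the standard textbook form for a diatomic ideal gas, with m1 and m2 the atomic masses, V-bar the molar volume, sigma the symmetry number, Theta-rot and Theta-vib the rotational and vibrational temperatures, and g_e0 the ground electronic state degeneracy:

```latex
\frac{\bar{S}}{R}
= \ln\!\left[\left(\frac{2\pi (m_1 + m_2) k T}{h^2}\right)^{3/2}
    \frac{\bar{V}\, e^{5/2}}{N_A}\right]
+ \ln\frac{T e}{\sigma\, \Theta_{\mathrm{rot}}}
+ \frac{\Theta_{\mathrm{vib}}/T}{e^{\Theta_{\mathrm{vib}}/T} - 1}
- \ln\!\left(1 - e^{-\Theta_{\mathrm{vib}}/T}\right)
+ \ln g_{e0}
```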

But really, what I'd like to do, in terms of a little self-assessment here, is look at this expression and try to do a bit of a sanity check: when might you expect entropy to be large? Comparing one molecule to another, which would have more or less entropy? So let me give you a moment to think about that, and then we'll come back.

All right, hopefully the sanity check made sense. Now let's take a closer look at this expression for the molar entropy of a diatomic ideal gas.

So, this first term contains all of the contribution from translation.

It also contains, by convention, the Stirling's-approximation term that comes from the log of N factorial when you work out the full computation of the entropy; that term gets lumped into the translational piece as well.

There's a term deriving from the rotations of the diatomic molecule, two

terms that derive from the vibration. There's only one vibration in a diatomic,

and finally there's an electronic term. And so what do we need in order to

actually compute the entropy? Well, this is nitrogen, nitrogen gas.

And so, appearing in this first expression are the masses of the two atoms in the diatomic. If we were doing two nitrogen-14 isotopes, m1 would be equal to m2, so the total is about 28 atomic mass units. It's a tiny bit different, because the atomic mass unit is defined relative to carbon-12, but close enough; you can go look up the exact number in kilograms if you want to.

The symmetry number appears in the rotational term, and it's 2 for a homonuclear diatomic. We also need the rotational temperature; we had it on previous slides, and you can look it up as well: it's 2.88 kelvin. In the next two terms, the only thing we need to know is the vibrational temperature. Again, that's something we've had in prior videos, and you can find it in tables: 3374 kelvin. And lastly, you need to know the ground electronic state degeneracy, and it's 1 for nitrogen; the ground state is non-degenerate. So, if I take all these values and plug

them in, and everything else up here is a constant, Boltzmann's constant, pi,

Planck's constant, Avogadro's number. By convention we choose a molar volume at

which we're tabulating things, and that's the volume of an ideal gas, say, at room

temperature. So, plug all those numbers in, and I'll just give the numbers piece by piece: 150.4 joules per kelvin per mole derives from this first term, translation; 41.13 from rotation; 1.15 times 10 to the minus 3 from vibration, which is really quite small compared to the other two terms; and none at all from the electronic degeneracy. Because really, there's no disorder associated with the electronic state: it's in exactly one state, the ground state, and so there's no entropy, because there's no disorder. Similarly, this vibrational temperature

is so high, 3,374 kelvin, that effectively all the vibrations are in the ground state. So, again, there's not much disorder there, and it only contributes about a thousandth of a joule per kelvin per mole.

But add them all together and you get that the standard entropy is 191.5 joules per kelvin per mole, and I should put a bar over this: it really is the molar entropy. And I'll just remind you that if you go

back to video 7.3, I showed an example using tabulated experimental data that would have come from heat capacity measurements. The value that was determined there was 191.6 joules per kelvin per mole. So, within 0.1 joule per kelvin per mole,

that is quantitative agreement. So, I think this is an amazing result. It is an incredibly powerful example of just how much insight we gain into the behavior of a macroscopic gas because we understand its molecular properties. By plugging those molecular properties into an expression that depends on them, we were able to compute the entropy effectively exactly.
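As a check on the arithmetic, here is a short numerical sketch of this computation for N2 using the parameters quoted in the lecture (rotational temperature 2.88 K, vibrational temperature 3374 K, symmetry number 2, ground electronic degeneracy 1). The molar mass of 28.014 g/mol, the temperature of 298.15 K, and the 1 bar standard pressure used for the molar volume are assumed values not pinned down in the transcript:

```python
import math

# Physical constants (SI units)
k = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro constant, 1/mol
R = k * N_A             # gas constant, J/(K*mol)

# Parameters for N2 (theta values as quoted in the lecture)
m = 28.014e-3 / N_A     # mass of one N2 molecule, kg (assumed molar mass)
theta_rot = 2.88        # rotational temperature, K
theta_vib = 3374.0      # vibrational temperature, K
sigma = 2               # symmetry number for a homonuclear diatomic
g_e0 = 1                # ground electronic state degeneracy

T = 298.15              # assumed room temperature, K
V_m = R * T / 1.0e5     # molar volume of an ideal gas at 1 bar, m^3/mol

# Dimensionless contributions to S/R for a diatomic ideal gas
s_trans = math.log((2 * math.pi * m * k * T / h**2) ** 1.5
                   * V_m * math.e**2.5 / N_A)
s_rot = math.log(T * math.e / (sigma * theta_rot))
x = theta_vib / T
s_vib = x / math.expm1(x) - math.log(1.0 - math.exp(-x))
s_elec = math.log(g_e0)  # zero for a non-degenerate ground state

S_molar = R * (s_trans + s_rot + s_vib + s_elec)
print(f"translation: {R * s_trans:.1f} J/(K*mol)")   # ~150.4
print(f"rotation:    {R * s_rot:.2f} J/(K*mol)")     # ~41.13
print(f"vibration:   {R * s_vib:.4f} J/(K*mol)")     # ~0.001
print(f"total:       {S_molar:.1f} J/(K*mol)")       # ~191.5
```

The individual pieces reproduce the lecture's 150.4 and 41.13 J/(K·mol), and the total lands within about 0.1 J/(K·mol) of the calorimetric 191.6 quoted from video 7.3.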

And importantly, maybe even beyond that, we know where that entropy comes from. We know that roughly three quarters of it comes from the translation, and most of the rest comes from the rotation, and that, I think, is a fascinating molecular-level insight that comes from statistical molecular thermodynamics. Alright, well, that was a single example; I'd now like to expand and consider third-law entropies more generally. So we'll get to that next.
