This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


4.9 (175 ratings)

- 5 stars: 159 ratings
- 4 stars: 15 ratings
- 3 stars: 1 rating

Sep 15, 2018

This is a very well designed thermodynamics course. I'm a Chemical Engineer and I am glad to have a new point of view of my daily routine. 100% recommended to my colleagues.

Aug 30, 2018

Interesting and challenging. It brings together some Stats, Chemistry and Physics. The homework was thought provoking and informative. I learned a great deal.

From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

#### Dr. Christopher J. Cramer

Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

Let's try computing some entropy changes. We'll look at two systems we've seen before, and a new one as well. So, let me start with the ideal gas expanding into a vacuum. That is not a reversible process. But because entropy is a state function, the change in entropy does not depend on the path. And so, if we're interested in the entropy change for this irreversible process, we can compute it if we can compute the entropy change for the reversible process. That is, if I look at the integral of delq reversible over T from state point one to state point two, that will be the entropy change for the irreversible process as well, because it only depends on the state points, one and two.

So, what is the change in heat, delq rev? It's equal to dU minus the change in work; that's the first law.

Now, this process was isothermal; we had isolated the system from the surroundings when we talked about the irreversible process. The reversible case doesn't have to be isolated; in fact, we'll see it can't be. But that's okay; what we want to maintain, however, is that dU is equal to zero. So, given that the change in internal energy is equal to zero for an isothermal process, because an ideal gas's energy depends only on temperature, the reversible heat change must be equal to minus the reversible work. And we know what the reversible work is for an ideal gas: minus nRT over V, dV.

And so, when I go and plug all that in: I want to know the integral from one to two of delq rev over T. Well, that's minus the integral from one to two of delw rev over T. Here's my expression, so the Ts drop out. So, when I talk about going from one to two, I'm actually talking about the change in volume; that's what's left in this integral, dV over V, and I get nR log V2 over V1.

So, we've actually seen that expression before, for the expansion of an ideal gas. And you might think to yourself, well, what was the difference between the irreversible process and the reversible process? This is often a point of confusion.
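As a quick numerical sketch of that result (the mole number and volumes below are assumed purely for illustration), the state-function expression ΔS = nR ln(V2/V1) can be evaluated directly:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def delta_S_expansion(n, V1, V2):
    """Entropy change for an ideal gas expanding from V1 to V2.
    Entropy is a state function, so this value applies to both the
    reversible and the irreversible (free-expansion) path."""
    return n * R * math.log(V2 / V1)

# Assumed illustrative values: 1 mol of gas doubling its volume.
dS = delta_S_expansion(n=1.0, V1=1.0, V2=2.0)
print(f"delta S = {dS:.3f} J/K")  # nR ln 2, about 5.763 J/K
```

Doubling the volume of one mole gives nR ln 2, a positive entropy change, as the lecture's expression predicts.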

And the difference is: what happened to the surroundings? So, in the reversible case, in order for the system to stay isothermal, heat had to be added to the system. Work is being done during the reversible expansion; the pressure is growing, growing, growing inside that other vessel as I'm allowing myself to expand into it. And so, to drive that expansion, if you will, heat must flow into the system. It's done infinitesimally slowly, so you could imagine that piston with sand on it again, for example, and that's why the external pressure is equal to the ideal gas pressure.

But, okay, that's an imaginable experiment. And so, if this was the reversible work, it's also the reversible heat; there's a sign change in there. So, the gas absorbs heat from the surroundings, and the entropy of the surroundings must decrease as a result, right? Because for the surroundings there is negative heat: it is giving up heat to the system, which is receiving it in a positive quantity. So, delS of the surroundings is minus q reversible over T. This is reversible, so it must be equal and opposite, and I get minus nR log V2 over V1.

So, the total change in entropy of the system plus the surroundings? Zero. Which is as it must be: it was a reversible process, so no change in entropy. No change in total entropy is always expected for a reversible process.

Focusing now on the irreversible case: when I open that stopcock all at once and it's isothermal, there is no external pressure. So the irreversible work is zero, and the system was isolated, so the irreversible heat flow is zero. Since there's no exchange of heat with the surroundings, there's no change in entropy for the surroundings. The surroundings doesn't even know what happened; that system was isolated from the surroundings.

And so the net entropy change, system plus surroundings? Well, for the system, entropy is a state function, and we already know what happened in the system: nR log V2 over V1. But for the surroundings, zero. So, for the universe, then, of system plus surroundings, there has been a net increase in entropy. And that's consistent with our statement of the second law: in the reversible process, there was no change; in the irreversible process, there was a net increase in universal entropy. So, total entropy increases, as expected for an irreversible process.
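The bookkeeping for the two paths can be summarized in a short sketch (the mole number and volumes are assumed for illustration): the system's entropy change is identical on both paths, but the surroundings differ.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)
n, V1, V2 = 1.0, 1.0, 2.0  # assumed illustrative values

# State function: same system entropy change for both paths.
dS_system = n * R * math.log(V2 / V1)

# Reversible isothermal expansion: the surroundings supply q_rev,
# so their entropy drops by exactly the system's gain.
dS_surr_rev = -dS_system

# Irreversible free expansion: isolated system, q = 0,
# so the surroundings are unchanged.
dS_surr_irrev = 0.0

print("reversible:   dS_universe =", dS_system + dS_surr_rev)    # zero
print("irreversible: dS_universe =", dS_system + dS_surr_irrev)  # nR ln 2 > 0
```

The universe's entropy change is zero for the reversible path and positive for the irreversible one, just as the second law requires.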

I will add, as a sort of technical note, that we used the irreversible heat to compute the change in entropy of the surroundings. I mean, in a sense it was zero, so maybe that doesn't seem very unnatural, but typically you can't do that. We get away with it in this case because there is zero work, and so, from a technical standpoint, that made the heat a state function; we didn't have to worry about whether it was reversible or irreversible. And the fact that it was zero is sort of a benefit as well. But that's, you know, something that's good to bear in mind if you're really looking at the nitty-gritty of the thermodynamics.

Okay, let's take a moment and I will let you answer a question on this front, and then we'll move on to consider entropy of mixing.

Let's look at the second example we've already considered, and that is two vessels with different gases that are then opened one to another. So, for each of the two gases (I've only got two in this instance, but I could generalize and have i different gases, all allowed to enter one another's volumes at a given point), what we've already worked out, again and again, is the entropy change as I expand from a given volume to a different volume. It is, for each individual gas, the number of moles of that gas, times R, times the log of the sum over all the volumes that are now accessible (maybe there are j different flasks that are all interconnected), divided by the original volume.

And, for purposes that'll become apparent in a moment, if I want to put a minus sign out front, I can put this sum in the bottom. So, all I've done is inverted the argument and hence changed the sign of the logarithm. However, for ideal gases all at the same temperature, the volume is actually proportional to the number of moles, and so I can replace volume, wherever it appears, with number of moles. So, I get that the change in entropy for a given gas i is minus the number of moles of i, times R, times the log of the number of moles of i divided by the total number of moles in the system. And that last ratio is sometimes referred to as the mole fraction. I'm going to indicate it by y sub i: the number of moles divided by the total number of moles, spoken as "mole fraction". So this entropy of mixing is minus R, n sub i, log y sub i, summed over the gases i.

So notice that the mole fraction, as long as there's more than one gas, is always less than 1. The log of a number less than 1 is always a negative number, and it's preceded by a negative sign. The number of moles is positive, R is positive, and so the entropy of mixing is always positive; mixing is always spontaneous.

Let's do one last example; here's one we haven't seen before. Imagine that I have two identical pieces of a metal bar, a common material. Maybe it's copper, maybe it's manganese, pick your favorite metal. They're at different temperatures, hot and cold, represented brilliantly here with a red piece and a blue piece. So, the red one is hot, it's at Th, and the blue one is cold, it's at Tc. And I bring them together, I touch them.

So, we know that heat will flow between those two systems.

It's a bit like our isolated compartments we did previously, but now this is more practical: two metal bars. There will be almost no change in the volume of these bars as long as I don't have, you know, thousands of degrees of change. So, the work change is negligible because delta V is negligible. And so the reversible heat change is equal to dU; it's equal to just dq, because, given no work, q becomes a state function. We can talk about reversible or irreversible, it doesn't matter; there's no work, so you don't have to worry about a path for q anymore. And it's equal to the heat capacity times dT.

So, just to make the math a little bit more convenient here, let's assume that the heat capacity, over the temperature range we're interested in, is independent of temperature.

It takes the same amount of heat to go up one degree, the next degree, and the next degree after that; it's just a constant heat capacity. So, in that case, delta q, the total heat transfer, is going to be the heat capacity times the difference between the initial temperature and the final temperature, whatever temperature I finish up at. Moreover, the heat lost has to be equal to the heat gained. So, CV times the hot temperature minus the final temperature must be equal to CV times the final temperature minus the cold temperature. (On the slide I wrote both sides as initial minus final, and I suspect there might be a sign error there; I'd want to switch one of them around, since one is heat in and one is heat out.) And that means that the final temperature here, and this is not a rocket science equation, has to be the average of the original two temperatures; that the average temperature is the final temperature is a conclusion that seems obvious.
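That heat balance can be checked numerically; the temperatures and heat capacity below are assumed values, not taken from the lecture.

```python
# Heat balance with equal, constant heat capacities:
#   Cv*(Th - Tf) = Cv*(Tf - Tc)  =>  Tf = (Th + Tc) / 2
Th, Tc = 400.0, 300.0  # assumed starting temperatures, K
Tf = (Th + Tc) / 2.0

Cv = 24.4  # assumed heat capacity, J/K (roughly one mole of copper)
# Heat lost by the hot bar equals heat gained by the cold bar:
assert abs(Cv * (Th - Tf) - Cv * (Tf - Tc)) < 1e-9
print(f"Tf = {Tf} K")  # 350.0 K, the average of the two temperatures
```

Whatever the common heat capacity is, it cancels out of the balance, which is why the answer is simply the average.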

So then, what is the change in entropy? Well, it is, for each of these bars, the integral from the initial temperature to the final temperature, irrespective of whether that initial temperature is the hot or the cold one, of dq over T. I don't have to write "reversible" here, because heat is a state function in this case; there's no work. So, I now replace dq with Cv dT, and I end up with Cv times the integral from initial to final temperature of dT over T. So that's Cv log of final temperature over initial temperature.

Okay, so let's keep working with that and think about it now rod by rod. So, what is this integral in the case of the cold rod? Well, the final temperature is Tc plus Th over two, the initial temperature is the cold temperature, Tc, and so delta S is Cv times the log of this quantity.

Now let's do the hot rod. Well, in the hot rod case (we don't mean a very fast car in this instance, by the way; that's an older use of "hot rod"), Tc plus Th over two is again the final temperature, and the original temperature is Th. So these two look the same, except one has a c in the bottom and one has an h in the bottom. And so the total entropy change is the sum of the two of these. A sum of logarithms is like a log of a product. And so, if I take the product of the numerators, I get a Tc plus Th quantity squared; if I take the product of the denominators, I get four Tc Th.

And I can ask now, will this be spontaneous? It will be spontaneous if delta S is greater than zero. Well, how can I establish that? Here's a little proof that the argument of the logarithm is greater than one; in order to prove that, I need the numerator to be greater than the denominator. If that's always true, then I have the log of a number greater than one, delta S will be greater than zero, and the heat flow will be spontaneous.

So how can I prove that? Here's a quick little proof. Consider Tc minus Th, quantity squared. Well, since that's a number squared, it's greater than zero, unless they start at the same temperature; but then I don't worry about heat flow. I expand this squared quantity: Tc squared minus twice the cross term plus Th squared is greater than zero. Add 4 Tc Th to both sides. That means the minus 2 Tc Th becomes plus 2 Tc Th, and the other side ends up with 4 Tc Th. But what is this side now? Oh look, it's Tc plus Th, all squared. So, I have just proven that the numerator, Tc plus Th quantity squared, is greater than the denominator, 4 Tc Th.
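Written out compactly, the little proof runs:

```latex
\begin{align*}
(T_c - T_h)^2 &> 0 \qquad (T_c \neq T_h)\\
T_c^2 - 2T_cT_h + T_h^2 &> 0\\
T_c^2 + 2T_cT_h + T_h^2 &> 4T_cT_h \qquad \text{(adding $4T_cT_h$ to both sides)}\\
(T_c + T_h)^2 &> 4T_cT_h
\end{align*}
```

So the argument of the logarithm, $(T_c + T_h)^2 / 4T_cT_h$, exceeds one whenever the bars start at different temperatures, and $\Delta S > 0$.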

So, indeed, as we would expect based on our life experience: yes, those two rods come to an equilibrium temperature that's halfway in between their original temperatures, and the heat flow that takes them there is spontaneous. Great.

Well, those have been three different systems where we've seen how we can use reversible processes to learn about the entropy change, even the entropy change associated with an irreversible process, knowing that there will then be a difference in the total entropy of the universe. But we know how to work with reversible changes, and that allows us to accomplish things and actually do practical computations.

We're going to go on and next consider the relationship of entropy to the partition function. Before we actually get to that, though, I think it is time to do another demonstration, and a particularly interesting one that seems, briefly, to violate the entropy of mixing. I'll call this demo the anti-entropy of mixing. I hope you enjoy it.

In this demonstration, we'll see that sometimes we can fight thermodynamics almost to a draw, even if we can never prevail against one of the three laws. On the bench here, I have a cylindrical flask, inside of which is another cylinder of only slightly smaller diameter. In between the two is a fluid that is clear and colorless. At one position along the cylinder, a line of dye has been added to the fluid. Do you see? What will happen if I turn the inner cylinder? Certainly we know that the entropy of mixing will favor distribution of the dye uniformly throughout the available volume. So, let's see what happens.

Sure enough, as I turn the cylinder, the dye distributes, and, by the time I've done about two revolutions, the color seems to be widely distributed throughout, even if not yet completely uniform, about as one might expect.

Now, what will happen if I reverse my turning of the inner cylinder?

We know that mixed substances cannot spontaneously separate, unless there is some kind of phase change.

So, the normal expectation would be for nothing obvious to happen, but let's try.

Wow. All the dye has returned, almost perfectly, to its original position. What's happened? Have we beaten entropy? The answer is no.

Although your eyes have been tricked into thinking it true. The trick that is involved is that the very thin layer of fluid between the two cylinders causes the flow in the liquid to be extremely uniform, with little exchange between adjacent small volume elements. The apparently homogeneous mixture that was created by our initial rotation of the inner cylinder was not actually homogeneous, but simply distributed the dye uniformly around the circumference, with the possibility of bringing it back through reversal of the motion.

If we were to measure carefully, we would see that there has been some broadening of the dye line, driven by entropy. And if we were to repeatedly cycle back and forth, with each step the mixing would become more complete, until indeed we did end up with a homogeneous mixture of the dye in the fluid.

So remember, the second law does allow for reversible processes having zero entropy change. While they're often hard to engineer in practice, they are not impossible, and this device is one example that illustrates a process that is very nearly reversible.