Thermodynamics Part III
IV. Entropy and the Second Law
of Thermodynamics
Introduction
◦ We are accustomed to many irreversible processes: an egg is dropped onto a floor, a pizza is baked, a car is driven into a lamppost, etc.
◦ For reversible processes the system is in equilibrium
with its environment, while for irreversible
processes it is not.
◦ Why are these processes irreversible?
◦ The idea of entropy provides a mathematical way to
encode the intuitive notion of which processes are
impossible.
◦ Entropy: A measure of the molecular “disorder”,
or randomness, of a system
◦ The term was coined in 1865 by Rudolf Clausius
from Greek en- = in + trope = a turning (point).
◦ Concept developed in response to the observation
that a certain amount of functional energy released
from combustion reactions is always lost to
dissipation or friction and is thus not transformed
into useful work.
What is entropy?
There are two ways to define the entropy of a system:
(1) Microscopic perspective: by counting the ways in which the atoms or molecules that make up the system can be arranged.
(2) Macroscopic perspective: in terms of the system’s temperature and the energy the system gains or loses.
The “disorder” measured by entropy reflects the number of states that a system can take on.
Consider two systems: one with four atoms and one with eight atoms.
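As a minimal sketch (not from the slides), suppose each atom can independently sit in either half of a box. A system of N atoms then has 2^N possible microstates, so doubling the number of atoms from four to eight multiplies the number of states by 16:

```python
# Each of N atoms can sit in the left or right half of the box,
# so the number of distinct microstates is 2**N.
def microstate_count(n_atoms: int) -> int:
    return 2 ** n_atoms

print(microstate_count(4))  # 16 microstates for the 4-atom system
print(microstate_count(8))  # 256 microstates for the 8-atom system
```

The exponential growth in the number of states with system size is what makes entropy an extensive quantity.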
A given configuration can be achieved in a number of different ways. We call these different arrangements of the molecules microstates.
The total number of ways in which we can select all six molecules is the product of these independent ways, or 6 × 5 × 4 × 3 × 2 × 1 = 6! = 720.
However, because the molecules are indistinguishable, these 720 arrangements are not all different. The multiplicity W of a configuration (n1, n2) is the number of microstates that correspond to that configuration:

W = N!/(n1! n2!)
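The multiplicity formula can be checked directly by counting. The sketch below (my own illustration, not from the slides) tabulates W for every configuration of the six-molecule box:

```python
from math import factorial

# Multiplicity W of a configuration (n1, n2): the number of ways to
# choose which n1 of the N indistinguishable molecules occupy the
# left half of the box.  W = N! / (n1! * n2!)
def multiplicity(n1: int, n2: int) -> int:
    return factorial(n1 + n2) // (factorial(n1) * factorial(n2))

N = 6
table = {(n1, N - n1): multiplicity(n1, N - n1) for n1 in range(N + 1)}
print(table)                # W = 1, 6, 15, 20, 15, 6, 1
print(sum(table.values()))  # 64 microstates in total (= 2**6)
```

Summing the multiplicities over all configurations recovers the 2^6 = 64 equally likely microstates of the box.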
Configuration IV is the most probable configuration, with probability 20/64 = 0.313 (the system is in configuration IV 31.3% of the time).
The molecules are evenly divided between the two halves of the box, because that is what we expect at thermal equilibrium.
Configurations I and VII (all the molecules in one half of the box) are the least probable, with probability 1/64 = 0.016, or 1.6%.
There is some probability, however small, of finding all six molecules clustered in one half of the box, with the other half empty.
For large values of N, nearly all the microstates belong to the configuration in which the molecules are divided equally between the two halves of the box, and the probabilities of configurations I and VII become vanishingly small.
In 1877, Boltzmann derived a relationship between the entropy S of a configuration of a gas and the multiplicity W of that configuration:

S = k ln W

where k = 1.38 × 10^-23 J/K is the Boltzmann constant.
Exercise
Suppose that there are 100 indistinguishable
molecules in the previous box. How many
microstates are associated with the
configuration n1 = 50 and n2 = 50, and with
the configuration n1 = 100 and n2 = 0?
Interpret the results in terms of the relative probabilities of the two configurations.
Exercise
How many microstates are associated with the configuration n1 = 50 and n2 = 50, and with the configuration n1 = 100 and n2 = 0?
Configuration (50, 50): W = 100!/(50! 50!) ≈ 1.01 × 10^29 microstates.
Configuration (100, 0): W = 100!/(100! 0!) = 1 microstate.
The (50, 50) configuration is therefore about 10^29 times more probable than (100, 0): in practice the system is never found with all 100 molecules in one half of the box.
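The two multiplicities can be computed exactly with Python's integer arithmetic (a quick check, not part of the original slides):

```python
from math import comb

# Multiplicities for N = 100 molecules split between the two halves.
W_50_50 = comb(100, 50)   # number of ways to pick the 50 left-half molecules
W_100_0 = comb(100, 100)  # exactly one way: every molecule on the left

print(f"W(50,50) ≈ {W_50_50:.2e}")   # about 1.01e+29
print(f"W(100,0) = {W_100_0}")       # 1
print(f"ratio    ≈ {W_50_50 / W_100_0:.1e}")
```

Because every microstate is equally likely, the ratio of the multiplicities is also the ratio of the probabilities of the two configurations.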
What is entropy? (macroscopic perspective)
Entropy: the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, the amount of entropy is also a measure of the molecular disorder, or randomness, of a system.
Change in entropy:
An entropy change (in J/K) depends not only on the energy transferred as heat but also on the temperature at which the transfer takes place:

ΔS = Sf − Si = ∫ dQ/T (integrated from the initial to the final state)

When the temperature change is small relative to the temperature (in kelvins) before and after the process:

ΔS ≈ Q/Tavg

Entropy is a state property (a state function): it depends only on the initial and final states and not at all on the way the system went from one state to the other.
To find the entropy change for an irreversible process, replace that process with any reversible process that connects the same initial and final states.
Entropy as a State Function
Consider a reversible process that carries an ideal gas through a series of equilibrium states.
First law in differential form: dEint = dQ − dW. With dW = p dV and dEint = nCV dT, dividing through by T and integrating gives

ΔS = Sf − Si = nR ln(Vf/Vi) + nCV ln(Tf/Ti)

We have assumed that S (like V, p, and Eint) is a state function; this can be deduced only by experiment.
The change in entropy between the initial and final states of an ideal gas does not depend on how the gas changes between the two states.
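Path independence can be checked numerically. The sketch below (my own construction, assuming a monatomic ideal gas) integrates dQ/T along two different reversible paths between the same pair of states and compares both results with the formula nR ln(Vf/Vi) + nCV ln(Tf/Ti):

```python
import math

R = 8.314        # J/(mol·K)
n = 1.0          # mol
Cv = 1.5 * R     # molar heat capacity of a monatomic ideal gas

T1, V1 = 300.0, 1.0e-3   # initial state (K, m^3)
T2, V2 = 450.0, 2.0e-3   # final state

def dS_isothermal(Va, Vb, steps=20000):
    # At constant T: dQ = dW = n R T dV / V, so dS = dQ/T = n R dV / V
    s, dV, V = 0.0, (Vb - Va) / steps, Va
    for _ in range(steps):
        s += n * R * dV / (V + dV / 2)   # midpoint rule
        V += dV
    return s

def dS_isochoric(Ta, Tb, steps=20000):
    # At constant V: dQ = n Cv dT, so dS = n Cv dT / T
    s, dT, T = 0.0, (Tb - Ta) / steps, Ta
    for _ in range(steps):
        s += n * Cv * dT / (T + dT / 2)  # midpoint rule
        T += dT
    return s

path_A = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)  # expand first, then heat
path_B = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)  # heat first, then expand
analytic = n * R * math.log(V2 / V1) + n * Cv * math.log(T2 / T1)

print(path_A, path_B, analytic)  # all three agree: ΔS is path-independent
```

Both orderings of the two reversible steps give the same ΔS, which is what the state-function property asserts.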
Exercise
Two identical copper blocks of mass m = 1.5
kg: block L at TiL = 60°C and block R at TiR =
20°C. The blocks are in a thermally insulated
box and are separated by an insulating shutter.
When we lift the shutter, the blocks eventually
come to the equilibrium temperature Tf = 40°C.
What is the net entropy change of the two-block
system during this irreversible process? The
specific heat of copper is 386 J/kg·K.
Exercise
TiL = 60°C = 333 K, TiR = 20°C = 293 K, Tf = 40°C = 313 K, c = 386 J/kg·K. ΔS?
Replace the irreversible process with two reversible ones in which each block is brought slowly to Tf. For each block dQ = mc dT, so ΔS = mc ln(Tf/Ti).
Block L: ΔSL = (1.5 kg)(386 J/kg·K) ln(313/333) ≈ −35.86 J/K
Block R: ΔSR = (1.5 kg)(386 J/kg·K) ln(313/293) ≈ +38.23 J/K
Net: ΔS = ΔSL + ΔSR ≈ +2.4 J/K > 0, as the second law requires for an irreversible process in a closed system.
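The two-block calculation is easy to verify numerically (a sketch using the numbers from the exercise):

```python
import math

# Entropy change of each copper block via a reversible replacement
# process: dQ = m c dT, so ΔS = m c ln(Tf / Ti).
m, c = 1.5, 386.0                        # kg, J/(kg·K)
T_iL, T_iR, T_f = 333.0, 293.0, 313.0    # kelvins (60, 20, 40 °C)

dS_L = m * c * math.log(T_f / T_iL)   # block L cools: ΔS < 0
dS_R = m * c * math.log(T_f / T_iR)   # block R warms: ΔS > 0
dS_net = dS_L + dS_R

print(f"ΔS_L = {dS_L:+.2f} J/K")      # ≈ -35.86 J/K
print(f"ΔS_R = {dS_R:+.2f} J/K")      # ≈ +38.23 J/K
print(f"ΔS   = {dS_net:+.2f} J/K")    # ≈ +2.4 J/K: net entropy increased
```

Note that the warming block gains slightly more entropy than the cooling block loses, because its heat transfer happens at lower temperatures.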
Exercise
Suppose 1.0 mol of nitrogen gas is
confined to the left side of the
container. You open the stopcock, and
the volume of the gas doubles. What is
the entropy change of the gas for this
irreversible process? Treat the gas as
ideal.
Exercise
n = 1.0 mol, V2 = 2V1. ΔS?
(1) We can determine the entropy change for the irreversible process by calculating it for a reversible process that produces the same change in volume.
(2) The temperature of the gas does not change in the free expansion, so the reversible replacement process should be an isothermal expansion.
For an isothermal expansion of an ideal gas, Q = W = nRT ln(V2/V1), so

ΔS = Q/T = nR ln(V2/V1) = (1.0 mol)(8.31 J/mol·K) ln 2 ≈ +5.76 J/K
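A one-line check of the free-expansion result (using the same values as the exercise):

```python
import math

# Free expansion of an ideal gas: T is unchanged, so replace the
# irreversible process with a reversible isothermal expansion.
# ΔS = n R ln(V2 / V1), here with V2 = 2 V1.
n, R = 1.0, 8.314
dS = n * R * math.log(2.0)

print(f"ΔS = {dS:.2f} J/K")   # ≈ +5.76 J/K
```

The entropy of the gas increases even though no heat was actually transferred, because entropy depends only on the end states.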
Second law of thermodynamics
Consider a closed system consisting of a thermal reservoir and an ideal gas in a piston, taken through a reversible isothermal process at temperature T.
The entropy change of the gas (which loses |Q|): ΔSgas = −|Q|/T
The entropy change of the reservoir (which gains |Q|): ΔSres = +|Q|/T
The net entropy change of this closed system is zero, as it must be for a reversible process.
If a process occurs in a closed system, the entropy of the system increases for irreversible processes and remains constant for reversible processes. It never decreases.
The entropy postulate: if a process occurs in a closed system, the entropy of the system increases for irreversible processes and remains constant for reversible processes. It never decreases.
The second law of thermodynamics: ΔS ≥ 0, where the greater-than sign applies to irreversible processes and the equals sign to reversible processes.
If an engine is to do work on a sustained basis, the working substance must operate in a cycle: it must pass through a closed series of thermodynamic processes, called strokes, returning again and again to each state of the cycle. In an automobile engine the working substance is a gasoline–air mixture.
Entropy in real world: Engines — Carnot engine
In an ideal engine, all processes are reversible and no wasteful energy transfers occur due to, say, friction and turbulence.
The two black arrowheads on the central loop suggest the working substance operating in a cycle, as if on a p-V plot.
Work done by a Carnot engine: for a closed cycle ΔEint = 0, so the first law of thermodynamics gives

W = |QH| − |QL|

Entropy changes: there are only two reversible energy transfers as heat, and thus two changes in the entropy of the working substance, one at temperature TH and one at TL:

ΔS = |QH|/TH − |QL|/TL

Because entropy is a state function, ΔS = 0 for a complete cycle, and therefore

|QH|/TH = |QL|/TL
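Combining W = |QH| − |QL| with |QH|/TH = |QL|/TL gives the efficiency of a Carnot engine, ε = W/|QH| = 1 − TL/TH. A minimal sketch (the function name is my own):

```python
def carnot_efficiency(T_H: float, T_L: float) -> float:
    # ε = W / |Q_H| = 1 - T_L / T_H, temperatures in kelvins.
    # This is the maximum efficiency of any engine operating
    # between reservoirs at T_H and T_L.
    return 1.0 - T_L / T_H

print(carnot_efficiency(850.0, 300.0))  # ≈ 0.647, i.e. about 65%
```

The efficiency depends only on the reservoir temperatures: the closer TL is to TH, the less work can be extracted per unit of heat taken in.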
Entropy in real world: Engines — Stirling engine
Exercise
Imagine a Carnot engine that operates between the
temperatures TH = 850 K and TL = 300 K. The engine
performs 1200 J of work each cycle, which takes 0.25 s.
(a) What is the efficiency of this engine?
(b) What is the average power of this engine?
(c) How much energy |QH| is extracted as heat from the
high-temperature reservoir every cycle?
(d) How much energy |QL| is delivered as heat to the low-temperature reservoir every cycle?
(e) By how much does the entropy of the working substance change as a result of the energy transferred to it from the high-temperature reservoir? And from it to the low-temperature reservoir?
Exercise
TH = 850 K, TL = 300 K, W = 1200 J per cycle, one cycle every 0.25 s.
(a) ε = 1 − TL/TH = 1 − 300/850 ≈ 0.647, or about 65%
(b) P = W/t = (1200 J)/(0.25 s) = 4800 W = 4.8 kW
(c) |QH| = W/ε = (1200 J)/0.647 ≈ 1855 J
(d) |QL| = |QH| − W = 1855 J − 1200 J ≈ 655 J
(e) ΔSH = +|QH|/TH = (1855 J)/(850 K) ≈ +2.2 J/K and ΔSL = −|QL|/TL = −(655 J)/(300 K) ≈ −2.2 J/K, so the net entropy change of the working substance over a full cycle is zero, as required for a state function.
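All five parts of the Carnot-engine exercise follow from the relations above; the sketch below just chains them together:

```python
# Carnot engine between T_H = 850 K and T_L = 300 K,
# doing W = 1200 J of work per 0.25 s cycle.
T_H, T_L = 850.0, 300.0
W, t = 1200.0, 0.25

eff = 1.0 - T_L / T_H    # (a) efficiency ≈ 0.647
P = W / t                # (b) average power = 4800 W
Q_H = W / eff            # (c) heat in per cycle ≈ 1855 J
Q_L = Q_H - W            # (d) heat out per cycle ≈ 655 J
dS_H = +Q_H / T_H        # (e) entropy gained from hot reservoir ≈ +2.2 J/K
dS_L = -Q_L / T_L        #     entropy given to cold reservoir  ≈ -2.2 J/K

print(eff, P, Q_H, Q_L, dS_H, dS_L)
```

The two entropy changes cancel, confirming that ΔS = 0 over a complete cycle of the working substance.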
Efficiency of real engines
Here we prove that no real engine operating between two given temperatures TH and TL can have an efficiency greater than that of a Carnot engine.
Couple an engine X to a Carnot refrigerator, with the refrigerator’s stroke adjusted so that the work it requires per cycle is just equal to the work provided by engine X (WC = WX).
If εX > εC were true, then W/|Q′H| > W/|QH|, and therefore |QH| > |Q′H| (primed symbols refer to engine X).
Applying the first law to each device, |QH| − |Q′H| = |QL| − |Q′L| > 0: the engine–refrigerator combination would, with no net work input, transfer energy as heat from the low-temperature reservoir to the high-temperature reservoir. This is a perfect refrigerator, which the second law of thermodynamics forbids. Hence no real engine can have an efficiency greater than that of a Carnot engine operating between the same temperatures.