
The Second Law of Thermodynamics
Beki Kan
Learning Objectives
● Understand why the first law does not tell us the
direction of a spontaneous process
● Give examples of spontaneous processes
● Define order, disorder and entropy
● State the second law of thermodynamics in
different ways
● Know the relation between statistical distribution
and entropy
● Describe the relationship between entropy and
the ability to do work
References
● Physical Chemistry: With Applications to the
Life Sciences by D. Eisenberg
● Biological Thermodynamics by D.T. Haynie
● Biochemistry by Mathews and van Holde
● Physical Chemistry for the Life Sciences by
P. Atkins & J. de Paula
● Lehninger Principles of Biochemistry by
Nelson and Cox
Favored Direction for a Process
● We place an ice cube in a glass of water at room
temperature. It melts. Why doesn’t the rest of the water
freeze instead?
● We touch a lit match to a piece of paper. The paper
burns to carbon dioxide and water. Why can’t we mix
carbon dioxide and water to form paper?
● Why do chemical and physical processes have
thermodynamically favored directions?
⚪ Lowest energy states? True for the burning of paper, which
releases energy as heat.
⚪ But what about the melting of ice at room temperature, where
energy is absorbed?
Entropy; The Second Law of
Thermodynamics
● The first law of TD does not
provide a complete description of
natural processes.
● It simply says that energy is
conserved: heat absorbed by and
work done on a system are
stored as internal energy.
● But it does not tell us if a given
process is spontaneous, that is, if
it proceeds of its own accord
(without outside intervention), with
very high probability.
The Second Law
● Consider a system composed of two
metal blocks at different temperatures
(initial state: one Ag block at 75°C, the
other at 25°C). After a certain time t,
the temperatures of the two blocks are
equilibrated (final state: both Ag
blocks at 50°C).
● Heat flows from the high-temperature
block to the low-temperature block.
ΔUsystem = 0.
● However, ΔU would also be 0 for heat
flowing from the colder block to the
hotter block, making it hotter; i.e., this
flow would not violate the first law of
thermodynamics. The total energy of
the blocks would remain the same.
● But this never occurs!
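The entropy bookkeeping behind the two-block example can be sketched numerically. The amount of heat q is an illustrative assumption, and for simplicity each block is treated as staying near its initial temperature (a reasonable approximation for a small q):

```python
# Entropy change when a small amount of heat q flows from the
# hot Ag block (75 °C) to the cold one (25 °C).
q = 100.0       # J, heat transferred (assumed illustrative value)
T_hot = 348.0   # K (75 °C)
T_cold = 298.0  # K (25 °C)

dS_hot = -q / T_hot    # hot block loses heat: its entropy decreases
dS_cold = q / T_cold   # cold block gains heat: its entropy increases
dS_total = dS_hot + dS_cold

print(dS_total)  # positive (~0.048 J/K): this direction is spontaneous
```

Running the flow in reverse simply flips both signs, giving a negative total entropy change, which is why heat never flows spontaneously from cold to hot even though the first law would allow it.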
Spontaneous Processes
● An ice cube in a glass of water: the ice melts
(heat flows from the water into the ice).
ΔUice > 0, ΔHice > 0.
Spontaneous Processes, Examples
● A deck of cards in order becomes shuffled.
ΔUcards = 0.
● Sucrose placed in water dissolves; heat is
given off. ΔUsystem < 0.
● Perfume diffuses into a vacuum when the
partition is removed. ΔUgas = 0.
Common Features of Previous Examples
⚪ Each has a direction of spontaneous change.
⚪ None of the reverse processes are forbidden by the
first law of TD (so the 1st Law is inadequate in
dealing w/ the direction of change).
⚪ Each of the spontaneous processes is accompanied
by an increase in disorder (whether ΔU = 0, > 0, or < 0):
● Melting of ice at 25°C
● Randomization of a deck of ordered cards as they are
shuffled
● Diffusion of volatile perfume
● Dissolution of sugar in water
Entropy is a Measure of Disorder
● Indeed, it has been found that all spontaneous
processes in isolated systems are accompanied
by increasing disorder.
● A new property, which will indicate spontaneous
changes, is needed.
● And this must be a measure of disorder.
● Since heat is a less well organized form of
energy than work, the new property has some
connection with heat.
Entropy is a Measure of Disorder
● In 1850, Rudolf Clausius found a property of
state that is related to spontaneous change and
he called it entropy, S.
⚪ The word entropy comes from Greek: “en” means “in”
and “trope” means transforming or giving direction.
Literally, it means a change within.
● In time, it was realized that S is a measure of
disorder.
● Entropy is defined as S = qrev / T, with units of
J/K or cal/K.
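The definition can be applied directly to the melting-ice example. A minimal sketch, using the standard literature value of about 6.01 kJ/mol for the molar heat of fusion of ice:

```python
# Entropy change for melting one mole of ice at its melting point,
# using the definition dS = q_rev / T.
q_rev = 6010.0   # J/mol, heat absorbed reversibly on melting
T = 273.15       # K, melting point of ice

dS = q_rev / T
print(round(dS, 1))  # ≈ 22.0 J/(K·mol): entropy increases on melting
```

The positive sign captures what the slides argue qualitatively: melting converts ordered ice into more disordered liquid water.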
Statements about the Second Law
● No process is possible in which heat is transferred
spontaneously from a colder to a hotter body (unless
there is a change in the surroundings).
● No process is possible in which heat is completely
converted into work.
● A system will always move in the direction that maximizes
disorder; i.e., systems tend towards maximum disorder.
⚪ In isolated systems or enclosures (system + surroundings), all
spontaneous processes are accompanied by increasing
disorder.
⚪ i.e., entropy reaches a maximum at equilibrium in an isolated
system.
Isolated Enclosure
● System plus surroundings in an isolated enclosure:
dQ = 0, dW = 0, dn = 0 (no heat, no work, no
matter exchange with the outside).
● For a small amount of heat dQrev delivered to the system in a
reversible process at temperature T, dS = dQrev/T.
● For a spontaneous process, the total entropy change of an
isolated enclosure is POSITIVE:
ΔSsys + ΔSsurr > 0
Comments about the Second Law
● The law is equivalent to stating that entropy is a property of
the system.
● Its dimensions are energy/temperature (J/K); when
multiplied by temperature, it has the dimensions of energy.
● The increase in S for spontaneous processes holds only for
isolated enclosures.
● Although the law is mathematically simple, virtually
everybody finds it difficult to grasp conceptually at first.
⚪ Common complaints: the definition of dS seems arbitrary;
⚪ it is not obvious why S should increase in a spontaneous
process.
● Working through examples is a good way to convince
yourselves that S increases.
What is Disorder?
● Ordered system: a system
in which a number of
objects are positioned in a
completely regular and
predictable pattern.
Ex. Atoms in a perfect
crystal; automobiles in a
perfect line
● Disordered system: contains
objects that are randomly
situated, without any
obvious pattern.
Ex. Atoms in a gas
Entropy: A Measure of the Randomness
or Disorder in a System. Examples
Low Entropy                      High Entropy
Ice at 0°C                       Water at 0°C
A diamond at 0 K                 Carbon vapor at 1,000,000 K
A protein in its regular,        The same protein in an unfolded,
  native structure                 random-coil state
A sonnet from Shakespeare        A random string of letters
A bank manager's desk            A professor's desk
The Advantages of Being Disordered
● Case 1: The teakettle and the
randomization of heat
● Case 2: The oxidation of glucose
● Case 3: Information and entropy
● Read from Box 1-3, Lehninger Principles of
Biochemistry by Nelson and Cox
What is Disorder?
● Disorder reflects the number of locations a
particle can possibly occupy.
⚪ Or: the less precisely we can specify the state of a
system, the more disordered it is.
● Why does a positive change in S always
accompany a spontaneous process in an
isolated enclosure? It is only through this criterion
that entropy can “give direction” to chemical and
physical changes.
⚪ States that can be achieved in more ways are more
probable and have greater entropy.
Example: Simple Idealized System
● An ideal gas is confined by a partition to the left half (V1) of an
insulated box; the right half (V2) is a vacuum (P = 0).
● Insulation: Q = 0; V constant; no work exchange with the surroundings
(work done against zero pressure is zero) ∴ ΔU = 0.
● Changing the V of the gas does not change the V of the system, because
the system boundary is the insulated box.
Simple Idealized System. Why Does the
Gas Expand?
● Imagine removing the (massless) partition, while still doing
no work on the system (especially idealized to avoid
complicating the problem).
● The gas will expand to fill the whole box volume.
● Heat, work and U do not help answer the question.
● The gas is ideal and its energy is constant, so the
temperature does not change (E = 3/2 RT per mole).
● The reason why the gas expands is that entropy
increases when its volume increases: the system is
more disordered.
● Before expansion, all molecules are in the left half; after
expansion, each molecule is free to move about in the
final volume.
Substates of the System
● What does disorder mean in this context?
● Let us consider two molecules of gas, A and B.
● There are four possible substates, all with equal energy and equally
probable:
  [AB |    ]   [A | B]   [B | A]   [   | AB]
● Doubling the V has increased the number of substates by a factor of
four: a larger number of possibilities.
● The gas expands because there are more substates that
correspond to occupation of the total volume than to occupation of
the original volume.
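The substate counting for the two-molecule example can be sketched by enumeration; the "L"/"R" labels for the left and right halves of the box are an illustrative convention:

```python
from itertools import product

# Each molecule independently occupies the left ("L") or right ("R")
# half of the box, so N molecules give 2**N equally probable substates.
def substates(n):
    return list(product("LR", repeat=n))

two = substates(2)
print(two)       # [('L','L'), ('L','R'), ('R','L'), ('R','R')]
print(len(two))  # 4: doubling the volume quadrupled the substates

# Only one of the 2**N substates has every molecule on the left half,
# so the probability of that configuration shrinks fast with N:
n = 10
p_all_left = 1 / 2**n
print(p_all_left)  # 1/1024 already for just 10 molecules
```

Explicit enumeration obviously stops being feasible long before N reaches molecular scales, which is exactly why the next slides switch to the logarithm of the count instead.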
Substates of the System
● In general, the less precisely we can specify the state of the
system, the more disordered it is.
● The more disordered of two configurations is the one which
can be obtained in more ways.
● For N particles we will have 2^N substates. The probability of
finding all particles on one side is ¼ for 2 particles.
● For 1 mole of ideal gas, N = Avogadro's number, NA = 6.02 x
10^23, and with 2 choices per particle there are 2^NA possible
substates.
● The probability of finding them all on one side of the box is:
⚪ 1 / 2^NA, on the order of 1 / 10^(1.8 x 10^23)
Substates of the System
● To express 2^NA as a power of ten, write 2^NA = 10^x. Then:
  NA ln 2 = x ln 10
  x = (6.02 x 10^23)(0.693) / 2.303 ≈ 1.8 x 10^23
● So 2^NA ≈ 10^(1.8 x 10^23).
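The same arithmetic, done directly:

```python
import math

# Solve 2**NA = 10**x for x, i.e. x = NA * ln 2 / ln 10.
NA = 6.02e23  # Avogadro's number
x = NA * math.log(2) / math.log(10)
print(f"{x:.2e}")  # ≈ 1.81e23
```

So the probability of spontaneously finding a mole of gas on one side of the box is 1 in 10^(1.8 × 10^23): not forbidden by the first law, merely unimaginably improbable.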
Link Between Statistical Distribution and
Entropy
● Boltzmann obtained a quantitative interpretation
of entropy. He stated that entropy is proportional
to the logarithm of the number of microstates
(substates: each different arrangement of the particles):
⚪ S = k ln W
  k: Boltzmann constant (gas constant / Avogadro's number)
  W: total number of choices (substates available)
● This equation gives the link between statistical
distribution and entropy.
● Maximizing S is equivalent to maximizing W
(the number of substates).
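The Boltzmann formula turns the substate counting into a concrete number. A sketch for the gas-expansion example: when the volume doubles, each of the N molecules gains a factor of 2 in available substates, so W doubles N times and ΔS = k ln(W2/W1) = N k ln 2.

```python
import math

# Entropy increase when 1 mole of ideal gas doubles its volume,
# from S = k ln W with W2/W1 = 2**NA.
k = 1.380649e-23  # J/K, Boltzmann constant
NA = 6.022e23     # Avogadro's number

dS = NA * k * math.log(2)  # = R ln 2 per mole
print(round(dS, 2))  # ≈ 5.76 J/K
```

Note that the astronomically large count 2^NA collapses, through the logarithm, into the modest laboratory-scale quantity R ln 2: this is exactly why the logarithm appears in Boltzmann's definition.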
Statistical Distribution and Entropy
● There will always be many fewer ways of putting
a large number of molecules into an orderly structure
than into a disorderly one.
● Therefore, the entropy of an ordered state is
lower than that of a disordered state of the same
system.
● The minimal value of entropy is found only for a
perfect crystal at the absolute zero of temperature
(−273 °C = 0 K).
Entropy is a Complex Subject
● Entropy is a complex subject, especially
since it can be defined both
⚪ as a logarithmic function of probability
⚪ as a ratio of transferred heat to temperature
Entropy of the Universe
● The universe is an isolated enclosure. In
nature, changes are actually irreversible.
● Any spontaneous change in the universe is
accompanied with an increase in disorder.
● The universe is gradually tending toward a
state of maximum entropy. In this state, there is
no way to use any of the energy of the
molecules.
● So, the entropy function can be thought of as a
measure of the loss of our ability to make
use of energy.
Entropy of the Universe
● The second law forbids a decrease in the entropy of
the universe, but entropy may decrease in some local
system, as long as there is a “more than compensating”
entropy increase in the surroundings of the system:
● ΔSsystem + ΔSsurroundings = ΔSuniverse > 0
⚪ Example: a green plant cell converts CO2 gas
(random molecular positions in space) into glucose
(atoms highly ordered).
⚪ The S of the atoms has decreased. This ordering,
however, requires sunlight, whose origin is in
explosive reactions in the sun (which generate enormous
amounts of entropy, moving material in the sun into
greatly randomized configurations).
Entropy of the Universe
● In an irreversible process, the S of the
universe increases.
● In a reversible process, the S of the
universe remains constant.
● Real processes are nearly always
irreversible; this is true for biological processes.
● “Entropic Doom”
Entropy of the Universe: Another Example
● Glycolysis and aerobic respiration
⚪ Energy liberated by glucose metabolism is used
to bind Pi to ADP, decreasing entropy by
covalently fixing atoms in space.
⚪ At the same time, the S in the region
surrounding the ATP is increased by the
conversion of glucose to CO2 gas and by the
metabolic heat released into those surroundings,
increasing molecular motion.
Entropy and Life
● Living organisms exist in a dynamic state, never
at equilibrium with their surroundings.
● Living organisms contain finite gradients, e.g.
non-equilibrium of pressure (in lungs), electrical
potential (in cells), concentration (in cells, tissues).
When the organism dies, these gradients
disappear. The spontaneous disappearance of finite
non-equilibrium states creates an entropy increase.
● Entropy of living organism < entropy of decaying
organism.
Entropy and Life
● Living organisms are highly ordered, nonrandom
structures, rich in information and thus, entropy-poor.
● To keep itself ordered (low entropy) an organism feeds
on low entropy foods such as glucose and converts
these to a high entropy state (CO2, H2O). It uses the
energy released for chemical, mechanical, electrical
work. Work is used to maintain the gradients that are
necessary for living beings.
⚪ The entropy decrease in living organisms is compensated by
an increase in entropy in the surroundings.
⚪ Non-equilibrium states, such as life can be kept in low entropy
only in the presence of an energy source.
Distinguishing Characteristics of Living
Matter
● Constant renewal of highly ordered structure, often
accompanied by increase in the complexity of that
structure
● Creation and perpetuation of order out of chaotic
surroundings is unique to life
● Local creation of order and complexity in matter must be
paid for by the continual expenditure of energy
⚪ from sunlight (plants)
⚪ from foodstuffs (animals)
● No living organism can be isolated from its surroundings.
Lehninger Principles of Biochemistry
by Nelson and Cox

You might also like