
Gibbs Paradox and the Higher Similarity – Higher Entropy Relationship
Shu-Kun Lin
There are three kinds of correlation of the entropy of mixing with similarity. The Gibbs paradox
statement, which has been regarded as a very fundamental assumption in statistical mechanics, says
that the entropy of mixing or assembling to form solid assemblages, liquid and gas mixtures or any
other analogous assemblages such as quantum states, decreases discontinuously with the increase in the
property similarity of the composing individuals. Most authors accept this relationship (e.g. [1]).
Some authors revised the Gibbs paradox statement and argued that the entropy of mixing decreases
continuously with the increase in the property similarity of the individual components [2]. A higher
similarity – higher entropy relationship and a new theory have been constructed: entropy of mixing or
assembling increases continuously with the increase in the similarity. Similarity Z can be easily
understood when two items A and B are compared: if A and B are distinguishable (minimal similarity),
Z = 0. If they are indistinguishable (maximal similarity), Z = 1.
We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by
thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I
actively published on this topic [1-4]. During this decade, the globalized Information Society has been
developing very quickly based on the Internet and the term "information" is widely used, but what is
information? What is its relationship with entropy and other concepts like symmetry, distinguishability
and stability? What is the situation of entropy research in general? As the Editor-in-Chief of Entropy, I
feel it is time to offer some comments, present my own opinions in this matter and point out a major
flaw in related studies.
Definition of Information
We are interested in the definition of information in the context of information theory. It is a surprise
that a clear definition of the concept of "information" cannot be found in information theory
textbooks. "Entropy as a measure of information" is confusing. I would like to propose a simple
definition of information:
Information (I) is the amount of the data after data compression.
If the total amount of data is L, entropy (S) in information theory is defined as information loss, L = S
+ I. Let us consider a 100 GB hard disk as an example: L = 100 GB. A formatted hard disk will have S
= 100 GB and I = 0. Similar examples for defining information as the amount of data after compression
are given in [5]. Based on this definition of information and the definition that (information theory)
entropy is expressed as information loss, S = L − I, or in certain cases when the absolute values are
unknown, ΔS = ΔL − ΔI, I was able to propose three laws of information theory [5]:
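The definition S = L − I can be made operational with any lossless compressor. The sketch below uses zlib as a stand-in for "data compression" in the definition; it mirrors the formatted-hard-disk example, where a blank (all-zero) region has I ≈ 0 and S ≈ L.

```python
import zlib

def information(data: bytes) -> int:
    """I: the amount of data remaining after lossless compression
    (an operational stand-in for the definition in the text; capped at L
    because a compressor can slightly inflate incompressible data)."""
    return min(len(data), len(zlib.compress(data, 9)))

def entropy(data: bytes) -> int:
    """S = L - I: entropy as information loss, per the definition above."""
    return len(data) - information(data)

# Analogue of the formatted 100 GB disk, scaled down to 100 kB of zeros:
blank = bytes(100_000)   # L = 100,000 bytes
print(information(blank))  # tiny: all-zero data compresses almost completely
print(entropy(blank))      # nearly L: maximal entropy, no information
```

Under this convention, uniform data (a formatted disk) is all entropy and no information, which is what the second and third laws below require of equilibrium and perfectly symmetric states.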
The first law of information theory: the total amount of data L (the sum of entropy and information, L =
S + I) of an isolated system remains unchanged.
The second law of information theory: Information (I) of an isolated system decreases to a minimum at
equilibrium.
The third law of information theory: For a solid structure of perfect symmetry (e.g., a perfect crystal),
the information I is zero and the (information theory) entropy (called by me static entropy for the solid
state) S is at the maximum.
If entropy change is information loss, ΔS = −ΔI, the conservation of L can be very easily satisfied, ΔL
= ΔS + ΔI = 0. Another form of the second law of information theory is: the entropy S of the universe
tends toward a maximum. The second law given here can be taken as a more general expression of the
Curie-Rosen symmetry principle [5,6]. The third law given here in the context of information theory is
a reflection of the fact that symmetric solid structures are the most stable ones.
Another reason to discuss structural stability and process spontaneity in the field of information theory
is that, to the dismay of many thermodynamicists or physical chemists, not all chemical processes
are related to energy minimization and some processes may have nothing to do with energy; examples
are some of the complicated processes of molecular recognition or very simple mixing processes where
the process spontaneity can be elegantly and pertinently considered only in information theory. By
(information theory) entropy (also sometimes known as information theory entropy, informational
entropy, information entropy, or Shannon entropy), we mean a dimensionless logarithmic function
in information theory. The so-called static entropy [5] as an (information theory) entropy is a
nonthermodynamic entropy. Unlike thermodynamic entropy, which is also a special kind of
(information theory) entropy, other kinds of entropy are not functions of temperature T and they are not
necessarily related to energy.
Let us consider mixing processes to clarify the difference between thermodynamic entropy and the
other kinds of (information theory) entropy.
Gibbs Paradox
Ten years ago, Edwin Thompson Jaynes (5 July 1922 – 30 April 1998), a physicist well-known for his
contributions to information theory, was exchanging e-mails with me about the Gibbs paradox. In his
paper [7] he had complained that Gibbs's 1902 book Statistical Mechanics "is the work of an old man in
rapidly failing health, with only one more year to live. Inevitably, some arguments are left imperfect
and incomplete towards the end of the work". Gibbs provided a famous explanation of the paradox
bearing his name at the end of this book. After reading my paper [1] on the Gibbs paradox, I asked him
to provide his most recent opinion about it, and Jaynes's immediate answer was only that he was
under intensive nursing care (possibly in a hospital). His situation was that of Gibbs in 1902. About
two months later, Jaynes's colleague checked his e-mail inbox and told me in an e-mail that he had
passed away.
Since then, several papers and a special issue on this topic have been published in Entropy. The
following paragraphs are basically what I would like to contribute to the Wikipedia article Gibbs
Paradox.
Today, the debate around this problem continues [8-10]. When Gibbs' paradox is discussed, the
correlation of the entropy of mixing with similarity is always very controversial and there are three
very different opinions regarding the entropy value as related to the similarity (Figures 1a, 1b and 1c).
Similarity may change continuously: similarity Z = 0 if the components are distinguishable; similarity
Z = 1 if the parts are indistinguishable.
Figure 1. Entropy of mixing and similarity.
Entropy of mixing does not change continuously in the Gibbs paradox (Figure 1a). Let us recall that
the entropy of mixing of liquids, solids and solutions has been calculated in a similar fashion and the Gibbs
paradox can be applied to liquids, solids and solutions in condensed phases as well as the gaseous
phase.
In his book Mathematical Foundations of Quantum Mechanics [11] John von Neumann provided, for
the first time, a resolution to the Gibbs paradox by removing the discontinuity of the entropy of mixing:
it decreases continuously with the increase in the property similarity of the individual components (see
Figure 1b). On page 370 of the English version of this book [11] it reads that "... this clarifies an old
paradox of the classical form of thermodynamics, namely the uncomfortable discontinuity in the
operation with semi-permeable walls... We now have a continuous transition."
Another entropy continuity relation has been proposed by me [1-4], based on information theory
considerations, as shown in Figure 1c. A calorimeter might be employed to determine the entropy of
mixing and to either verify the proposition of the Gibbs paradox or to resolve it.
Unfortunately it is well-known that none of the typical mixing processes, whether they are in the gaseous
phase or in liquid or solid phases, have a detectable amount of heat and work transferred, even though a
large amount of heat, up to the value calculated as TΔS (where T is temperature and S is
thermodynamic entropy), should have been measured and a large amount of work up to the amount
calculated as ΔG (where G is the Gibbs free energy) should have been observed. We may have to,
rather reluctantly, accept the simple fact that the thermodynamic entropy change of mixing of ideal
gases is always zero, whether the gases are different or identical (this conclusion might be taken as an
experimental resolution of the Gibbs paradox for ideal gases). This suggests that entropy of mixing has
nothing to do with energy (heat TΔS or work ΔG). A mixing process may be a process of
information loss which can be pertinently discussed only in the realm of information theory, and
the entropy of mixing is an (information theory) entropy. Instead of calorimeters, chemical sensors or
biosensors can be used to assess the information loss during the mixing process. Mixing 1 mol of gas A
and 1 mol of a different gas B will produce an increase of at most 2 bits of (information theory) entropy if
the two parts of the gas container are used to record 2 bits of information (I). For mixing ideal gases,
if the entropy is regarded as an (information theory) entropy, von Neumann's relation given in Figure
1b is valid.
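One way to count the 2 bits in the container example is as a sketch, with the assumption that each half of the container acts as a one-bit record of which gas it holds: before mixing each half's contents are certain (zero Shannon entropy, the two halves record 2 bits of information), and after mixing each half is a 50/50 mixture (one bit of entropy per half).

```python
from math import log2

def label_entropy(p_a: float) -> float:
    """Shannon entropy (bits) of one container half whose contents are
    gas A with probability p_a and gas B otherwise."""
    if p_a in (0.0, 1.0):
        return 0.0  # contents certain: the half records one full bit of information
    p_b = 1.0 - p_a
    return -(p_a * log2(p_a) + p_b * log2(p_b))

# Before mixing: each half definitely holds one gas -> 0 bits of entropy per half.
before = 2 * label_entropy(1.0)
# After mixing: each half is a 50/50 mixture -> 1 bit of entropy per half.
after = 2 * label_entropy(0.5)
print(after - before)  # 2.0 bits of (information theory) entropy gained
```

The 2 bits of information recorded by the labeled halves are exactly the 2 bits of (information theory) entropy gained on mixing, matching ΔS = −ΔI.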
For condensed phases, however, instead of the word "mixing", the word "merging" can be used for the
process of combining several parts of a substance originally in several containers. Then, it is always a
merging process, whether the substances are very different, very similar or even the same. The
conventional way of calculating the entropy of mixing would predict that the mixing (or merging) process
of different (distinguishable) substances is more spontaneous than the merging process of the same
(indistinguishable) substances. However, this contradicts all the observed facts in the physical world,
where the merging process of the same (indistinguishable) substances is the most spontaneous one;
immediate examples are the spontaneous merging of oil droplets in water and spontaneous crystallization,
where the indistinguishable unit lattice cells ensemble together. More similar substances are more
spontaneously miscible. The two liquids methanol and ethanol are miscible because they are very
similar. Without exception, all the experimental observations support the entropy-similarity relation
given in Figure 1c. It follows that the entropy-similarity relation of the Gibbs paradox given in Figure 1a is
questionable. A significant conclusion is that, at least in the solid state, the entropy of mixing is a
negative value for distinguishable solids: mixing different substances will decrease the (information
theory) entropy, and the merging of the indistinguishable molecules (from a large number of
containers) to form a phase of pure substance has a great increase in static entropy — an (information
theory) entropy [1-4]. Starting from a binary solid mixture, the process of merging 1 mol of molecules
A to become one phase and merging 1 mol of molecules B to form another phase leads to an
(information theory) entropy increase of 2 × 6.022×10^23 bit = 12.044×10^23 bit = 1.506×10^23 Byte, where
6.022×10^23 is Avogadro's number; and there will be at most only 2 bits of information (I) left. In
conclusion, substances spontaneously mixed in the gaseous state can be spontaneously separated in
condensed phases (solid or liquid states), driven only by information loss or by the increase in
(information theory) entropy. However, none of these typical pure mixing or separation processes is
driven by free energy minimization, and the free energy (or total amount of chemical potential) does not
change during the processes of ideal mixture formation or ideal mixture separation. The
thermodynamic entropy change for the formation of ideal mixtures of gases, liquids or solids is always
zero.
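The arithmetic in the phase-separation estimate above — one bit per molecule for 2 mol of molecules — can be checked directly:

```python
N_A = 6.022e23  # Avogadro's number, molecules per mole

# One bit per molecule: 1 mol of A merging into one phase plus
# 1 mol of B merging into another phase.
bits = 2 * N_A            # 2 x 6.022e23 = 12.044e23 bit
bytes_ = bits / 8         # 8 bits per byte -> ~1.506e23 Byte
print(f"{bits:.4e} bit = {bytes_:.4e} Byte")
```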
There are several outstanding problems (such as the symmetry breaking problem and related problems [8-10,
12-19]) which might be considered in information theory. I introduced information theory
concepts to the studies of structural stability and process spontaneity, tried hard to attack these
problems myself before launching this journal 10 years ago, and tried to present my
own main ideas unambiguously. Now I am ready to continue to edit and publish many papers from other scientists in
the following volumes. Contributions of papers are welcomed.
Acknowledgements: I am grateful to my longtime colleague Dr. Derek McPhee for his collaboration
and assistance. Derek corrected the English for this commentary.
The Gibbs Paradox
Problem:
Two ideal gases, say, half a mole of hydrogen and half a mole of helium, are allowed to expand freely
into the volume occupied by the other. The temperature is held constant and each gas is at an initial
pressure, p. What is the entropy change due to the free expansion?
Solution:
Applying the T dS equations, we have for each gas:

ΔS_i = n_i R ln(V_f / V_0)

From the ideal gas law, the initial volume of each gas is V_0 = n_i RT/p with n_i = 0.5 mol, and each
gas expands into twice that volume, V_f = 2V_0. Thus, the entropy change for each gas is ΔS_i =
−0.5 R ln(0.5). The total entropy change is thus

ΔS_mixing = 2 × (−0.5 R ln 0.5) = R ln 2

Thus when two gases mix, the entropy change is the same as in a free expansion of each at constant
temperature and equal initial pressure.
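The numbers work out as follows; this is a direct evaluation of the isothermal ideal-gas formula used above.

```python
from math import log

R = 8.314  # molar gas constant, J/(mol K)

def delta_s_expansion(n: float, v_ratio: float) -> float:
    """Isothermal ideal-gas entropy change: dS = n R ln(V_final / V_initial)."""
    return n * R * log(v_ratio)

# Each half-mole of gas doubles its volume on free expansion:
ds_each = delta_s_expansion(0.5, 2.0)   # = 0.5 R ln 2 per gas
ds_total = 2 * ds_each                  # = R ln 2 for the mixing
print(ds_total)        # ~5.76 J/K
print(R * log(2))      # the same value, R ln 2
```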
E%uation. 4hat if the the t!o gases are the same: 3s there an entropy inrease:
-ns!er. 3n this ase# there is no entropy hange as the free expansion of idential gases has no
meaning" Ho!e$er# if the gases are tagged in some !ay# then the entropy hange is reali7ed" The Gibbs
Paradox is <ust this anomalous result !hen the gases beome indistinguishable"
