
It was recently brought to my attention that my understanding of entropy is wonky at best.

In my experience, entropy was introduced (superficially at best) during general chemistry/foundations of inorganic chemistry and described in terms of the order/disorder of a system, usually followed by some messy-room analogy. In one of my textbooks, the author, Gary Wulfsberg, discusses how a reaction tends to favor the products "if it increases the dispersal of ions/molecules over a larger volume of space..." (Wulfsberg, G., 2018). Wulfsberg goes on to say that

The measure of this dispersal or disorder is known as the entropy (S) of the substance. A positive entropy change for a reaction indicates increasing dispersal or disorder (Wulfsberg, G., Foundations of Inorganic Chemistry; ch. 4, pg. 200)

which I interpreted as him saying that all systems that increase in entropy have a corresponding increase in disorder.

Later in my undergraduate career, Boltzmann's entropy was introduced during thermodynamics and described by my professor as the number of microstates available to the particles within the system, with an increase in entropy corresponding to an increase in the number of states the system can exist in.

All was fine and peachy until recently, when I came across an article that discusses the Shannon Measure of Information (SMI) and entropy. In it, Arieh Ben-Naim argues that the idea of order/disorder in relation to entropy is not necessarily correct; in fact, it is a fallacy that does not hold in general, nor can order/disorder be measured definitively for all systems. He states,

It is true that many spontaneous processes may be viewed as proceeding from an ordered to a disordered state. However, there are two difficulties with this interpretation. First, the concept of order is not well-defined, and in many processes it is difficult, if not impossible, to decide which of the two states of the system is more or less ordered.
J. Chem. Educ. 2011, 88 (5), 594–596

Additionally, he points out that while some systems do have "order parameters", these are not related to entropy, and that not every process with an observed increase in entropy has a corresponding increase in disorder. He later describes the SMI treatment of entropy "as the number of binary questions one needs to ask to find the location of the particle." Hence, if the number of yes/no questions one needs to ask to find the location of a particle increases, so has the thermodynamic entropy.
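Ben-Naim's "binary questions" picture can be made concrete with a short sketch (the function and the cell counts here are my own choices for illustration, not from the article):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A particle equally likely to occupy any of 8 cells: locating it takes
# log2(8) = 3 optimally chosen yes/no questions on average.
h_8 = shannon_entropy([1/8] * 8)

# Doubling the accessible volume to 16 cells adds exactly one question
# (one bit), mirroring the entropy increase on expansion.
h_16 = shannon_entropy([1/16] * 16)
```

For a uniform distribution over $N$ cells this reduces to $\log_2 N$ bits, which is why doubling the volume adds one yes/no question.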

So here are my questions:

Is entropy described by Boltzmann/statistical mechanics the same as the entropy described by Shannon information theory?

Additionally, is there any validity in relating order/disorder in describing the change in entropy of chemical systems?

Karsten
  • Thanks for raising this query. Hope we will see a lot of activity and various viewpoints. The recent terminology in Atkins and other textbooks of entropy as "dispersal of energy" is due to the efforts of Frank L. Lambert. Many textbooks adopted his terminology. When he was alive, he used to discuss this in a chemistry education forum. If I understood him correctly at the time, he meant that entropy is a measure of how many ways energy can be distributed, hence the phrase "dispersal of energy". – AChem Sep 22 '20 at 03:14
  • The biggest problem was created by Shannon himself by using the same word "entropy", which really led to these "messy room" and "shuffled cards" stories. – AChem Sep 22 '20 at 03:15
  • @M.Farooq Absolutely, it bothered me that my understanding of such a fundamental concept was wrong. I could blame my teachers, but it really is the individual's responsibility to question things and ensure the validity of one's claims. I appreciate your comments/information –  Sep 22 '20 at 03:21
  • Let's distinguish "the fundamental concept is wrong" from "the often used, simplistic explanation for a fundamental concept is wrong/totally misleading". Two very different situations. Entropy has a very solid definition in statistical mechanics and thermodynamics, and these definitions hold up well in chemistry, too. The problem is when people want to explain everything at a five-year-old's level: more often wrong than enlightening. Entropy as disorder is a very unhelpful concept in chemical systems. – Greg Sep 22 '20 at 03:54
  • I think my comment was misinterpreted. That is meant as I was wrong, not the concept of entropy itself. That was a reflection of my own misunderstanding. Unfortunately I couldn’t edit the comment beyond five minutes after posting –  Sep 22 '20 at 03:56
  • This is by the way a huge discussion that is somehow ongoing. In my modest opinion it is just the term disorder that is ill-defined. A big stadium with numbered places and supporters owning a ticket for a specific place can be seen as full of entropy but also very ordered. Yet the random selection of the place should increase both, if I am right. I am also on the line of @Greg. – Alchimista Sep 22 '20 at 10:37
  • Above, read "should increase both" as "should increase both entropy and disorder, if...". – Alchimista Sep 22 '20 at 11:00
  • I'm with @Greg on this. Originally it was described (by Boltzmann?) as 'a measure of disorder', but the 'measure of' got lost along the way, hence the confusion. Why not either think of it in the classical thermodynamic sense only, as $\pm T\Delta S$ related to heat, or as the number of ways of placing particles into energy levels/boxes etc., which is basically Boltzmann's approach, when an atomistic approach is needed. – porphyrin Sep 22 '20 at 11:10
  • @Alchimista There are several nice examples of entropic effects in chemistry, e.g. in a chemical equilibrium. For example, a rotating methyl group in a solid crystal, which you can call disorder if you want. But entropic factors can also come, e.g., from the high spin/orbital multiplicity of an electronic state or the high symmetry of a structure. It is very counter-intuitive to label the high-symmetry cases as "disorder". – Greg Sep 22 '20 at 14:18
  • @dval98 This question has been flagged for needing more focus. wanted to let you know. IMO, this can be fixed as follows. Q2 and Q3, Q5 are opinion based and can be removed. However an answer to Q.5 can be asked as a different question using the tag [reference-request]. Merging Q.1 and Q.4 would make the question focused. However, this last idea may be wrong since I do not know enough to discern between the two. If they are not same, ask the two as two different question linking the two together. For Q.5: https://chemistry.stackexchange.com/questions/37303/resources-for-learning-chemistry – Safdar Faisal Sep 22 '20 at 15:15
  • I also recommend reading this article: https://pubs.acs.org/doi/abs/10.1021/ed079p187 – Galen Sep 22 '20 at 15:45
  • @Safdar I apologize for that. Please let me know if my edits will suffice. –  Sep 22 '20 at 16:04
  • This seems to be a very rampant bee in some highschool/undergrad teachers' bonnets. The explanation with order/disorder is qualitatively right and useful, and that's that. The Boltzmann/thermodynamic/statistical interpretation is, beyond doubt, quantitatively correct. – Karl Sep 22 '20 at 18:24
  • @Greg I copy that, and nothing in my comment is contrary to your last comment. But I can add more: the idea entropy → disorder seems also useful. It should be taken with a grain of salt. In the end we might say the universe is going to be well ordered, in the sense that it is going to be empty and dark. Not to open a discussion on cosmology, but to put things to the extreme. What remains certain is that there will be no energy to extract. – Alchimista Sep 22 '20 at 18:27
  • I wonder why this question was closed. The OP has edited it to be more focused, and it seems a very legitimate question how a very common (and often taught) interpretation of a central thermodynamic concept holds up. – Greg Sep 23 '20 at 03:30
  • @Greg, the question was closed before the last revision; you can vote to reopen as the question is focused now. – Safdar Faisal Sep 23 '20 at 10:09

1 Answer


The association of entropy with disorder is anthropocentric. "Disorder" is a concept derived from our experience of the world. The association with "disorder" is clearer once we explain what we mean by "order". The following definition among those provided by Merriam-Webster most closely fits the intended meaning:

a regular or harmonious arrangement

Since greater regularity, all else being equal, implies lower entropy, it is fair to associate "order" with lower entropy.

The association with "order" or regularity also squares with Boltzmann's statistical mechanical definition of entropy ($S= k_\mathrm B \log \Omega$), where $\Omega$ is the number of microstates available to the system. More possible unique microstates implies a higher entropy. Greater regularity implies more constraints on the arrangement of the system and therefore fewer possible microstates. Solids are usually more regular and therefore have lower entropy than fluid states at the same temperature; the same holds when comparing liquids and gases. The number of possible microstates increases from solid to liquid to gas.
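As a sketch of how that counting works, here is a toy lattice-gas model (my own choice of illustration, not from the text above): count the ways of placing indistinguishable particles on a lattice and take $S = k_\mathrm B \ln \Omega$.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def lattice_gas_entropy(n_sites: int, n_particles: int) -> float:
    """S = k_B * ln(Omega), with Omega = C(n_sites, n_particles):
    the number of ways to place indistinguishable particles on a lattice."""
    omega = math.comb(n_sites, n_particles)
    return K_B * math.log(omega)

# A constrained, "solid-like" arrangement versus a dispersed, "gas-like" one:
s_constrained = lattice_gas_entropy(20, 10)   # few accessible sites
s_dispersed = lattice_gas_entropy(100, 10)    # same particles, more sites
# s_dispersed > s_constrained: more microstates means higher entropy.
```

Relaxing a constraint (here, allowing more sites) always grows $\Omega$ and hence $S$, which is the sense in which regularity means lower entropy.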

This also jibes with the informational-content definition. You can use less information (a more compact description) to describe an orderly (regular) system. You need more information to describe all possible arrangements of molecules in a gas or liquid than in a solid. Think of entropy as measuring the length of the recipe required to build all possible arrangements of the system.
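As a loose illustration of that "length of the recipe" idea (a crude analogy, not a rigorous entropy calculation; the strings and the use of zlib are my own choices), a regular string compresses to a far shorter description than an irregular one of the same length:

```python
import random
import zlib

# An "orderly" (regular) byte string versus a pseudo-random one, equal length.
ordered = b"AB" * 500                       # highly regular, 1000 bytes
rng = random.Random(0)                      # seeded for reproducibility
disordered = bytes(rng.randrange(256) for _ in range(1000))

# Compressed size serves as a rough stand-in for "description length".
len_ordered = len(zlib.compress(ordered))
len_disordered = len(zlib.compress(disordered))
# The regular string needs a much shorter "recipe" than the irregular one.
```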

Buck Thorn
  • How would we define regularity at the molecular level? Is it periodicity? How would you explain regularity in liquids vs. gases? – AChem Sep 22 '20 at 18:54
  • @M.Farooq To describe a gas you need to specify more possible arrangements, each unique, than for a liquid. Similarly on going from solid to liquid (where the difference is more striking). For a simple description of a solid, all possible arrangements can be considered permutations of one template. Once you describe that one template and the way of permuting positions you are done. For liquids you are somewhere in between. You see this e.g. in g(r). – Buck Thorn Sep 22 '20 at 19:33
  • The topic deserves going through the examples in more detail, including those in the comments to the OP. I find the idea of dispersal of energy appealing at times but not entirely satisfying. For instance, in an isolated system (say a gas in an adiabatic box) what exactly do you mean by energy "dispersal"? Are you talking about the kinetic theory of gases? The Boltzmann distribution? – Buck Thorn Sep 22 '20 at 19:38
  • Yes, I am not fully satisfied with the wording of "energy dispersal." Frank Lambert, who was behind this terminology, indeed did a big service by getting the order-disorder mythology out of many textbooks. On the other hand, he also introduced the "energy dispersal" concept. Which is the bigger evil, entropy as disorder or energy dispersal? Only time will tell. – AChem Sep 22 '20 at 19:53
  • Trying to read the original work of Clausius will require a year to understand where he was coming from and what was going on in his mind when he came up with this idea. His book is available in English. – AChem Sep 22 '20 at 19:54
  • @M.Farooq It seems you are channeling Kuhn and other philosophers. Granted it is a chore to read something written in those days. Clausius was apparently mathematically minded and attempted to derive thermodynamic principles on a sound mathematical basis. Therefore dense and tough going for non-mathematicians. It would take some time to become comfortable with his style but maybe not a whole year :-) – Buck Thorn Sep 23 '20 at 07:54