
Entropy: Disorder and the unavailability of energy – A short review

Contents

Definition of entropy

Significance of entropy in thermodynamics

How does thermodynamics look at entropy?

Entropy and disorder

Boltzmann's entropy

What are 'macrostates' and 'microstates'?

Entropy, microstate, disorder, and equilibrium

Patterns in the entropies of substances

Entropy and Free energy changes in chemical reactions

Entropy and 2nd law of thermodynamics

Entropy changes in reversible and irreversible processes

Irreversibility, Entropy Changes, and 'Lost Work'

There are some subjects that never grow old. We return to them after a while and rediscover their different dimensions again and again. Entropy is one such fascinating subject. Although the concept was first conceived by Rankine around 1850 and developed and named by Rudolf Clausius in 1865, while they were trying to understand "lost work", entropy has remained a mystery with many unanswered questions.

What is entropy?

The concept of entropy is described by two principal approaches: classical thermodynamics and statistical mechanics. The classical approach defines entropy in terms of macroscopically measurable physical properties, such as bulk mass, volume, pressure, and temperature. The statistical definition expresses it in terms of the statistics of the motions of the microscopic constituents of a system, such as photons, phonons, and spins. The two approaches form a consistent, unified view of the same phenomenon, as expressed in the second law of thermodynamics, which says, in very simple language at least to begin with,

“There is a tendency in nature for systems to proceed toward a state of greater disorder or randomness. Entropy is a measure of the degree of randomness or disorder of a system”.

Definition of entropy

In simple language, heat transfer from hot to cold is related to the tendency in nature for systems to become disordered and for less energy to be available for use as work. The entropy of a system is a measure of its disorder and of the unavailability of energy to do work.

Significance of entropy in thermodynamics

The simple definition of energy is the ability to do work. Entropy is a measure of how much energy is not available to do work. Although all forms of energy are interconvertible, and all can be used to do work, it is not always possible, even in principle, to convert the entire available energy into work. Since thermodynamics deals with the conversion of heat to work, this unavailable energy became an interesting part of thermodynamics. It attracted the interest of legendary scientists like Carnot, Rankine, Clausius, and Boltzmann, and 'entropy' became a fascinating subject that we study again and again, trying to understand and discover more in what they said.

How thermodynamics looks at entropy

The credit goes to Boltzmann and his famous equation S = k_B ln W.

The association of entropy with disorder is a matter of much confusion and is still questioned. While most of the other concepts of the subject (such as temperature, density, or energy) are well understood, entropy is a rather complexly defined concept. The fact that it cannot be measured directly (there are no "entropy meters") adds to the confusion for many.

Thermodynamics textbooks associate disorder with entropy.

“Entropy is a property of matter that measures the degree of randomization or disorder. Whenever molecular chaos is produced, the ability to do useful work is reduced.”

No doubt we wonder how entropy, disorder, and lost work are related. No textbook gives a clear and straightforward answer. But that does not stop you from finding your own interpretation if you read the subject again and again. Let us try one example.

Imagine a bowling alley.

You throw five balls and let another person throw the same number of balls from the other side. When these ten balls start hitting each other, there is maximum disorder. The movement of the balls decreases and approaches equilibrium. When this happens, the balls have reached the maximum state of disorder, or entropy, with a limited ability to move or do work. When something cannot move, or its ability to move is reduced, its ability to do work is reduced. Nearly all of their kinetic energy is lost in the chaos and disorder as they move towards a state of equilibrium. This is entropy. This is entropy's relation with disorder and lost work. This is entropy's relation with equilibrium.

While the origin of ‘entropy’ is still debated, entropy remains the basis of the 2nd law of thermodynamics. We intend to focus more on what is known about entropy rather than opening another debate.

Entropy and disorder are associated with equilibrium. According to the Gibbs equation, ΔG = ΔH − TΔS. ΔG is the amount of "free" or "useful" energy available to do work. A system is considered to have reached equilibrium when ΔG = 0. As the term TΔS, the product of the temperature T and the entropy change ΔS, grows towards ΔH, ΔG shrinks; when the two are equal, ΔG = 0. This means the system has reached equilibrium and has no energy left to do useful work. Therefore, the entropy term is a measure of how far the system is from equilibrium. This is another way of explaining entropy.
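To make the Gibbs relation concrete, here is a minimal Python sketch of ΔG = ΔH − TΔS; the numerical values of ΔH and ΔS are made-up placeholders for illustration, not data from this article:

```python
def gibbs_free_energy(delta_h: float, delta_s: float, temperature: float) -> float:
    """Return dG = dH - T*dS (SI units: J, J/K, K)."""
    return delta_h - temperature * delta_s

# Hypothetical values, chosen only for illustration.
delta_h = 50_000.0        # J, enthalpy change
delta_s = 150.0           # J/K, entropy change
t_eq = delta_h / delta_s  # temperature at which dG = 0 (equilibrium)

for t in (200.0, t_eq, 500.0):
    dg = gibbs_free_energy(delta_h, delta_s, t)
    if abs(dg) < 1e-6:
        state = "equilibrium: no energy left for useful work"
    elif dg < 0:
        state = "free energy available to do work"
    else:
        state = "non-spontaneous"
    print(f"T = {t:6.1f} K -> dG = {dg:10.1f} J ({state})")
```

As TΔS approaches ΔH, ΔG shrinks to zero and the system reaches equilibrium, exactly as described above.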

Entropy and disorder

The association of entropy with disorder started with Boltzmann's equation. Boltzmann's equation, also known as the Boltzmann–Planck equation, is a probability equation relating the entropy S to the number of microstates: S = k_B ln W. Here, k_B is the Boltzmann constant, equal to 1.380649 × 10^-23 J/K, and W is the number of microstates corresponding to the gas's macrostate. In short, a microstate is a specific microscopic configuration of a thermodynamic system that the system may occupy with a certain probability in the course of its thermal fluctuations. In contrast, the macrostate of a system refers to its macroscopic properties, such as its temperature, pressure, volume, and density. Essentially, a microstate is a way the atoms or molecules can arrange themselves in a macroscopic thermodynamic system.
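As a quick numerical illustration of S = k_B ln W, here is a minimal Python sketch; the microstate counts W are arbitrary illustrative numbers:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates: int) -> float:
    """Boltzmann-Planck relation: S = k_B * ln(W), in J/K."""
    return K_B * math.log(num_microstates)

# Entropy grows only logarithmically with the number of microstates;
# W = 1 (a single possible arrangement) gives S = 0.
for w in (1, 10, 10**6, 10**23):
    print(f"W = {w:>24} -> S = {boltzmann_entropy(w):.3e} J/K")
```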

Boltzmann's entropy

Boltzmann made the concept of entropy more understandable.

Boltzmann's entropy describes the system when all the accessible microstates are equally likely. When this happens, the system is in maximum disorder and is at equilibrium (see the Gibbs equation explained above). When the randomness or disorder is maximum, there is a lack of distinction (or information) between the individual microstates.

Therefore, Boltzmann's equation connects entropy with microstates, and the Gibbs equation connects entropy with free energy: when there is maximum disorder, there is maximum entropy and the system is near equilibrium. The entropy change is a measure of how far the system is from equilibrium; indeed, the entropy change matters more than the absolute value of the entropy.

What are 'macrostates' and 'microstates'?

A macrostate is the thermodynamic state of a system described by P, V, T, H (enthalpy), and the number of moles of each constituent. Thus, a macrostate does not change over time.

In contrast, a microstate of a system is all about time and the energy of the molecules in that system. In a system, energy is constantly being redistributed among its particles. Each specific arrangement of the energy of each molecule in the whole system at one instant is called a microstate. As one molecule hits another, it slows down while the other speeds up. At every instant, there is a new microstate.

Entropy, microstate, disorder, and equilibrium

I will repeat the bowling alley example used above; I find it very appropriate for understanding entropy.

Example: Imagine a bowling alley. You throw five balls straight along the track. It is very unlikely that there will be many collisions. Now another person throws five balls from the opposite direction. There are now five balls moving in one direction and five balls moving in the opposite direction. Every time a ball collides with another, it slows down and the other speeds up. This is a new microstate for each ball at every instant. At every instant, the position of each ball changes with respect to the others. Now imagine all ten balls hitting each other randomly. There will be chaos and disorder. When the disorder or chaos becomes maximum, you move towards a state of equilibrium: no ball will be able to move freely. As disorder increases, the balls lose the freedom to move freely, and their ability to do work becomes increasingly limited. Thus, you can see how disorder can increase entropy and push a system towards equilibrium. The same thing happens with atoms or molecules: in a given space, if you increase the temperature, and hence the kinetic energy, the entropy increases. You can imagine what must be happening when 6.022 × 10^23 molecules per mole of gas are hitting each other.
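To turn this picture into actual microstate counting, here is a minimal Python sketch using the textbook Einstein-solid toy model (N oscillators sharing q indivisible energy quanta), a model chosen here for illustration rather than one used in the article. Its multiplicity is W(N, q) = C(q + N − 1, q), and S = k_B ln W grows as the same energy is spread over more particles:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(n_oscillators: int, n_quanta: int) -> int:
    """Number of microstates for n_quanta shared among n_oscillators."""
    return math.comb(n_quanta + n_oscillators - 1, n_quanta)

def boltzmann_entropy(n_oscillators: int, n_quanta: int) -> float:
    """S = k_B * ln(W), in J/K."""
    return K_B * math.log(multiplicity(n_oscillators, n_quanta))

# Same total energy (30 quanta); spreading it over more oscillators
# means more microstates, hence higher entropy ('more disorder').
for n in (3, 10, 30):
    w = multiplicity(n, 30)
    s = boltzmann_entropy(n, 30)
    print(f"N = {n:2d}, q = 30 -> W = {w:>15,}  S = {s:.3e} J/K")
```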

Patterns in the entropies of substances

Please refer to the table below. Some patterns emerge when these values are compared.

-The entropies of gases are much larger than those of liquids, which are larger than those of solids (columns 1, 3, and 4). Solids have the least entropy, gases have the most, and liquids come in the middle. Liquid molecules are not held as rigidly as those in a solid: the atoms or molecules of a liquid are free to move around with respect to each other, while gas molecules are more disordered still. The atoms in solids are constrained to one position; they can only vibrate around that position. Therefore, solids have the least entropy.

-Entropies of large, complicated molecules are greater than those of smaller, simpler molecules (column 2).

-Large, complicated molecules have more disorder because of the greater number of ways they can move around in three-dimensional space.

[Table of standard entropy values referenced above; image not reproduced here.]

-Entropies of ionic solids are larger when the bonds within them are weaker (columns 3 and 4).

If you think of ionic bonds as springs, a stronger bond will hold the ions in place more firmly than a weaker bond. Therefore, the stronger bond causes less disorder and less entropy.

Two more patterns emerge from considering the implications of the first three.

-Entropy usually increases when a liquid or solid dissolves in a solvent.

Before mixing, the solute and solvent are completely separated from each other. After mixing, they are completely interspersed within each other. Thus, the entropy increases.

-Entropy usually decreases when a gas dissolves in a liquid or solid.

Entropy and Free energy changes in chemical reactions

The entropy change ΔS and the enthalpy change ΔH are the two driving forces of a chemical reaction: ΔG = ΔH − TΔS. A reaction can happen spontaneously only if ΔG is negative, so both ΔH and ΔS determine whether this is the case. A sketch of this sign analysis is given below.
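Here is a minimal Python sketch of that sign analysis; the four ΔH/ΔS sign combinations mirror the table below, and the numbers are illustrative assumptions:

```python
def classify_reaction(delta_h: float, delta_s: float, t: float) -> str:
    """Classify by the sign of dG = dH - T*dS at temperature t (kelvin)."""
    dg = delta_h - t * delta_s
    if dg < 0:
        return f"dG = {dg:+.0f} J: spontaneous"
    return f"dG = {dg:+.0f} J: non-spontaneous"

# The four classic sign combinations, evaluated at T = 298 K.
cases = [(-100e3, +200.0), (-100e3, -200.0), (+100e3, +200.0), (+100e3, -200.0)]
for dh, ds in cases:
    print(f"dH = {dh:+9.0f} J, dS = {ds:+6.1f} J/K -> "
          f"{classify_reaction(dh, ds, 298.0)}")
```

Note that for the mixed-sign cases the outcome depends on temperature: a positive ΔH can be overcome by a positive ΔS at high enough T.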

Chemical reactions tend to increase the total entropy of the system.

[Table relating the signs of ΔH and ΔS to the sign of ΔG; image not reproduced here.]

The table shows how entropy change can contribute to the free energy change of a chemical reaction.

Entropy and 2nd law of thermodynamics

The first and second laws of thermodynamics

The first law of thermodynamics defines the relationship between the various forms of energy present in a system (kinetic and potential), the work which the system can perform, and the transfer of heat. The law states that energy is conserved in all thermodynamic processes.

There are two major things missing from the first law of thermodynamics: [1] every thermodynamic system has surroundings, since a truly isolated system on earth is a theoretical concept, and [2] it makes no distinction between reversible and irreversible processes.

The limitations of the first law of thermodynamics:

-It does not say anything about the direction of the flow of heat.

-It does not say anything about whether the process is spontaneous or not.

Examples

A cup of hot coffee left in a cooler room eventually cools off. The reverse of this process, the coffee getting hotter as a result of heat transfer from a cooler room, does not take place.

Water flows downhill, whereby potential energy is converted into kinetic energy. The reverse of this process does not occur in nature.

The fact is, heat does not convert completely into work. If it were possible to convert all heat into work, we could drive ships across the ocean by extracting heat from the ocean's water.

Another example

When a hot object is put in contact with a cold object, they eventually reach the same equilibrium temperature. If we then separate the objects, they do not naturally return to their original (different) temperatures.

Famous scientists like Clausius, Kelvin, Carnot, and Boltzmann understood this conflict and proposed various forms of another law of thermodynamics, known as "the second law of thermodynamics". The description of the second law started with the definition of a new state variable called entropy; that is how the entropy concept was born. The second law states that there exists a useful state variable called entropy.

The second law of thermodynamics

According to the 2nd law of thermodynamics, the change in entropy (ΔS) is equal to the heat transfer (ΔQ) divided by the temperature (T): ΔS = ΔQ/T. For a given physical process, the combined entropy of the system and the environment remains constant if the process can be reversed. If we denote the initial and final states of the system by "i" and "f": Sf = Si (reversible). An example of a reversible process is ideally forcing a flow through a constricted pipe ("ideal" means no boundary-layer losses). As the flow moves through the constriction, the pressure, temperature, and velocity change, but these variables return to their original values downstream of the constriction. The state of the gas returns to its original conditions, and the change of entropy of the system is zero: ΔS = 0.

The second law states that if the physical process is irreversible, the combined entropy of the system and the environment must increase; the final entropy must be greater than the initial entropy: ΔS > 0. An example is when a hot object is put in contact with a cold object: eventually, they both reach the same equilibrium temperature. If we then separate the objects, they do not naturally return to their original (different) temperatures. The process of bringing them to the same temperature is irreversible.

The application of the second law tells us why heat is transferred from the hot object to the cold object. Let us assume that heat is transferred from the hot object (object 1, initially at temperature T1) to the cold object (object 2, initially at temperature T2). The amount of heat transferred is Q, and the final equilibrium temperature of both objects we will call Tf. The temperature of the hot object changes as heat is transferred away from it. The average temperature of the hot object during the process we will call Th; it is the average of T1 and Tf. Similarly, for the cold object, the average temperature during the process is Tc, the average of Tf and T2.

The entropy change for the hot object will be (-Q/Th), with the minus sign applied because the heat is transferred away from the object. For the cold object, the entropy change is (Q/Tc), positive because the heat is transferred into the object.

So the total entropy change for the whole system is given by the equation Sf = Si − Q/Th + Q/Tc, with Si and Sf being the initial and final values of the entropy. Th will always be greater than Tc, because T1 is greater than T2. So the term (Q/Tc) will always be greater than (Q/Th), and therefore Sf will be greater than Si, as the second law predicts.
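Here is a minimal numeric sketch of that two-object argument in Python; the temperatures and heat quantity are arbitrary illustrative values:

```python
def total_entropy_change(q: float, t1: float, t2: float, tf: float) -> float:
    """dS = -Q/Th + Q/Tc for heat Q flowing hot -> cold, using the
    average temperatures Th and Tc described above."""
    th = (t1 + tf) / 2.0  # average temperature of the hot object
    tc = (tf + t2) / 2.0  # average temperature of the cold object
    return -q / th + q / tc

# Illustrative values: hot object at 400 K, cold at 300 K, meeting at 350 K.
ds = total_entropy_change(q=1000.0, t1=400.0, t2=300.0, tf=350.0)
print(f"dS_total = {ds:+.4f} J/K")  # positive, as the second law predicts
```

Reversing the assumed direction of heat flow flips both signs and yields a negative total, which is exactly the violation described next.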

If, instead, we had assumed that the heat was being transferred from the cold object to the hot object, our final equation would be Sf = Si + Q/Th − Q/Tc. The signs on the terms would be changed because of the direction of the heat transfer. Th would still be greater than Tc, and this would result in Sf being less than Si. The entropy of the system would decrease, which would violate the second law of thermodynamics.

Irreversibility, Entropy Changes, and 'Lost Work'

Reversible process

A reversible process is one carried out in tiny steps after which, when undone, both the system and the surroundings (that is, the world) remain unchanged. Although a truly reversible change cannot be realized in practice, it can always be approximated.

As a process is carried out in a more nearly reversible manner, the work w approaches its maximum possible value, and the heat q approaches its minimum possible value.

Although q is not a state function, the quotient q_rev/T is, and it is known as the entropy change.

Irreversible process

This is just the opposite.

The most widely cited example of an irreversible change is the free expansion of a gas into a vacuum. Although the system can always be restored to its original state by recompressing the gas, this would require that the surroundings perform work on the gas. Since the gas does no work on the surroundings in a free expansion (the external pressure is zero, so PΔV = 0), there will be a permanent change in the surroundings.
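For an ideal gas, the entropy increase of a free expansion can be computed from the standard relation ΔS = nR ln(V2/V1), a textbook result that is not stated in the article itself. A minimal Python sketch:

```python
import math

R = 8.314  # ideal-gas constant, J/(mol*K)

def free_expansion_entropy(n_moles: float, v_initial: float, v_final: float) -> float:
    """dS = n * R * ln(V2/V1) for an ideal gas expanding into a vacuum."""
    return n_moles * R * math.log(v_final / v_initial)

# 1 mol of gas doubling its volume into a vacuum (illustrative values):
# entropy rises even though q = 0 and w = 0, marking the irreversibility.
print(f"dS = {free_expansion_entropy(1.0, 1.0, 2.0):.3f} J/K")  # ~ +5.763 J/K
```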

Another example of irreversible change is the conversion of mechanical work into frictional heat: there is no way, by reversing the motion of a weight along a surface, to restore to the system the heat released due to friction.

Lost work

Reversible system

As stated above, according to the 2nd law of thermodynamics, the change in entropy is the heat transfer divided by the temperature, and for a process that can be reversed the combined entropy of the system and the environment remains constant: Sf = Si (reversible). In other words,

Qf/T = Qi/T

The system loses entropy Q/T and the surroundings gain entropy Q/T (or vice versa).

The total entropy change of system plus surroundings is zero.

The conclusion is that for a reversible process, no net entropy is produced; the entropy of the system plus the entropy of the surroundings remains unchanged:

dS_total = dS_system + dS_surroundings = 0

Irreversible system

In the irreversible process, the system receives heat δQ and does work δW. The change in internal energy for the irreversible process is

dU = δQ − δW [always true, by the first law] ------- [1]

[Note: Q and W are path functions, so δQ and δW are inexact differentials.]

For the reversible process,

dU = TdS − δW_rev -------- [2]

Since the internal energy U is a point function, it is the same in the two processes. Equating the changes in internal energy in expressions [1] and [2] gives

δQ_actual − δW_actual = TdS − δW_rev

TdS = δQ_actual − δW_actual + δW_rev

dS = δQ_actual/T + (1/T)[δW_rev − δW_actual]. If the process is not reversible, we get less work than in a reversible process, since δW_actual < δW_rev; so, for the irreversible process, dS > δQ_actual/T.
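Finally, a minimal numeric sketch of this 'lost work' bookkeeping in Python; all values are illustrative assumptions:

```python
def entropy_change(dq_actual: float, w_rev: float, w_actual: float,
                   t: float) -> float:
    """dS = dQ_actual/T + (W_rev - W_actual)/T; the second term is the
    entropy generated by irreversibility (the 'lost work' divided by T)."""
    return dq_actual / t + (w_rev - w_actual) / t

t = 300.0         # K, temperature at which heat is exchanged
dq = 500.0        # J, heat actually received by the system
w_rev = 200.0     # J, work a reversible path would have delivered
w_actual = 150.0  # J, work actually delivered (less than reversible)

ds = entropy_change(dq, w_rev, w_actual, t)
print(f"dS            = {ds:.4f} J/K")
print(f"dQ_actual / T = {dq / t:.4f} J/K  (dS exceeds this, as derived above)")
print(f"Lost work     = {w_rev - w_actual:.1f} J = T * dS_generated")
```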

