Entropy creation: Short review

Entropy change ΔS

The definition of ΔS is strictly valid only for reversible processes, such as that used in a Carnot engine. However, we can find ΔS precisely even for real, irreversible processes. The reason is that the entropy S of a system, like the internal energy U, depends only on the state of the system and not on how it reached that state. Entropy is a property of state. Thus, the change in entropy ΔS of a system between state 1 and state 2 is the same no matter how the change occurs, and it can be evaluated by computing it along any convenient reversible path connecting the same two states.
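
As a concrete illustration (this sketch and its numbers are mine, not from the article): for the irreversible free expansion of an ideal gas, ΔS can be computed along a reversible isothermal path between the same initial and final states, giving ΔS = nR ln(V2/V1). A minimal Python sketch:

```python
# Illustrative sketch (not from the article): the entropy change of an ideal gas
# in a free (irreversible) expansion is found by integrating dQ_rev/T along a
# reversible isothermal path between the same two states: dS = n*R*ln(V2/V1).
import math

R = 8.314  # universal gas constant, J/(mol*K)

def delta_S_ideal_gas_isothermal(n_mol: float, V1: float, V2: float) -> float:
    """Entropy change (J/K) of n_mol of ideal gas going from volume V1 to V2 at constant T."""
    return n_mol * R * math.log(V2 / V1)

# Example: 1 mol doubling its volume; the same ΔS applies whether the expansion
# is carried out reversibly or as an irreversible free expansion.
print(delta_S_ideal_gas_isothermal(1.0, 1.0, 2.0))  # ≈ +5.76 J/K
```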

This result, which has general validity, means that the total change in entropy for a system in any reversible process is zero. The entropy of various parts of the system may change, but the total change is zero. Furthermore, the system does not affect the entropy of its surroundings, since no heat is transferred between them. Thus, the reversible process changes neither the total entropy of the system nor the entropy of its surroundings. This is sometimes stated as follows: reversible processes do not affect the total entropy of the universe. Real processes are not reversible, though, and they do increase total entropy: entropy is generated whenever a process is irreversible.

When entropy increases


The entropy of a substance increases with its molecular weight, with the complexity of its molecules, and with temperature. It also increases as the pressure or concentration becomes smaller. Entropies of gases are much larger than those of condensed phases.
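
As a hedged illustration of these trends (assuming ideal-gas behaviour; the heat-capacity value below is assumed, not from the article), the ideal-gas relation ΔS = Cp ln(T2/T1) − R ln(P2/P1) shows entropy rising with temperature and falling with pressure:

```python
# Illustrative sketch (assumed ideal-gas behaviour, not from the article):
# for an ideal gas, dS = Cp*ln(T2/T1) - R*ln(P2/P1), so entropy rises with
# temperature and falls as pressure rises.
import math

R = 8.314  # universal gas constant, J/(mol*K)

def delta_S_ideal_gas(Cp: float, T1: float, T2: float, P1: float, P2: float) -> float:
    """Molar entropy change (J/(mol*K)) for an ideal gas between (T1, P1) and (T2, P2)."""
    return Cp * math.log(T2 / T1) - R * math.log(P2 / P1)

# Heating an air-like gas (Cp ~ 29 J/(mol*K), an assumed value) from 300 K to 600 K
# at constant pressure increases S; doubling the pressure at constant T decreases it.
print(delta_S_ideal_gas(29.0, 300.0, 600.0, 1e5, 1e5))  # ≈ +20.1 J/(mol*K)
print(delta_S_ideal_gas(29.0, 300.0, 300.0, 1e5, 2e5))  # ≈ -5.76 J/(mol*K)
```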

Boltzmann's principle

Ludwig Boltzmann defined entropy as a measure of the number of possible microscopic states (microstates) of a system in thermodynamic equilibrium. A simple example is a sample of gas in a container. The easily measured parameters volume, pressure, and temperature describe its macroscopic condition (state). At the microscopic level, the gas consists of a vast number of freely moving atoms or molecules that randomly collide with one another and with the walls of the container. The collisions with the walls produce the macroscopic pressure of the gas, which illustrates the connection between microscopic and macroscopic phenomena.

Boltzmann formulated a simple relationship between entropy and the number of possible microstates of a system, which is denoted by the symbol Ω. The entropy S is proportional to the natural logarithm of this number.

S = k_B ln Ω, where k_B is the Boltzmann constant.
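
A minimal sketch of Boltzmann's formula on a toy system (my own illustrative example, not from the article): N two-state particles with n of them "up" have Ω = C(N, n) microstates, and S = k_B ln Ω is largest for the most disordered macrostate:

```python
# Illustrative toy model (not from the article): for N two-state particles with
# n_up of them in the "up" state, the number of microstates is the binomial
# coefficient Omega = C(N, n_up), and Boltzmann's formula gives S = k_B * ln(Omega).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N: int, n_up: int) -> float:
    """Entropy (J/K) of a toy system of N two-state particles with n_up of them 'up'."""
    omega = math.comb(N, n_up)
    return K_B * math.log(omega)

# The entropy is largest for the most "disordered" macrostate (half up, half down)
# and zero for a perfectly ordered one (all up), where Omega = 1.
print(boltzmann_entropy(100, 50))   # maximal
print(boltzmann_entropy(100, 100))  # k_B * ln(1) = 0
```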

Entropy and the unavailability of energy to do work

What does a change in entropy mean, and why should we be interested in it? One reason is that entropy is directly related to the fact that not all of the heat transferred to a system can be converted into work, because heat generates disorder in a closed system.

The First Law of Thermodynamics states that in any process energy is conserved, and for a steady-state flow process

ΔH = Q − W     [1]

where H is enthalpy, Q is heat, and W is work.

A real process must comply with equation [1], but compliance does not guarantee that the process is actually feasible.

The Second Law of Thermodynamics says that energy transformations in which the total entropy is reduced are not possible. From the definition of entropy,

ΔS = Q/T     [2]

where S is entropy and T is the absolute temperature in kelvin; the equality holds for reversible heat transfer.

and by substituting equation [2], written as Q = TΔS, into equation [1],

W = TΔS − ΔH     [3]

Since equation [2] holds as an equality only for a reversible process, the maximum work available from a process is therefore

W_max = TΔS − ΔH

where W_max is the maximum available work. This is the amount of work or energy that can be obtained from a reversible steady-state flow process; irreversibilities reduce the work actually obtained below this value.
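
A minimal numerical sketch of equation [3] (the stream values below are hypothetical, chosen only to show how the formula is applied):

```python
# Minimal numerical sketch of equation [3] with made-up (hypothetical) numbers,
# just to show how the maximum available work is evaluated: W_max = T*dS - dH.

def max_available_work(T: float, delta_S: float, delta_H: float) -> float:
    """Maximum work (J) obtainable from a reversible steady-state flow process."""
    return T * delta_S - delta_H

# Hypothetical stream being cooled: dH = -50 kJ, dS = -100 J/K, surroundings at 300 K.
T = 300.0          # K
delta_H = -50e3    # J
delta_S = -100.0   # J/K
print(max_available_work(T, delta_S, delta_H))  # 300*(-100) - (-50000) = 20000 J obtainable
```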

Thermal energy

Heat is a form of disorganized energy

Energy comes in two forms: organized and disorganized. Organized energy has the potential to be put to use (it is stored); disorganized energy is the random energy of motion, and heat is energy in that disorganized form. Imagine that you have an engine running. The engine is spinning a shaft, producing work, so the output is in the form of mechanical energy. You can drive a generator to produce electricity; you can do quite a lot with mechanical energy. The engine also requires cooling so that it does not overheat, so some of the energy supplied with the fuel is converted into heat. What can you do with the warm cooling water? This heat is practically a waste. You cannot convert it back into mechanical energy to drive your generator or shaft; you can only use it for heating and not much else. It seems, then, that mechanical and electrical energy are of more worth to us than heat, even though, measured purely as quantities of energy, they are equal.

Mechanical energy

Mechanical energy is the sum of potential energy and kinetic energy. It is the macroscopic energy associated with a system. If frictional forces are of negligible magnitude, the mechanical energy changes little, and its conservation is a useful approximation.
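
For completeness, a trivial sketch of this sum (the numbers are illustrative, not from the article):

```python
# Minimal illustration (not from the article): macroscopic mechanical energy as
# the sum of kinetic and gravitational potential energy, E = 0.5*m*v**2 + m*g*h.

G = 9.81  # gravitational acceleration, m/s^2

def mechanical_energy(m: float, v: float, h: float) -> float:
    """Mechanical energy (J) of a mass m (kg) moving at speed v (m/s) at height h (m)."""
    return 0.5 * m * v**2 + m * G * h

# A 2 kg mass moving at 3 m/s at a height of 5 m: 9 J kinetic + 98.1 J potential.
print(mechanical_energy(2.0, 3.0, 5.0))  # ≈ 107.1 J
```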

Electrical energy

At the power station, electricity is generated as work from a heat engine. Work is entropy-free, so we have an entropy-free electron gas at the point of generation.

However, a thermodynamic gas will always equilibrate over the available degrees of freedom; in this case these are the electronic states of the conductor in the transmission wire. There will be a distribution of microstates that make up the observed macrostate of the electron gas, and this defines the entropy of the electric current. As mentioned earlier, irreversible processes cause energy losses and thus further increase the entropy and lower the voltage of the electron gas. These losses are proportional to the length of the conductor, but they are secondary: the electric current has an intrinsic entropy defined by the electronic states of the conductor.

To conclude, it looks like mechanical and electrical energy are a more ordered form of energy than thermal energy.

Reuben Abraham

Technical Lead - Product Engineering at Trane Technologies

3y

Thank you for the riveting post. I have attached this beautiful curve that was generated when plotting the entropy on the Y axis and the corresponding pressure on the X axis, at saturated conditions of ammonia [R717]. Observations: 1) The entropy of saturated gas reduces with pressure, as pointed out in your article. 2) The entropy of the liquid increases with pressure. 3) As the liquid undergoes phase change isobarically, there is a rise in entropy. Would love to know your thoughts on points 2 and 3 as you had already touched upon point 1. Source: Coolpack Software, Dept. of Energy Engineering, Technical University of Denmark.

Nikhilesh Mukherjee

Consultant and author of two books

3y

An important point often misunderstood: entropy is created only when thermal energy is converted to mechanical work. Hot water flowing through a pipe can lose heat by radiation from the pipe; that is simply a radiation loss. In the case of water, since it is incompressible, the major entropy creation takes place only at phase change. Entropy loss is very specific: it is the loss of energy that could not be converted to work.

Thermal energy to work energy: imagine you want thermal energy to do mechanical work by pushing a piston, so you put in some thermal energy. Not all of that thermal energy is available to do work; some energy [KE] will be lost as disordered molecules go in all directions and hit the barrel wall. This is entropy.

Work energy to thermal energy: imagine adiabatic compression of the gas. You are adding work energy to the gas, and that energy goes to increase the internal energy of the gas. It is an internal heat transfer, with no entropy losses.

Thus when thermal energy converts to work energy there is an entropy loss, but the reverse is not true.

Sandro Balestrino

Information Technology Network Engineer - CCNA

3y

Nice review. One small but important addition I would make to round out the article is to include the Clausius inequality, the cyclic integral ∮ δQ/T ≤ 0.
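
(A hypothetical numerical check of that inequality, not part of the original comment: for a reversible Carnot cycle the cyclic sum of Q/T over the two reservoirs is zero, while a less efficient engine drawing the same heat gives a negative value.)

```python
# Hypothetical check of the Clausius inequality for a simple two-reservoir cycle:
# the sum of Q/T around the cycle is 0 for a reversible (Carnot) engine and
# negative for an irreversible engine that rejects more heat at the cold reservoir.
T_hot, T_cold = 600.0, 300.0  # reservoir temperatures, K (assumed values)
Q_hot = 1000.0                # heat absorbed from the hot reservoir, J (assumed)

Q_cold_reversible = Q_hot * T_cold / T_hot  # Carnot engine rejects 500 J
Q_cold_irreversible = 700.0                 # assumed less efficient engine

for Q_cold in (Q_cold_reversible, Q_cold_irreversible):
    cyclic_sum = Q_hot / T_hot - Q_cold / T_cold  # heat absorbed is +, heat rejected is -
    print(cyclic_sum)  # 0.0 for the reversible cycle, < 0 for the irreversible one
```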

Hussain Shaik

Power Platform Developer

3y

Sir, can you please give two examples where we can apply it, or where we can observe it, other than the Carnot engine?
