The Nameless One


2 posts in this topic


Those who do not agree with that will likely say that entropy has nothing to do with disorder (chaos), but only with dispersion of energy. Personally, I would say it is the probability of finding a certain object at a certain location at a certain moment in time: the more predictable, the more order, and vice versa. As matter is equivalent to energy (Einstein), one can thus say that disorder (the amount of chaos) is dispersion of energy. Nevertheless, under this definition only changes of entropy can be considered, because what would "absolute", "total" order (entropy = 0) be?

If we consider the Universe as a whole, we could say that the initial Big Bang had zero entropy, and so the entropy of the Universe today could be given an absolute value (if we know the total amount of energy of the Universe and can assign it a temperature - do we? can we?). If we instead consider a system, say a bottle of water, what would its absolute zero-entropy state be - frozen to ice at zero Kelvin, where matter as we know it cannot exist?

We can assign any state zero entropy and give only deviations from that state a certain entropy value. For example, in steam tables saturated water at zero degrees Celsius (32 °F) is given zero entropy, because it is also given zero enthalpy. I have never seen entropy values for ice, though an absolute entropy scale would definitely require them. This also clearly indicates that entropy is not a thermodynamic dimension, but a mere quantitative notion.
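The arbitrariness of the reference state can be sketched numerically. A minimal Python illustration (the entropy values below are made up for the example, not taken from real steam tables): shifting the zero point adds the same constant to every tabulated value, so all entropy differences stay identical.

```python
# Hypothetical specific entropies for three states, kJ/(kg*K),
# tabulated against reference zero point A:
s_ref_a = [0.000, 0.296, 0.703]
offset = 0.150                            # some other arbitrary choice of zero
s_ref_b = [s + offset for s in s_ref_a]   # same states, reference zero point B

# Differences between states are identical under both references:
diffs_a = [s_ref_a[i + 1] - s_ref_a[i] for i in range(2)]
diffs_b = [s_ref_b[i + 1] - s_ref_b[i] for i in range(2)]
# Only CHANGES of entropy are physically meaningful.
```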

If we gave crude oil (as it is pumped up) a zero-entropy reference, we could calculate a finite value for the CHANGE OF entropy of the combustion gases, which would then have a certain entropy. This would be a very complicated procedure, so we would rather give the condensation condition of the gases at atmospheric pressure zero entropy (and argue with someone from Mars about the "correct" value of it). Moreover, the thermodynamic dimensions of the combustion gases (pressure, temperature, volume) - in other words, their internal energies - are in no way affected by how we calculate the entropy or what values we assign it, and there would be no arguing with the guy from Mars about this either. So how "real", then, is entropy?

The confusion about entropy is likely of verbal origin, because we say that a system HAS a certain entropy. If Clausius hadn't introduced the word 'entropy' at the time, he/we would have said that a system HAS a certain disorder (or energy dispersion, or energy density), and then there would have been no confusion (as Shakespeare said: "What's in a name?").

Let's start from the beginning: The First Law of Thermodynamics:

Instead of saying that it is the conservation of energy, a more fundamental definition would be:

"The internal energy of a system depends on its condition only."

In other words: "Each possible condition of a system is characterized by a very distinct internal energy." If it were not so, we could create a perpetuum mobile of the first kind: add internal energy to bring a system into one condition, then bring it back to its original condition while taking out more energy than we added.

The internal energy of a system is defined as the sum of the kinetic energies of all molecules in that system (giving it its temperature). Normally we do not need to calculate the internal energy from the microscopic conditions, but can use simpler formulas, the most fundamental one being: DU = Q - W

In this equation Q is the amount of heat applied to a system. Part of it is used to increase the internal energy by the amount DU, and the rest leaves the system again (negative sign) in the form of mechanical work W that it does on one or more of its boundaries - moving a piston, for example - by which the volume increases. If the volume instead decreases, meaning heat AND work were applied (positive sign), the formula reads: DU = Q + W. If no work is involved, then DU = ± Q (+Q is applied, -Q is removed heat).
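The first-law bookkeeping above can be sketched in a few lines of Python. The function name and the numbers are illustrative only; the sign convention is the one from the text (Q positive when heat is applied, W positive when the system does work on its boundaries):

```python
def internal_energy_change(Q, W):
    """First law: DU = Q - W, with Q the heat applied to the system
    and W the mechanical work done BY the system on its boundaries."""
    return Q - W

# 500 J of heat applied, 200 J leaving as piston work:
dU = internal_energy_change(Q=500.0, W=200.0)        # DU = 300 J

# Compression: work done ON the system enters as negative W,
# which reproduces the text's DU = Q + W form:
dU_compress = internal_energy_change(Q=100.0, W=-50.0)   # DU = 150 J
```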

Entropy is very often misunderstood when the change of entropy is defined as the change of internal energy divided by a certain temperature. In doing so, one arrives at the corresponding, but wrong, definition: "Each possible change of a system's entropy is characterized by a very distinct change of its internal energy."

In this way entropy is related to the first, instead of to the second, law of thermodynamics (see below). The error is very understandable though, because as with internal energy, the change of entropy depends only on the corresponding start and end conditions of a system, not on its history. Changing a system from one condition to another gives the same change of entropy (with opposite sign) as bringing it back to the original condition. Thus we could indeed talk about a "conservation of entropy". However, entropy is not determined by temperature alone, whereas the internal energy (of an inert gas) is. When bringing the system into different conditions at constant temperature, the entropy changes, but the internal energy does not! Thus, there is no distinct relationship between changes of entropy and changes of internal energy.

There is great confusion caused by some saying that ANY process can be treated reversibly, by taking infinitesimal (differential) changes in entropy, dS = dQ/T, and then calculating the total change of entropy DS by integrating this from T1 to T2, the end-state temperatures of the process. The confusion lies in mixing up the concepts of "infinitesimal" and "reversible".

Indeed, infinitesimal changes - quasistatic changes of condition, as they often are called - are reversible, because infinitesimally small changes in energy can let the process go in either direction. A truly reversible process, however, allows the same for the total change of energy, which an irreversible one does not.

Consider a typical irreversible process, such as cooling and heating of a gas. No matter how it is cooled, it is impossible to heat it again to the original higher temperature by recirculating the same energy that was cooled off at a lower sink temperature. Even IF it could in practice be done quasistatically, the sink temperature would still be lower than the original highest temperature of the system - one has to apply additional energy to heat the system again, and this is what makes the process irreversible, even though it is done quasistatically! Because irreversible processes thus always require an external energy source to be reverted, they always bring about an increase of entropy of the environment. From this we can define reversible and irreversible processes as follows:

A reversible process can be reverted by recirculating the same energy, leaving the entropy of the environment unchanged; an irreversible process can only be reverted with additional external energy and therefore increases the entropy of the environment.
Expressed in terms of entropy, this also complies with the Second Law of Thermodynamics:

(Note that it is not possible to express the First Law in terms of entropy.)

"For any process by which a thermodynamic system is in interaction with the environment, the total change of entropy of system and environment can never be negative. If only reversible processes occur, the total change of entropy is zero; if irreversible processes occur as well, then it is positive."

Let's illustrate this with the following example in a TS-diagram, where we want to calculate the change of entropy for a change of condition of a system (a gas). (Mind that Vo < V1 < V2. Normally the TS-diagram is used with similar curves for constant pressure, and those count upwards for higher values.) Originally the system was at the coordinates T1V1 and was compressed by some unknown process to the present condition T2Vo. How can we calculate the change of entropy of this by using dS = dQ/T and integrating afterwards? The change of internal energy, DU = m.Cv.(T2-T1), is the (granulated) area under the curve Vo and between the lines for S2 and T1. However, if we cool the gas down to T1 at the constant volume Vo, we are not back at the original coordinates T1V1. Hence, the change of entropy cannot be the differentially calculated m.Cv.ln(T2/T1).

The only way to correctly calculate the change of entropy is by first expanding adiabatically to the original temperature T1, and then compressing isothermally to the original volume V1. The total change of entropy for the gas then becomes: DS = m.R.ln(V2/V1). As this occurs between the same isentropes S1 and S2 as the previous unknown process from T1V1 to T2Vo, the corresponding change of entropy is the same.
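This two-step method can be sketched in Python for an ideal gas. The function name and the numeric values (1 kg of air, R = 287 J/(kg·K), k = 1.4) are assumptions for illustration; the symbols follow the text:

```python
import math

def entropy_change_two_step(m, R, k, T1, V1, T2, V0):
    """Entropy change of an ideal gas from (T1, V1) to (T2, V0),
    computed via the two reversible steps from the text:
    an adiabatic (isentropic, dS = 0) leg between T2 and T1,
    then an isothermal leg at T1 between the reached volume V2 and V1."""
    # Adiabatic T-V relation gives the intermediate volume V2:
    V2 = V0 * (T2 / T1) ** (1.0 / (k - 1.0))
    # The isothermal leg carries the entire entropy change:
    return m * R * math.log(V2 / V1)

# Example: compression from (300 K, 1 m^3) to (400 K, 0.5 m^3):
dS = entropy_change_two_step(m=1.0, R=287.0, k=1.4,
                             T1=300.0, V1=1.0, T2=400.0, V0=0.5)
```

For an ideal gas this agrees with the general relation DS = m.Cv.ln(T2/T1) + m.R.ln(Vo/V1), since Cv = R/(k-1).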

We can make a simple analysis to compare this step method with the 'reversible' differential one, for a change of condition at constant volume, from coordinates T1Vo to T2Vo. In that case DQ = DU, and we can write:

reversible differential: DS = m.Cv.ln(T2/T1)

reversible isothermal: DS = m.R.ln(V2/Vo)

If we set these two equations equal and work it out, we get:

T2/T1 = (V2/Vo)^(R/Cv). For k = Cp/Cv and R = Cp - Cv:

T2/T1 = (V2/Vo)^(k-1), which is the correct adiabatic relationship between volumes and temperatures, so the two equations are indeed equal... but only in this case!
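This equality is easy to check numerically. A minimal Python sketch (the property values for dry air are assumptions for illustration): the two formulas agree exactly when T2 and V2 are tied by the adiabatic relation, because Cv.(k-1) = R.

```python
import math

# Assumed property values for dry air, SI units:
m, Cp, Cv = 1.0, 1005.0, 718.0      # kg, J/(kg*K)
R, k = Cp - Cv, Cp / Cv             # R = 287 J/(kg*K), k ~ 1.4

T1, V0, V2 = 300.0, 1.0, 2.0
T2 = T1 * (V2 / V0) ** (k - 1.0)    # adiabatic relation T2/T1 = (V2/V0)^(k-1)

dS_differential = m * Cv * math.log(T2 / T1)   # "reversible differential"
dS_isothermal   = m * R  * math.log(V2 / V0)   # "reversible isothermal"
# The two agree here only because T2 obeys the adiabatic relation.
```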

Evidently, there is an infinite number of other possible processes between the same isentropes, starting at T1Vo, that are NOT at constant volume, all having different values for T2, but still having the same change of entropy as the one at constant volume - and so the equality is gone. From this we see clearly that the differential method, wrongly perceived as the "reversible" method, is only valid for changes at constant volume.

Even more important is to realize that all these alternative processes come about with different amounts of energy applied to the system, part of which is wasted to the environment (irreversible processes). So we see that the change of entropy for a system (being the same for all, whereas the change of temperature is not) is not energy-specific and thus NOT significant for the thermodynamic condition of a system. It is not a property of a system's thermodynamic condition, as internal energy is (naturally, because then entropy would be related to the first law instead of the second).

Read this again, because even many experts in the field think otherwise. They erroneously say that entropy has nothing to do with disorder (chaos), because disorder is not a property of a system's thermodynamic condition, as they believe entropy to be (compare shuffled cards).

Moreover, we should be aware that the change of internal energy for ALL processes is m.Cv.DT, even if the volume is not constant (see the TS-diagram above), but only as long as inert gases are concerned. For gases close to their condensation point (steam, saturated vapors, etc.) it is no longer valid, except at constant volume. Hence, the "reversible" differential method should not be applied in general - it could easily lead to miscalculations.

What, then, is the significance of entropy? We can best see this by considering the change of condition at irreversible free expansion. At free expansion, a gas reservoir at a certain pressure is connected to another, evacuated reservoir. The gas then expands and fills both reservoirs at a lower pressure - halved, if the volume was doubled. If the whole system is thermally insulated from the environment, no heat is exchanged with it and no work is done by the gas, so its temperature and internal energy remain unchanged. Q/T is unchanged, and one would think that the entropy would be unchanged as well.

We once again see the importance of reversible processes, because if we apply a reversible isothermal expansion (heat must be applied to compensate for the mechanical work done on another system) between the same thermodynamic end coordinates, the calculated increase of entropy becomes m.R.ln(V2/V1). As the end coordinates are the same as those of the equivalent free expansion under consideration, the latter must have had the same increase of entropy! We see that even when the internal energy of a system doesn't change, the entropy can still change. The opposite is also true when considering adiabatic (isentropic) changes of condition: the internal energy changes, but the entropy doesn't.
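The free-expansion argument can be put in a short Python sketch (function name and numbers are illustrative; 1 kg of air with R = 287 J/(kg·K) is assumed): Q = 0 and W = 0, so the internal energy of an ideal gas is unchanged, yet the entropy rises by the amount found along the reversible isothermal path between the same end coordinates.

```python
import math

def free_expansion_entropy(m, R, V_initial, V_final):
    """Entropy increase for irreversible free expansion of an ideal gas,
    evaluated along the reversible isothermal path between the same
    end coordinates: dS = m * R * ln(V_final / V_initial)."""
    return m * R * math.log(V_final / V_initial)

# Doubling the volume of 1 kg of air:
dS = free_expansion_entropy(m=1.0, R=287.0, V_initial=1.0, V_final=2.0)
# dS is positive even though DU = 0 for the free expansion itself.
```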

We can even go a step further by considering repeated polytropic compression and expansion of a gas between two temperatures. These are processes between the adiabatic (isentropic) and the isothermal. In the ideal case (no irreversibilities), it would be possible to compress and expand along the same (straight) curve in the TS-diagram. This means that during compression the same amount of heat is given off as is absorbed during expansion - no drive power is needed to keep the sequence going! The system's change of entropy is as negative during compression as it is positive during expansion - the total change of entropy (system AND environment) is then zero. A corresponding (hypothetical) device, with a hot and a cold spot, would in fact be a perpetual machine that lets heat spontaneously flow from a colder to a warmer region!

Even though we cannot make such a machine in practice, it would be possible to build a machine (the COMPEX cycle machine) with a higher COP than what the Carnot Rule allows. The Carnot Rule is NOT a physical limit, but hitherto a purely technological one!


Considering free expansion: in what way does the condition of the gas in the expanded state differ from the compressed state? The only difference is the increased freedom of the gas molecules to move around at the same speed (unchanged kinetic energy) in the larger expanded volume... which is increased disorder (a lower probability of finding a certain molecule at a certain location at a certain moment in time). Some object that the aforementioned difference is a dispersion of (kinetic) energy in a larger volume, which in itself has nothing to do with disorder. They see this dispersion of energy as a thermodynamic change of state of the system. Sorry, but that is comparable to saying that the health condition of a certain number of drunk persons would change if they were brought from a smaller into a larger room, where they would have increased freedom to "stumble" around.


From this we can conclude that entropy can be defined as dispersion of energy only for irreversible processes, which explains why the entropy increases for irreversible free expansion. For reversible processes there is no dispersion of energy, and so only the amount of disorder provides the definition.

Are there two kinds of entropy then? The question itself demonstrates that entropy is not "real" and, in general, regardless of whether energy is dispersed or not, must be seen as a measure of the amount of disorder.

In addition, the definition of disorder must be the previously mentioned probability of correctly predicting the location of a certain item at a certain moment in time - it is the exact meaning of the notion "chaos". Hence, even without an in-depth scientific analysis of Boltzmann's theories, we can in this simple way conclude that there is a direct connection between probability and chaos = entropy, as expressed by Boltzmann: S = k.ln(W)
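Boltzmann's formula can be sketched in Python (the function name is illustrative; the factor 2**N for a volume doubling is the standard positional-microstate counting argument, assumed here for illustration):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k.ln(W): entropy of a macrostate with W equally likely microstates."""
    return k_B * math.log(W)

# Doubling the volume doubles each molecule's positional choices, so for
# N molecules W grows by a factor 2**N and S by N * k_B * ln(2) -
# the statistical counterpart of dS = m.R.ln(V2/V1) for a volume doubling.
N = 50
dS = boltzmann_entropy(2.0 ** N)   # relative to W = 1, where S = 0
```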

What practical meaning does all this have? Well, suppose that we want to revert an irreversible process, like recompressing a gas after free expansion. We would need a compressor for that, powered by an external energy source. Its developed work appears as heat, which must be cooled off to bring the gas back to its original temperature. There is no way to reuse that heat to power the compressor for the next cycle, and so we simply lost that energy. Compare that with reversible, isothermal expansion, where heat was applied to keep the temperature constant during expansion. The corresponding energy was given off as work to, for example, a piston that could set a flywheel in rotation - the energy is conserved there. After full expansion, the flywheel continues rotating by its inertia and recompresses the gas again. The corresponding energy is cooled off (quasistatically), and so the system returns to the original state (in the loss-free case)! All we need is a reservoir that receives and gives off the same amount of heat at constant temperature (it can be the atmosphere), and the process can be repeated indefinitely and spontaneously (no total change of entropy). Hence, in a practical case, the engineer would try to avoid irreversible free expansion and bring about a similar process reversibly. This, though, is not enough; he must also create a probability for the intended process to occur (perfect heat-transfer conditions, for example). If he fails there, the system's efficiency becomes so low that he might yet use irreversible free expansion instead, because it is technically the simplest way to go.

A good engineer should be able to judge these things prior to choosing a design concept, and therefore he/she must have an understanding of entropy - a great lack in present engineering education! (The costly developments of Stirling and Wankel machines, as well as of commercial solar and wind power, are sad examples of NOT understanding entropy.)

However, that the total change of entropy can never become negative isn't absolutely imperative. Since the definition of entropy is a function of the probability of a certain disorder, according to Boltzmann's S = k.ln(W), it is "only" extremely likely. If we consider gas molecules moving around at random in a certain closed volume, it is extremely unlikely that at a certain moment they would all be situated in one half of that volume, but it is not absolutely impossible (even though we may have to wait a few million years to see it happen). On the other hand, if the volume is very large and there are only a few molecules moving around in it... if it happens, there would be a spontaneous negative change of entropy.
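The "extremely unlikely, not impossible" point is just a power of one half. A minimal sketch (function name illustrative), assuming the molecules move independently:

```python
# Probability that all N independently moving molecules happen to sit
# in one chosen half of the volume at a given instant: (1/2)**N.
def prob_all_in_one_half(N):
    return 0.5 ** N

p_few = prob_all_in_one_half(4)      # 0.0625 - quite plausible for a few molecules
p_many = prob_all_in_one_half(1000)  # astronomically small for macroscopic counts
```

For a handful of molecules a spontaneous entropy decrease is observable; for anything near Avogadro-scale counts the probability is vanishingly small, which is why the Second Law holds in practice.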

The Nameless One

No Hope=No Fear


*gives The Nameless One a button*

Congrats! You have got a vocabulary and a mind!
