Entropy

From Thermal-FluidsPedia

Thermodynamics and statistical mechanics

There are two related definitions of entropy: the thermodynamic definition and the statistical mechanics definition. The thermodynamic definition was developed in the early 1850s by Rudolf Clausius and essentially describes how to measure the entropy of an isolated system in thermodynamic equilibrium; importantly, it makes no reference to the microscopic nature of matter. The statistical definition was developed by Ludwig Boltzmann in the 1870s by analyzing the statistical behavior of the microscopic components of the system. Boltzmann went on to show that this definition of entropy is equivalent to the thermodynamic entropy to within a constant factor, which has since been known as Boltzmann's constant. In summary, the thermodynamic definition provides the experimental definition of entropy, while the statistical definition extends the concept, providing an explanation and a deeper understanding of its nature.

Thermodynamic entropy is a non-conserved state function that is of great importance in physics and chemistry.[6][7] Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not; systems tend to progress in the direction of increasing entropy.[8] Entropy is thus a measure of a system's tendency towards spontaneous change.[8][9] For isolated systems, entropy never decreases.[7] This fact has several important consequences in science: first, it prohibits "perpetual motion" machines; and second, it suggests an arrow of time. Increases in entropy correspond to irreversible changes in a system, because some energy must be expended as waste heat, limiting the amount of work a system can do.[6][10][11][12][13]

In statistical mechanics, entropy is essentially a measure of the number of ways in which a system may be arranged, often taken to be a measure of "disorder" (the higher the entropy, the higher the disorder).[6][11][12][14][15][16][17] This definition describes the entropy as a measure of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which would give rise to the observed macroscopic state (macrostate) of the system.

An everyday analogy to entropy can be demonstrated by mixing salt and pepper in a bag. Separate clusters of salt and pepper will tend to progress to a mixture if the bag is shaken. Furthermore, this example demonstrates how a process can be thermodynamically irreversible: the separation of the mixture back into distinct salt and pepper clusters via the random process of shaking is statistically improbable and practically impossible, because the mixture has a high amount of disorder. This is rendered in popular language by the saying "you can turn an aquarium into fish soup, but you can never turn the fish soup back into an aquarium": once a certain threshold has been passed, the effect of entropy becomes irrevocable.
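
The statistical character of this irreversibility can be illustrated numerically. The following is a minimal Python sketch (not part of the original discussion; the grain counts, the number of shakes, and the "mixedness" score are arbitrary illustrative choices): starting from fully separated salt and pepper grains, random pairwise swaps (the analogue of shaking) drive the arrangement toward a mixed state and essentially never back.

import random

# Illustrative setup (hypothetical numbers): 50 salt grains (0) followed by
# 50 pepper grains (1), i.e. a fully separated initial arrangement.
grains = [0] * 50 + [1] * 50

def mixedness(g):
    # Pepper grains in the left half plus salt grains in the right half:
    # 0 for the fully separated start, about 50 once thoroughly mixed.
    half = len(g) // 2
    return g[:half].count(1) + g[half:].count(0)

random.seed(0)
for shake in range(10001):
    i, j = random.randrange(len(grains)), random.randrange(len(grains))
    grains[i], grains[j] = grains[j], grains[i]   # one random "shake"
    if shake % 2000 == 0:
        print(shake, mixedness(grains))

# The score climbs toward ~50 and then fluctuates around it; a spontaneous
# return to the separated arrangement is never observed on this timescale.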

Entropy and the Second Law

The second law of thermodynamics states that in general the total entropy of any system will not decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system will tend not to decrease. It follows that heat will not flow from a colder body to a hotter body without the application of work (the imposition of order) to the colder body. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires flow of heat from a hotter reservoir to a colder reservoir. As a result, there is no possibility of a "perpetual motion" system. Finally, it follows that a reduction in the increase of entropy in a specified process, such as a chemical reaction, means that it is energetically more efficient.

It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air in that system. The heat expelled from the room (the system), involved in the operation of the air conditioner, will always make a bigger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.
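
As an illustrative calculation (the heat quantity and temperatures below are assumed for the example), suppose an air conditioner removes Q_C = 1000 J from a room held at T_room = 290 K and rejects heat into outside air at T_env = 300 K, using work W supplied electrically. The entropy bookkeeping reads

\Delta S_{total} = \Delta S_{room} + \Delta S_{env} = -\frac{Q_C}{T_{room}} + \frac{Q_C + W}{T_{env}} \ge 0

With these figures ΔS_room ≈ −3.45 J K−1, while even an ideal (reversible) device must expel Q_C + W ≈ 1034 J, so ΔS_env ≈ +3.45 J K−1; any real air conditioner expels more and makes the total strictly positive.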

The second law, in conjunction with the fundamental thermodynamic relation, places limits on a system's ability to do useful work. The entropy change of a system at temperature T absorbing an infinitesimal amount of heat δQ in a reversible way is dS = δQ/T. More explicitly, an amount of energy T_R S is not available to do useful work, where T_R is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.
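
For instance (a standard textbook relation, quoted here as an illustration), if heat Q is drawn from a reservoir at temperature T and a Carnot engine rejects its waste heat to the coldest accessible reservoir at T_R, the work delivered is at most

W_{max} = Q\left(1 - \frac{T_R}{T}\right), \qquad Q - W_{max} = T_R \frac{Q}{T} = T_R \Delta S

so the unavailable portion of the energy is exactly T_R ΔS, where ΔS = Q/T is the entropy withdrawn from the hot reservoir.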

Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such an event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system.[1]

Statistical thermodynamics

Statistical mechanics views entropy as the amount of uncertainty (or "mixedupness" in the phrase of Gibbs) which remains about a system, after its observable macroscopic properties (such as temperature, pressure and volume) have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system including the position and velocity of every molecule. The more such states are available to the system with appreciable probability, the greater the entropy.

More specifically, entropy is a logarithmic measure of the density of states:

S = -k_B \sum_i P_i \ln P_i

where k_B = 1.38065 × 10−23 J K−1 is the Boltzmann constant, the summation is over all the microstates the system can be in, and the P_i are the probabilities for the system to be in the i-th microstate. For almost all practical purposes, this can be taken as the fundamental definition of entropy since all other formulas for S can be mathematically derived from it, but not vice versa. (In some rare and recondite situations, a generalization of this formula may be needed to account for quantum coherence effects, but in any situation where a classical notion of probability makes sense, the above is the entropy.)
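
The formula is straightforward to evaluate numerically. The following minimal Python sketch (the two probability distributions are arbitrary illustrative choices) computes the Gibbs entropy of a set of microstate probabilities and shows that a uniform distribution over Ω microstates reproduces k_B ln Ω, while a more sharply peaked distribution gives a smaller entropy:

import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probs):
    # S = -k_B * sum_i P_i ln P_i over a discrete set of microstate probabilities
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # 4 equally likely microstates
peaked = [0.7, 0.1, 0.1, 0.1]        # same microstates, one strongly favoured

print(gibbs_entropy(uniform))   # ~1.91e-23 J/K
print(K_B * math.log(4))        # k_B ln(Omega) with Omega = 4: the same value
print(gibbs_entropy(peaked))    # smaller: the probability is less spread out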

In what has been called "the most famous equation of statistical thermodynamics", the entropy of a system in which all states, of number Ω, are equally likely, is given by

S = k_B \ln \Omega.

In thermodynamics, such a system is one in which the volume, number of molecules, and internal energy are fixed (the microcanonical ensemble). The entropy is expressed in units of J·K−1.
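
The Boltzmann expression follows directly from the more general formula above: when all Ω microstates are equally probable, P_i = 1/Ω for every i, and

S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = -k_B \ln \frac{1}{\Omega} = k_B \ln \Omega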

In essence, the most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system.[20] This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.

The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has very deep implications: if two observers use different sets of macroscopic variables, then they will observe different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment, otherwise one might see decreasing entropy![21]

Classical thermodynamics

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy[citation needed]. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least T_R ΔS of that energy must be given up to the system's surroundings as unusable heat (T_R is the temperature of the system's external surroundings). Otherwise the process will not go forward. In classical thermodynamics, the entropy of a system is defined only if it is in thermodynamic equilibrium.

Clausius stated the mathematical expression of this theorem as follows. Let δQ be an element of the heat given up by the body to any reservoir of heat during its own changes (heat which it may absorb from a reservoir being here reckoned as negative), and T the absolute temperature of the body at the moment of giving up this heat; then the equation:

\oint \frac{\delta Q}{T} \ge 0

must hold good for every cyclical process which is in any way possible, with equality holding for any reversible cyclical process.

This is the essential formulation of the second law and one of the original forms of the concept of entropy. It can be seen that the dimensions of entropy are energy divided by temperature, which is the same as the dimensions of Boltzmann's constant (k_B) and heat capacity. The SI unit of entropy is the joule per kelvin (J K−1). In this manner, the quantity TΔS is treated as a form of energy, accounting for the effects of irreversibility, in the energy balance equation for any given system. In the Gibbs free energy equation, ΔG = ΔH − TΔS, for example, which is a formula commonly utilized to determine if chemical reactions will occur spontaneously, the free energy related to entropy changes, TΔS, is subtracted from the "total" system enthalpy ΔH to give the "free" energy ΔG of the system.
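
As a numerical illustration (the values of ΔH and ΔS here are hypothetical, chosen only to show the bookkeeping), consider a reaction at T = 298 K with ΔH = −100 kJ and ΔS = −200 J K−1:

\Delta G = \Delta H - T\Delta S = -100\,\mathrm{kJ} - (298\,\mathrm{K})(-0.200\,\mathrm{kJ\,K^{-1}}) = -40.4\,\mathrm{kJ}

Since ΔG < 0 the reaction can proceed spontaneously despite its unfavourable entropy term; above roughly T = ΔH/ΔS = 500 K the sign of ΔG would reverse.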

In a thermodynamic system, pressure, density, and temperature tend to become uniform over time because this equilibrium state has higher probability (more possible combinations of microstates) than any other. In the ice melting example, the difference in temperature between a warm room (the surroundings) and a cold glass of ice and water (the system, which is not part of the room) begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water.

Over time the temperature of the glass and its contents and the temperature of the room become equal. The entropy of the room has decreased as some of its energy has been dispersed to the ice and water. However, as calculated in the example, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the "universe" of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
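
A rough calculation makes this concrete (the figures are illustrative; the latent heat of fusion of ice is about 6.01 kJ per mole). If Q = 6010 J flows from the room at 298 K into ice melting at 273 K, then

\Delta S_{room} = -\frac{6010\,\mathrm{J}}{298\,\mathrm{K}} \approx -20.2\,\mathrm{J\,K^{-1}}, \qquad \Delta S_{ice} = +\frac{6010\,\mathrm{J}}{273\,\mathrm{K}} \approx +22.0\,\mathrm{J\,K^{-1}}

so the combined change is about +1.8 J K−1: the entropy gained by the colder system outweighs the entropy lost by the warmer surroundings.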

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work; the entropy change will be entirely due to the mixing of the different substances. At a statistical mechanical level, this results from the change in available volume per particle with mixing.[22]
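
For two ideal gases (or an ideal solution) mixed at the same temperature and pressure, the standard result, quoted here for illustration, is

\Delta S_{mix} = -nR\left(x_A \ln x_A + x_B \ln x_B\right)

where n is the total amount of substance, R is the gas constant, and x_A and x_B are the mole fractions; for an equimolar mixture this reduces to nR ln 2, about 5.8 J K−1 per mole of mixture.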

Entropy versus heat and temperature

Loosely speaking, when a system's energy is divided into its "useful" energy (energy that can be used, for example, to push a piston), and its "useless energy" (that energy which cannot be used to do external work), then entropy can be used to estimate the "useless", "stray", or "lost" energy, which depends on the entropy of the system and the absolute temperature of the surroundings. As the "useful" and "useless" energy both depend on the surroundings, neither one is a function of the state of the system, and both can be quite tricky to quantify. This stands in contrast to the system's Gibbs free energy (for isobaric processes), Helmholtz free energy, entropy, and temperature, all of which are well-defined functions of state. The Gibbs and Helmholtz free energies depend on the temperature of the system (not the surroundings), and do not purport to measure the "useful" energy.

When heat is added to a system at high temperature, the increase in entropy is small. When heat is added to a system at low temperature, the increase in entropy is great. This can be quantified as follows: in thermal systems, changes in the entropy can be ascertained by observing the temperature while observing changes in energy. This is restricted to situations where thermal conduction is the only form of energy transfer (in contrast to frictional heating and other dissipative processes). It is further restricted to systems at or near thermal equilibrium. In systems held at constant temperature, the change in entropy, ΔS, is given by the equation[23]

\Delta S  = \frac{Q}{T},

where Q is the amount of heat absorbed by the system in an isothermal and reversible process in which the system goes from one state to another, and T is the absolute temperature at which the process is occurring.[24]
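
For example (a standard illustrative case), in the reversible isothermal expansion of n moles of an ideal gas from volume V_1 to V_2, the heat absorbed equals the work done, Q = nRT ln(V_2/V_1), so

\Delta S = \frac{Q}{T} = nR \ln \frac{V_2}{V_1}

and for one mole doubling its volume this gives R ln 2, about 5.8 J K−1.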

If the temperature of the system is not constant, then the relationship becomes a differential relation:

dS = \frac{\delta Q}{T}.

Then the total change in entropy for a transformation is:

\Delta S = \int \frac{\delta Q}{T}.
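
As a standard worked case (added here for illustration), for a body of constant heat capacity C heated reversibly from T_1 to T_2, δQ = C dT and the integral evaluates to

\Delta S = \int_{T_1}^{T_2} \frac{C\,dT}{T} = C \ln \frac{T_2}{T_1}

Warming 1 kg of liquid water (C ≈ 4186 J K−1) from 293 K to 353 K, for instance, increases its entropy by roughly 4186 ln(353/293) ≈ 780 J K−1.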

This thermodynamic approach to calculating the entropy is subject to several narrow restrictions which must be respected. In contrast, the fundamental statistical definition of entropy applies to any system, including systems far from equilibrium, and including experiments where "heat" and "temperature" are undefinable. In situations where the thermodynamic approach is valid, it can be shown to be consistent with the fundamental statistical definition.

In any case, the statistical definition of entropy remains the fundamental definition, from which all other definitions and all properties of entropy can be derived.

References