the measure of a system's energy that is unavailable for work. Since work is obtained from order, the amount of entropy is also a measure of the disorder, or randomness, of a system. If energy in the form of heat dQ is added reversibly to a system held at a constant temperature T, the change in entropy dS is given by dS = dQ/T = (dU + p dV)/T, where dU is the change in internal energy, p is the pressure, and dV is the change in volume; the second equality follows from the first law of thermodynamics, dQ = dU + p dV. For reversible processes, dS = dQ/T and S is a state variable, since its value is completely determined by the current state of the system, i.e., independent of the path followed to reach that state. All natural processes are irreversible and involve an increase in entropy, dS > dQ/T. Entropy is an extensive property; that is, its magnitude scales with the size of the system, such as the amount of matter it contains.

The concept of entropy was proposed in 1850 by the German physicist Rudolf Clausius and is sometimes presented as the second law of thermodynamics (see thermodynamics). According to this law, entropy increases during an irreversible process such as the spontaneous mixing of hot and cold gases, the uncontrolled expansion of a gas into a vacuum, and the combustion of a fuel.

In one statistical interpretation of entropy, it is found that for a very large system in a thermodynamic equilibrium state, entropy S is proportional to the natural logarithm of a quantity W representing the maximum number of microscopic ways in which the macroscopic state corresponding to S can be realized; that is, S = k ln W, in which k is the Boltzmann constant.

All spontaneous processes are irreversible; hence, it has been said that the entropy of the universe is increasing: more and more energy becomes unavailable for conversion into mechanical work, and because of this the universe is said to be running down.
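As an illustration (not part of the original entry), the thermodynamic formula dS = dQ/T and the statistical formula S = k ln W can be checked against each other for the textbook case of a reversible isothermal expansion of an ideal gas; the function names below are our own, and the microstate count is taken to scale with volume as W ∝ V^N, a standard idealization:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol
R = k_B * N_A        # gas constant, J/(mol*K)

def delta_S_thermo(n_moles, V1, V2):
    """Thermodynamic route: integrating dS = dQ/T for a reversible
    isothermal expansion of an ideal gas gives dS = n R ln(V2/V1)."""
    return n_moles * R * math.log(V2 / V1)

def delta_S_statistical(n_moles, V1, V2):
    """Statistical route: each molecule's accessible positions scale with
    volume, so W2/W1 = (V2/V1)**N and dS = k ln(W2/W1) = N k ln(V2/V1)."""
    N = n_moles * N_A
    return N * k_B * math.log(V2 / V1)

n, V1, V2 = 1.0, 1.0, 2.0  # one mole, volume doubled
print(delta_S_thermo(n, V1, V2))       # about 5.76 J/K
print(delta_S_statistical(n, V1, V2))  # same value, since N_A * k_B = R
```

The agreement of the two routes reflects the identity R = N_A k: the macroscopic entropy change obtained from dQ/T coincides with the change in k ln W counted over microstates.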
Britannica English vocabulary, 2012.