entropy, n.
Etymology:
< Greek τροπή transformation (lit. ‘turning’), after the analogy of energy n.
Physics. 1. The name given to one of the quantitative elements which determine the thermodynamic condition of a portion of matter. Also transf. and fig.
In Clausius' sense, the entropy of a system is the measure of the unavailability of its thermal energy for conversion into mechanical work. A portion of matter at uniform temperature retains its entropy unchanged so long as no heat passes to or from it, but if it receives a quantity of heat without change of temperature, the entropy is increased by an amount equal to the ratio of the mechanical equivalent of the quantity of heat to the absolute measure of the temperature on the thermodynamic scale. The entropy of a system = the sum of the entropies of its parts, and is always increased by any transport of heat within the system: hence ‘the entropy of the universe tends to a maximum’ (Clausius). The term was first used in English by Prof. Tait (see quot. 1868), who however proposed to use it in a sense exactly opposite to that of Clausius. In this he was followed (with an additional misunderstanding: see quot. 1875) by Maxwell and others; but subsequently Tait and Maxwell reverted to the original definition, which is now generally accepted. [A worked sketch of the Q/T rule follows the quotations for this sense.]
1868 Tait Sketch Thermodynamics 29
We shall..use the excellent term Entropy in the opposite sense to that in which Clausius has employed it—viz., so that the Entropy of the Universe tends to zero.
1875 J. C. Maxwell Theory of Heat (ed. 4) 189 (note)
In former editions of this book the meaning of the term Entropy as introduced by Clausius was erroneously stated to be that part of the energy which cannot be converted into work. The book then proceeded to use the term as equivalent to the available energy...In this edition I have endeavoured to use Entropy according to its original definition by Clausius.
1885 H. W. Watson & S. H. Burbury Math. Theory Electr. & Magn. I. 245
As in the working of a heat engine, the entropy of the system must be diminished by the process, that is, there must be equalisation of temperature.
1925 A. Strachey & J. Strachey tr. Freud Coll. Papers III. v. 599
In considering the conversion of psychical energy no less than of physical, we must make use of the concept of an entropy, which opposes the undoing of what has already occurred.
1933 W. E. Orchard From Faith to Faith xi. 280
The deduction which one of our greatest physicist astronomers draws from the second law of thermodynamics: namely, that since there must be a maximum entropy, there must have been once its maximum opposite.
1955 Sci. Amer. May 124/2
Certain combinations of balls yield a greater change in entropy than others. Those combinations in which entropy change reaches maximum value lead to solutions.
1955 Sci. Amer. June 64/1
This equilibrium state..is the thermodynamic condition of maximum entropy—the most disordered state, in which the least amount of energy is available for useful work.
1965 Financial Times 11 Aug.
Moralising by those whose industrial entropy is an accepted fact of life is neither likely to persuade the workers nor assist the trade unions in the task of trying to meet the nation's difficulties.
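[The Clausius rule stated above reduces to ΔS = Q/T: a quantity of heat Q (taken in its mechanical equivalent, i.e. joules) received at constant absolute temperature T raises the entropy by Q/T. A minimal Python sketch of that arithmetic; the function name, units, and figures are illustrative assumptions, not part of the entry.

    def entropy_change(heat_joules, temperature_kelvin):
        """Entropy increase Q/T for heat received at a constant temperature."""
        return heat_joules / temperature_kelvin

    # Transporting heat within a system from a hot part to a cold part
    # always raises the total entropy, since Q/T_cold > Q/T_hot:
    Q, T_hot, T_cold = 1000.0, 400.0, 300.0
    print(entropy_change(Q, T_cold) - entropy_change(Q, T_hot))  # ~0.83 J/K, > 0

This positive balance is the sense in which ‘the entropy of the universe tends to a maximum’.]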
2. a. Communication Theory. A measure of the average information rate of a message or language; esp. the quantity −Σ pᵢ log pᵢ (where the pᵢ are the probabilities of occurrence of the symbols of which the message is composed), which represents the average information rate per symbol. [A sketch of this quantity follows the quotations for this sense.]
1948 Bell Syst. Techn. Jrnl. 27 396
Consider a discrete source of the finite state type... There is an entropy Hi for each state. The entropy of the source will be defined as the average of these Hi weighted in accordance with the probability of occurrence of the states in question... This is the entropy of the source per symbol of text.
1953 S. Goldman Information Theory 329
The amount of language information (i.e., entropy) in the sequence is..a measure of the choice that was available when the sequence was selected.
1953 D. A. Bell Stat. Methods Electr. Engin. x. 145
Since entropy increases as the arrangement of a system becomes less distinguishable from other possible arrangements, while the value of a pattern for conveying information depends on its uniqueness, the information capacity of a signal is the negative of its entropy.
1960 D. Middleton Introd. Stat. Communication Theory vi. 301
H(X) is called the (communication) entropy of the X ensemble, since..it is the direct mathematical analogue of the more familiar entropy measure of statistical mechanics.
1964 Language 40 210
The basic probability concept, ‘entropy’, and its quantum, the ‘bit’, are now part of the metalanguage of linguistics.
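[A sketch of the quantity −Σ pᵢ log pᵢ defined in sense 2a; the function name and the example distributions are mine, not the dictionary's. Logarithms to base 2 give the result in the ‘bit’ of quot. 1964.

    import math

    def shannon_entropy(probs, base=2.0):
        """-sum(p_i log p_i); symbols of zero probability contribute
        nothing, since p log p -> 0 as p -> 0."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit per symbol: a fair coin
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased source says less

A source whose symbols are highly predictable thus has low entropy, i.e. a low average information rate per symbol.]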
b. Math. In wider use: any quantity having properties analogous to those of the physical quantity; esp. the quantity −Σ xᵢ log xᵢ of a distribution {x₁, x₂, …}. [A sketch follows the quotations for this sense.]
1951 Jrnl. Royal Statist. Soc. B. 13 60
The idea of selective entropy provides us with a new and important concept in the analytical theory of probability.
1961 Proc. Cambr. Philos. Soc. (Math. & Physical Sci.) 57 839
The analogue of Boltzmann's H-theorem is not a statement about the monotonicity of the entropy of the chains..but a statement about the ‘entropy’ of the frequency distribution (s₁,..sₙ).
1968 P. A. P. Moran Introd. Probability Theory i. 50
Since −x log x is a convex function the entropy of a finite set of events is a maximum when their probabilities are equal.
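[Moran's remark in quot. 1968 can be checked directly: −Σ xᵢ log xᵢ over a distribution on n points is greatest at the uniform distribution, where it equals log n. A small sketch under the same assumptions as above (natural logarithms, illustrative figures).

    import math

    def entropy(xs):
        """-sum(x_i log x_i), natural log, of a distribution {x_1, x_2, ...}."""
        return -sum(x * math.log(x) for x in xs if x > 0)

    n = 4
    print(entropy([1 / n] * n))           # log 4 ~ 1.386, the maximum
    print(entropy([0.7, 0.1, 0.1, 0.1]))  # ~0.94, strictly smaller

Any unevenness in the distribution lowers the value.]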