# Dictionary: Entropy

**1**. (en′ trə pē) A thermodynamic quantity that measures the portion of a system's energy unavailable for doing work. Higher entropy represents increased disorder. The entropy of an isolated system never decreases, according to the **second law of thermodynamics**. See *thermodynamic functions* and Figure T-2.

**2**. A set *G* has the entropy *H*(*G*):

*H*(*G*) = log₂ *N*

where *N* is the minimum number of elements needed to specify *G*.
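Definition 2 can be sketched in Python; the function name is illustrative, not from the source, and base-2 logarithms are assumed so that entropy is measured in bits:

```python
import math

def set_entropy(n_elements: int) -> float:
    """H(G) = log2(N), where N is the minimum number of
    elements needed to specify the set G (definition 2)."""
    if n_elements < 1:
        raise ValueError("N must be a positive integer")
    return math.log2(n_elements)

# A set specified by 8 equally likely elements carries 3 bits.
print(set_entropy(8))  # → 3.0
```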

**3**. A measure of the uncertainty in a message. If *P*(*m*_{i}) is the probability that the message *m*_{i} has been transmitted, then the entropy *H*, where there are *n* possible messages, is given by

*H* = −∑_{i=1}^{n} *P*(*m*_{i}) log₂ *P*(*m*_{i})

The entropy of a situation with no uncertainty is zero. Entropy is a measure of the average information content of a message.
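Definition 3 can be sketched in Python; the function name is illustrative, and zero-probability messages are skipped since they contribute nothing to the sum:

```python
import math

def message_entropy(probs):
    """Shannon entropy H = -sum(P(m_i) * log2(P(m_i))) over a
    message probability distribution (definition 3)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely messages: maximum uncertainty for n = 2.
print(message_entropy([0.5, 0.5]))  # → 1.0

# One certain message: no uncertainty, so the entropy is zero.
print(message_entropy([1.0]))  # → 0.0
```

The second call illustrates the remark above: when one message is transmitted with certainty, the entropy is zero.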