theory A measure of the disorder of a system. Systems tend
to go from a state of order (low entropy) to a state of
maximum disorder (high entropy).
The entropy of a system is related to the amount of
information it contains. A highly ordered system can be
described using fewer
bits of information than a disordered
one. For example, a string containing one million "0"s can be
compressed to a very short description ("one million zeros"),
whereas a string of random symbols (e.g. bits, or characters)
will be much harder, if not impossible, to compress in this
way.
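As a rough illustration of this idea (using Python's standard zlib compressor, not mentioned in the entry itself), a highly ordered string shrinks dramatically while random data barely compresses:

```python
import os
import zlib

ordered = b"0" * 1_000_000           # highly ordered: one million "0"s
random_data = os.urandom(1_000_000)  # disordered: random bytes

# The ordered string compresses to a tiny fraction of its size;
# the random data stays close to its original length.
print(len(zlib.compress(ordered)))
print(len(zlib.compress(random_data)))
```

On a typical run the ordered string compresses to around a kilobyte, while the random bytes remain close to a megabyte.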
Shannon's formula gives the entropy H(M) of a message M in
bits:
H(M) = -log2 p(M)
where p(M) is the probability of message M.
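The formula can be evaluated directly; a sketch (the function name is illustrative, not from the entry):

```python
import math

def message_entropy(p):
    """Information content H(M) = -log2 p(M), in bits,
    of a message with probability p."""
    return -math.log2(p)

# A message with probability 1/8 carries 3 bits of information.
print(message_entropy(1 / 8))  # → 3.0
```

Rarer messages (smaller p) carry more bits, matching the intuition that surprising events are more informative.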
(1998-11-23)