Information

Information theory is a funny thing. I see it mentioned here and there—on the back cover of Codes and Cryptography, for example—as if it were a large body of knowledge, but as far as I can tell, the whole thing boils down to one single little fact: it is possible to quantify information.

Here's how it works. If a probabilistic event has outcomes indexed by i, with probabilities p_i, then each occurrence of the event carries the following amount of information, in bits.

H = − sum over i ( p_i log2 p_i )

If, for example, there are N equally likely outcomes, then each p_i is 1/N, and the amount of information is − N × (1/N) log2 (1/N) = log2 N.

To see that the definition makes sense, suppose we generate a random number with eight bits in the computer sense, i.e., a random number between 0 and 255, inclusive. In that case, N = 256, and the event carries log2 256 = 8 bits of information, as expected.
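
To make the formula concrete, here's a minimal sketch in Python; the helper name entropy_bits is just for illustration, not anything standard.

  from math import log2

  def entropy_bits(probs):
      # H = - sum( p_i log2 p_i ), skipping outcomes with zero probability
      return -sum(p * log2(p) for p in probs if p > 0)

  # 256 equally likely outcomes, e.g., a random eight-bit number
  print(entropy_bits([1.0 / 256] * 256))   # prints 8.0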

As another example, let's think about presidential elections. If there are two candidates, both equally likely, then, as I said in The Problem, each vote carries exactly one bit of information. But what if there are more candidates? Say, four? If the candidates were all equally likely, each vote would carry two bits of information … but that's hardly realistic. If, instead, two candidates polled at 48% and the other two at 2%, each vote would carry about 1.24 bits of information … essentially, still just one.
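
Continuing the sketch above, the same helper gives the election number directly.

  # two candidates at 48%, two at 2%
  print(entropy_bits([0.48, 0.48, 0.02, 0.02]))   # about 1.24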

The other thing that's neat about information is that it has a physical meaning. A system in thermal equilibrium is always moving from one (quantum) state to another, and occupies any particular state with probability proportional to e^(−E/kT), where E is the energy of the state, T is the temperature, and k is Boltzmann's constant. We can compute the information carried by that probability distribution; and if we multiply by Boltzmann's constant, and divide by log2 e, we get the entropy of the system … the exact same entropy as in the second law of thermodynamics!
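
Here's a small numerical check of that claim, again in Python; the three energy levels and the temperature are made-up illustration values, not taken from anywhere.

  from math import exp, log, log2

  k = 1.380649e-23   # Boltzmann's constant, in J/K

  def boltzmann_probs(energies, T):
      # p_i is proportional to exp(-E_i/kT); normalize by the partition function
      weights = [exp(-E / (k * T)) for E in energies]
      Z = sum(weights)
      return [w / Z for w in weights]

  # a made-up three-level system at room temperature
  probs = boltzmann_probs([0.0, 1.0e-21, 2.0e-21], 300.0)

  H = -sum(p * log2(p) for p in probs)            # information, in bits
  S_from_H = k * H / log2(exp(1))                 # multiply by k, divide by log2 e
  S_gibbs = -k * sum(p * log(p) for p in probs)   # entropy directly, -k sum( p ln p )
  print(S_from_H, S_gibbs)                        # the two values agree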

In chemistry, I learned that only differences in entropy could be measured, but later, in physics, I think I read otherwise: that there was some situation in which the actual value of the entropy mattered. Unfortunately, I can't remember the details right now.

My favorite reference for such things is Thermal Physics, but of course there are plenty of others.

 

  See Also

  Anonymity
  Not Liking Uncertainty
  Problem, The
  Quantum Teleportation

@ May (2002)