An Additive Measure of Risk

In Minesweeper, one often encounters situations where there's not enough information to make a completely safe move. In such situations, one naturally wants to pick the least dangerous alternative, the one yielding the smallest probability of getting blown up.

One day, when I was thinking about these probabilities, I had an idea: wouldn't it be neat if, at the end of a game, I could look back at all the dangerous situations I'd been in, add up the probabilities, and come up with a number, or score, that would tell me how dangerously I'd played? Unfortunately, as stated, the idea doesn't work, because adding up the probabilities simply isn't the right way to combine them.

Is it possible to find a quantity that can be added up? Well, if I thought in terms of success probabilities rather than failure probabilities, it would be easy—I'd just write down the probability of events A and B both occurring,

p_{A and B} = p_A p_B,

and take logarithms to change the relationship from multiplicative to additive. For failure probabilities, though, it's not quite as easy. We want to write down the probability of event A or event B occurring, but we want to write it down using “and” instead of “or”, so that we get the nice multiplicative behavior. The key is to remember one of de Morgan's rules,

A or B = not (not A and not B),

which translates to the probability equation

p_{A or B} = 1 − (1 − p_A)(1 − p_B).

Moving the constant 1 to the other side, taking logarithms, and changing the sign so the numbers will be positive, we find

− ln (1 − p_{A or B}) = − ln (1 − p_A) − ln (1 − p_B).

Thus, the risk r, as defined below, is an additive quantity.

r = − ln (1 − p)
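
If you'd like to see the additivity concretely, here's a small numerical check, sketched in Python; the helper name "risk" is just my label for the quantity r defined above.

    import math

    def risk(p):
        # r = -ln(1 - p), the additive measure defined above
        return -math.log(1.0 - p)

    p_a, p_b = 1/3, 1/2                       # two independent failure probabilities
    p_a_or_b = 1 - (1 - p_a) * (1 - p_b)      # probability that A or B occurs

    print(risk(p_a) + risk(p_b))              # 1.0986...
    print(risk(p_a_or_b))                     # 1.0986..., the same number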

There are other additive quantities, namely, the multiples of r, but r is the best one, as we can see by looking at its power series.

r = p + (1/2) p^2 + (1/3) p^3 + …

For rare events, i.e., small probabilities, the risk is essentially the same as the probability.

(It follows that small probabilities are essentially additive, but that should come as no surprise—playing a slot machine twice, for example, does make you almost exactly twice as likely to hit the jackpot.)
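
The approximation is easy to check numerically; a quick sketch, using the same − ln (1 − p) formula as above:

    import math

    for p in (0.001, 0.01, 0.1):
        print(p, -math.log(1.0 - p))
    # 0.001  0.0010005...
    # 0.01   0.0100503...
    # 0.1    0.1053605...

For small p, the risk is only slightly larger than p itself.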

That's all a bit abstract, so let me go back and show how the idea of risk applies to Minesweeper. Suppose I'm forced to make a series of moves for which the probabilities of losing are 1/3, 1/2, 1/3, and 1/2. According to the reference tables, below, the risks corresponding to the probabilities 1/3 and 1/2 are 0.405 and 0.693; I just add these up as I go, to get a total risk of 2.196. Once I have that, I can plug it into the inverse relation

p = 1 − e^{−r}

to get the total probability of losing: 0.889.

The total probability isn't the point, though, since it's easy to compute exactly. The point is that I can use addition to compute something equivalent to the total probability. If, for example, I have to choose between making a single move with probability 1/2 and two moves with probability 1/3, I can just add up the risks, approximately 0.7 versus 0.8, and see that the former is safer.
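
Here's the same bookkeeping as a small sketch in Python; the move probabilities are the ones from the example above.

    import math

    def risk(p):
        return -math.log(1.0 - p)

    moves = [1/3, 1/2, 1/3, 1/2]               # failure probability of each forced move
    total_risk = sum(risk(p) for p in moves)   # 2.197 (2.196 above uses the rounded table values)
    p_lose = 1 - math.exp(-total_risk)         # about 0.889

    # cross-check against the direct computation
    p_survive = math.prod(1 - p for p in moves)   # (2/3)(1/2)(2/3)(1/2) = 1/9
    print(p_lose, 1 - p_survive)                  # both 0.888...

    # one move at 1/2 versus two moves at 1/3
    print(risk(1/2), 2 * risk(1/3))               # 0.693 versus 0.811; the single move is safer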

So, that's my neat idea. I'm sure it's nothing new in the grand scheme of things, but it was new to me, and I thought I'd pass it on.

I'll finish up with some reference material. First, here's a plot of the relationship between r and p.

[plot: risk r = − ln (1 − p) versus probability p]

Second, here are some tables of common values.

p (fraction)   p (decimal)   r
1/10           0.100         0.105
1/6            0.167         0.182
1/3            0.333         0.405
1/2            0.500         0.693
2/3            0.667         1.099

r   p
1   0.632
2   0.865
3   0.950

The values 63% and 95% in the second table may remind you of the normal distribution, but they're not the same numbers. The probabilities that sampling a normal distribution will yield a value within one or two standard deviations of the mean are 68% and 95%, respectively.
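
Both tables can be regenerated with a few lines; a sketch, using the same fractions as the first table:

    import math

    for p in (1/10, 1/6, 1/3, 1/2, 2/3):
        print(f"{p:.3f}  {-math.log(1 - p):.3f}")   # p and r, three decimals

    for r in (1, 2, 3):
        print(f"{r}  {1 - math.exp(-r):.3f}")       # r and p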

 


@ December (2000)