Bayes' Theorem and (maybe) the Stock Market
motivated by e-mail from Robert P

Here's an interesting problem ... or two:

Problem 1

  • There are two bags: bag1 and bag2.
  • bag1 is coloured red and contains 30 black balls and 10 white balls.
  • bag2 is coloured blue and contains 15 black balls and 45 white balls.
  • Sam is asked to pick a ball from one of the bags.
  • In previous trials, Sam has chosen red 70% of the time.
  • What's the probability that he picks a ball from bag1?

>I'd say 70% ... but what's that got to do with black and white balls?
Good ... 70%.
Okay, now suppose we consider the bags again ... but with additional information:

Problem 2

  • Same as Problem 1 ... and Sam picks a ball from one of the bags and it's black.
  • What's the probability that he picked from bag1?

See? You now have additional information, compared to our first problem. You know he picked a black ball!
What's the probability that it came from bag1?

>I have no idea!
That's where Bayes comes in.

>What's Bayes?
Thomas Bayes (1702-1761) was a minister/mathematician, born near London, England and ...

>A minister and a mathematician? Isn't that a strange ...
Pay attention. Bayes set forth an analysis of situations similar to our bags and balls problem. The result is known as Bayes' Theorem.

To acquire some appreciation for Bayes' Theorem, let's proceed like so:

Note that:

  • The probability of picking bag1 is P[bag1] = 0.70   or 70% since Sam likes red
  • So the probability of picking bag2 is then P[bag2] = 1 - P[bag1] = 0.30   or 30%
  • The probability of picking black from bag1 is P[black | bag1] = 30/(30+10) = 0.75   that's 75%, namely 30 black out of 40 balls in bag1
      (The notation P[black | bag1] means the probability of getting black if we know that he picked from bag1.)
  • The probability of picking black from bag2 is P[black | bag2] = 15/(15+45) = 0.25   that's 25%, namely 15 black out of 60 balls in bag2
      (As above, P[black | bag2] means the probability of black if we know that it's from bag2.)
Now:
  1. Sam performs the pick-a-ball ritual 100 Million times.
  2. He'll choose bag1 70M times and bag2 30M times   that's the 70% and 30%
  3. For the 70M bag1 choices, he'll get a black ball 75% of the time and that's 0.75*70M = 52.5M times
  4. For the 30M bag2 choices, he'll get a black ball 25% of the time and that's 0.25*30M = 7.5M times.
  5. Out of 100M trials, Sam has picked black 52.5M + 7.5M = 60M times.
  6. If we know that he's picked a black ball, then it must be one of these 60M ... and we also know that 52.5M of these were from bag1.
  7. Hence the probability that his black pick is from bag1 is 52.5/60 or 87.5%.
See? When we know that Sam picked black we can raise our probability (of having chosen bag1) from 70% to 87.5%.
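
If you don't quite believe the counting, you can let a computer perform Sam's ritual. Here's a minimal Python simulation (my own sketch, not part of the original argument, scaled down to a million trials):

    import random

    random.seed(1)
    trials = 1_000_000

    bag1 = ["black"] * 30 + ["white"] * 10    # the red bag: 30 black, 10 white
    bag2 = ["black"] * 15 + ["white"] * 45    # the blue bag: 15 black, 45 white

    black_total = 0        # trials where Sam drew a black ball
    black_from_bag1 = 0    # ... and the ball came from bag1

    for _ in range(trials):
        bag = bag1 if random.random() < 0.70 else bag2   # Sam picks red 70% of the time
        if random.choice(bag) == "black":
            black_total += 1
            if bag is bag1:
                black_from_bag1 += 1

    # Of the trials where Sam drew black, what fraction came from bag1?
    print(black_from_bag1 / black_total)    # ~0.875, matching the 87.5% above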

Let's see how that 87.5% number was derived from all those other numbers, supposing that the total number of trials is N (rather than 100 Million):

It goes like this:

  1. The probability of picking bag1 is P[bag1] so P[bag1]*N is the number of times Sam picks bag1.
  2. If we know that the pick is from bag1, the probability that it's black is P[black | bag1]
    so the number of those bag1 picks that are black is P[black | bag1]*P[bag1]*N.
  3. The probability of picking bag2 is P[bag2] so P[bag2]*N is the number of times Sam picks bag2.
  4. If we know that the pick is from bag2, the probability that it's black is P[black | bag2]
    so the number of those bag2 picks that are black is P[black | bag2]*P[bag2]*N.
  5. Out of the N picks, the total number of black picks is P[black | bag1]*P[bag1]*N + P[black | bag2]*P[bag2]*N.
  6. Out of all those black picks, P[black | bag1]*P[bag1]*N were from bag1.
  7. Hence the fraction of the total black picks that were from bag1 is:
      { P[black | bag1]*P[bag1]*N } / { P[black | bag1]*P[bag1]*N + P[black | bag2]*P[bag2]*N }

which gives our probability:

[1]       P[bag1 | black] = P[black | bag1] P[bag1] / { P[black | bag1] P[bag1] + P[black | bag2] P[bag2] }
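
Just to check that this fraction really does give 87.5%, here's the arithmetic in a few lines of Python (a quick sketch of my own):

    # Plugging the bags-and-balls numbers into formula [1]:
    p_bag1 = 0.70                       # P[bag1] ... Sam likes red
    p_bag2 = 1 - p_bag1                 # P[bag2] = 0.30
    p_black_bag1 = 30 / (30 + 10)       # P[black | bag1] = 0.75
    p_black_bag2 = 15 / (15 + 45)       # P[black | bag2] = 0.25

    numerator = p_black_bag1 * p_bag1                   # 0.525
    denominator = numerator + p_black_bag2 * p_bag2     # 0.525 + 0.075 = 0.60
    print(numerator / denominator)                      # ~0.875, as promised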

>Mamma mia! That's the messiest ...
Okay, here's a somewhat simpler statement:
[A]       Bayes' Theorem:       P[A | B] = P[B | A] P[A] / { P[B | A] P[A] + P[B | A'] P[A'] }

Where A and A' are mutually exclusive events (like picking from bag1 or bag2) and B is some other event
and (as before) P[A | B] is the probability that A occurs if we know that B has occurred.

>Mutually exclusive ... huh?
It means that they can't both occur. It's one or t'other.

But do you see?
Bayes' Theorem is just that magic formula [1] with A = pick from bag1, A' = pick from bag2 (the mutually exclusive events) and B = pick black.

>And this has something to do with the stock market?
We'll see, but in the meantime you can play with the numbers yourself: choose values for P[A], P[B | A] and P[B | A'], then compute P[A | B] from formula [A].
Note that P[A'] = 1 - P[A], assuming that events A and A' are mutually exclusive
... and, as before, P[A | B] = the probability of A if B has occurred.

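If you'd rather have the arithmetic done for you, here's a minimal Python function (the name bayes and the layout are my own choices) that computes P[A | B] from those three inputs:

    # Feed it P[A], P[B|A] and P[B|A']; get back P[A|B] via Bayes' Theorem [A].
    def bayes(p_a, p_b_given_a, p_b_given_not_a):
        """Return P[A|B], where A and A' are mutually exclusive events."""
        p_not_a = 1 - p_a                       # P[A'] = 1 - P[A]
        numerator = p_b_given_a * p_a           # P[B|A] P[A]
        return numerator / (numerator + p_b_given_not_a * p_not_a)

    # The bags-and-balls numbers: P[A] = 0.70, P[B|A] = 0.75, P[B|A'] = 0.25
    print(bayes(0.70, 0.75, 0.25))              # ~0.875 ... our 87.5% again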

Bayes' Theorem again

I should point out another form of Bayes' Theorem ... like so:

In the magic formula [A], look again at the numerator which we'll write: P[A] P[B|A].
We can describe P[A] as the probability that A occurred or the fraction of a 100 Jillion trials where A occurred.
Then, of this fraction (where A occurred), in a certain fraction of these, B occurred ... and that's P[B|A].

Note that it's convenient to think of the "probability" of something happening as the "fraction" of a 100 Jillion trials where it happened.
If it happened 47 Jillion times, then we'd say that the probability was 47J / 100J = 0.47 or 47%.

Here's something to note:
Suppose that, out of the total of 100J trials, an event "P" happened in 47J of them.
Suppose further that, in a fraction f of these 47J trials, an event "Q" happened.
Then in f(47J) trials both P and Q happened.
As a fraction of the total number of trials, this is f(47J) / 100J = f (0.47) which is the product of the probabilities.
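
Here's that "product of fractions" arithmetic in a couple of lines (my own sketch; I've treated a Jillion as 10^9 and invented the fraction f = 0.20 just to have concrete numbers):

    J = 1e9                     # a "Jillion", taken to be 10^9 for this example
    total = 100 * J             # total number of trials
    p_trials = 47 * J           # trials where P happened ... probability 0.47
    f = 0.20                    # assumed fraction of those where Q also happened

    both = f * p_trials         # trials where both P and Q happened
    print(both / total)         # ~0.094 ...
    print(f * 0.47)             # ... which is the product of the probabilities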

Anyway, considering the fraction of trials where A occurred, we noted that, in a certain fraction of these, B occurred
... and that's what we're calling P[B|A].
Conclusion?

>That means we're talking of a fraction where both occurred, right?
Yes, P[A] P[B|A] is the fraction of the 100 Jillion trials ... or the "probability" ... that both A and B occurred.
This is denoted by P[A and B].

Okay, now let's do that backwards.
The fraction of the 100 Jillion trials where B occurred is denoted by P[B].
Then, of this fraction (where B occurred), in a certain fraction of these, A occurred ... and that's P[A|B].
Then P[B] P[A|B] is the fraction of the 100 Jillion trials where both A and B occurred.

>Aren't they the same? I mean, isn't P[A] P[B|A] = P[B] P[A|B]?
Precisely:
P[A and B] = P[A] P[B|A] = P[B] P[A|B] = the probability that A and B both occurred

>Yeah, so?
So that gives us our second Bayes' Theorem:
[B]       Bayes' Theorem #2:       P[A | B] = P[A] P[B | A] / P[B]

      which can also be written:     P[A | B] = P[A and B] / P[B]
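
We can check the identity and Bayes #2 with our bags-and-balls numbers, where A = pick from bag1 and B = pick black (another quick sketch of my own):

    p_a = 0.70              # P[A]: Sam picks bag1
    p_b_given_a = 0.75      # P[B|A]: black, given bag1
    p_b = 0.60              # P[B]: 60M black picks out of 100M trials
    p_a_given_b = 0.875     # P[A|B]: the 87.5% we found earlier

    p_a_and_b = p_a * p_b_given_a                 # P[A and B] = P[A] P[B|A]
    print(round(p_a_and_b, 6))                    # 0.525
    print(round(p_b * p_a_given_b, 6))            # 0.525 ... P[B] P[A|B], the same
    print(round(p_a_and_b / p_b, 6))              # 0.875 ... P[A|B] via Bayes #2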

>And that's useful?
Sure. Note that both [A] and [B] reverse the order ... from P[B|A] to P[A|B].

>And that's useful?
Well ... uh, in the above example we knew the probability that Sam would pick a black ball if he picked from bag1 (or bag2).

>Aah ... and we wanted to know the probability of having picked from bag1 if he picked a black.
Yes. That's like knowing P[B|A] and asking for P[A|B].

Let's do another problem, using Bayes #2.

>Do we have to?


Problem 3

Suppose that the total population is N.

Suppose that we know the proportion of the total population that died in the year 2003 is 0.009 (or 0.9%)   ... so 0.009N died.
If I asked: "What's the probability that Sam died in 2003", you'd answer 0.9%, right?

However, suppose I told you that the population that was over 75 years old was M and that the proportion of the over-75 population that died in the year 2003 was 0.05 (or 5%)   ... so 0.05M died.

>Don't tell me! Sam is over 75, right?
Right. Now what do you say concerning the probability that Sam died in 2003?

>I have no idea?

Note that Sam is not your ordinary citizen ... he's one of those over-75 citizens.
The probability that he died is the fraction of over-75 citizens that died ... not the fraction of the total population that died!
So the probability that Sam died is 0.05 (or 5%) ... not 0.009 (or 0.9%).

Okay, we use Bayes #2 ... and this is what we'll do ...

>Wait! We already have the answer, don't we?
Yes, but we need to do an example. Listen!

  • There were N citizens alive on Jan 1, 2003.
  • The number of citizens who died (during 2003) was n.
  • Hence the probability that a citizen died was n/N.
  • The number of over-75 citizens alive on Jan 1, 2003 was M.
  • Hence the probability that a citizen was over 75 was then M/N.
  • The number of over-75 citizens who died was m.
In Bayes #2, we'll put   A = a citizen died   B = a citizen was over 75.

Okay, from all these numbers we need to get the number m / M ... that's the probability that Sam died given that he's over 75.

  • The fraction (of the total population, N) who died was n/N ... and that's P[A]
  • The fraction of those who died who were also over-75 was m/n ... that's P[B|A]
  • Then the fraction (of the total population) who were over 75 and died was (n/N)(m/n) = m/N ... that's P[A] P[B|A] = P[AandB]
  • Finally, the fraction (of the total population, N) who were over 75 was M/N ... and that's P[B]
We then get (from Bayes #2):

      the probability that Sam died, given that he was over 75 = P[A|B] = P[A] P[B|A] / P[B] = (m/N) / (M/N) = m/M.
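
Here are Problem 3's numbers in Python. The population sizes N and M below are invented (only the proportions 0.9% and 5% come from the problem), just so we have something concrete to divide:

    N = 1_000_000    # total population (an assumed number)
    n = 9_000        # citizens who died in 2003 ... the 0.9% of N
    M = 60_000       # over-75 population (an assumed number)
    m = 3_000        # over-75 citizens who died ... the 5% of M

    p_a = n / N             # P[A]: a citizen died
    p_b_given_a = m / n     # P[B|A]: over 75, given that the citizen died
    p_b = M / N             # P[B]: a citizen was over 75

    # Bayes #2: P[A|B] = P[A] P[B|A] / P[B] ... which collapses to m/M
    print(round(p_a * p_b_given_a / p_b, 6))    # 0.05
    print(m / M)                                # 0.05 ... the same thing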

What we'd like to do is consider how a probability changes when we increase the amount of information we have.
>zzzZZZ
We might know, from historical data, that a stock price will increase tomorrow X% of the time.
But suppose we have additional information, like the Sharpe Ratio for the stock or maybe the fact that the stock has already increased for three days in a row or maybe we know the current P/E Ratio or maybe ...
>zzzZZZ