Stochastic independence

Independence in probability can be both intuitive and mysterious. Intuitively, two events are independent if they have nothing to do with each other. Suppose I ask you to guess whether the next person walking down the street is left-handed. I stop this person, and before I ask which hand he writes with, I ask him for the last digit of his phone number. He says it’s 3. Does knowing the last digit of his phone number change your estimate of the chances this stranger is a southpaw? No, phone numbers and handedness are independent. Presumably about 10% of right-handers have a phone number ending in 3, and about the same percentage of left-handers do. Even if the phone company is more or less likely to assign numbers ending in 3, there’s no reason to believe it takes customer handedness into account when handing out numbers. On the other hand, if I tell you the stranger is an artist, that should change your estimate: a disproportionate number of artists are lefties.

Formally, two events A and B are independent if P(A and B) = P(A) P(B). It follows that P(A | B), the probability of A happening given that B happened, is just P(A), since P(A | B) = P(A and B) / P(B) = P(A) P(B) / P(B) = P(A), assuming P(B) > 0. Similarly P(B | A) = P(B). Knowing whether or not one of the events happened tells you nothing about the likelihood of the other. Knowing someone’s phone number doesn’t help you guess which hand they write with, unless you use the phone number to call them and ask about their writing habits.
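Here is a quick Monte Carlo sketch of that definition in Python. The 10% rate of left-handedness and the uniform distribution of last digits are assumptions made only for illustration.

```python
import random

random.seed(0)

N = 1_000_000
count_A = count_B = count_AB = 0  # A: left-handed, B: phone number ends in 3

for _ in range(N):
    left_handed = random.random() < 0.10    # assumed 10% rate of left-handedness
    digit_is_3 = random.randrange(10) == 3  # assumed uniform last digits
    count_A += left_handed
    count_B += digit_is_3
    count_AB += left_handed and digit_is_3

p_A, p_B, p_AB = count_A / N, count_B / N, count_AB / N
print(f"P(A) = {p_A:.4f}, P(B) = {p_B:.4f}")
print(f"P(A and B) = {p_AB:.4f}, P(A) P(B) = {p_A * p_B:.4f}")  # nearly equal
```

Because the two quantities are simulated independently, the estimate of P(A and B) comes out very close to the product P(A) P(B).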

Now let’s extend the definition to more events. A set of events is mutually independent if, for every subset of two or more of the events, the probability that they all occur is the product of their individual probabilities.

So, let’s look at three events: A, B, and C. If we know P(A and B and C) = P(A) P(B) P(C), are the three events mutually independent? Not necessarily. It is possible for the above equation to hold even though P(A and B) does not equal P(A) P(B). The definition of mutual independence requires something of every subset of {A, B, C} with two or more elements, not just the subset consisting of all elements. So we have to look at the subsets {A, B}, {B, C}, and {A, C} as well.
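To see how that can happen, here is one small construction of my own (eight equally likely outcomes, chosen purely for illustration) that a few lines of Python can check:

```python
from fractions import Fraction

# Hypothetical sample space: eight equally likely outcomes, 1 through 8.
omega = set(range(1, 9))

def p(event):
    """Probability of an event under the uniform distribution on omega."""
    return Fraction(len(event & omega), len(omega))

# Events chosen (for illustration) so that the triple product factors
# even though A and B are not independent.
A = {1, 2, 3, 4}
B = {1, 5, 6, 7}
C = {1, 2, 5, 8}

assert p(A & B & C) == p(A) * p(B) * p(C)  # 1/8 == 1/2 * 1/2 * 1/2
assert p(A & B) != p(A) * p(B)             # 1/8 != 1/4, so not mutually independent
print("P(A and B and C) factors, yet A and B are dependent")
```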

What if A and B are independent, B and C are independent, and A and C are independent? In other words, every pair of events is independent. Is that enough for mutual independence? Surprisingly, the answer is no. It is possible to construct a simple example where

  • P(A and B) = P(A) P(B)
  • P(B and C) = P(B) P(C)
  • P(A and C) = P(A) P(C)

and yet P(A and B and C) does not equal P(A) P(B) P(C).

Here is such an example. Select a card at random from a standard deck and note its suit: (c)lubs, (d)iamonds, (h)earts, or (s)pades. Define the events A={c,d}, B={c,h}, and C={c,s}. Each event has probability 1/2, and each pairwise intersection is the single suit {c}, with probability 1/4 = (1/2)(1/2), so every pair is independent. But the triple intersection is also {c}, with probability 1/4 rather than (1/2)(1/2)(1/2) = 1/8. The events A, B, and C are pairwise independent but not mutually independent.
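A few lines of Python confirm this, treating the four suits as equally likely outcomes:

```python
from fractions import Fraction
from itertools import combinations

# The four suits, each drawn with probability 1/4 (the card's rank is irrelevant).
suits = {"c", "d", "h", "s"}

def p(event):
    """Probability of an event under the uniform distribution on the suits."""
    return Fraction(len(event & suits), len(suits))

A = {"c", "d"}
B = {"c", "h"}
C = {"c", "s"}

# Every pair is independent: each pairwise intersection is {c}, so 1/4 = 1/2 * 1/2.
for X, Y in combinations([A, B, C], 2):
    assert p(X & Y) == p(X) * p(Y)

# But the triple intersection is also {c}: 1/4, not 1/2 * 1/2 * 1/2 = 1/8.
assert p(A & B & C) != p(A) * p(B) * p(C)
print("pairwise independent, but not mutually independent")
```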
