Two definitions of expectation

In an introductory probability class, the expected value of a random variable X is defined as

E(X) = \int_{-\infty}^\infty x\, f_X(x) \,dx

where f_X is the probability density function of X. I’ll call this the analytical definition.
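
As a quick numerical sanity check (mine, not from the post), here is a minimal Python sketch of the analytical definition. It assumes NumPy and SciPy are available and takes X to be normal with mean 2, a toy choice, so the integral of x f_X(x) should come out near 2.

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    # Analytical definition: E(X) = integral of x * f_X(x) dx.
    # Toy choice of X (an assumption, not from the post): normal with mean 2.
    f_X = stats.norm(loc=2.0, scale=1.0).pdf
    EX, _ = quad(lambda x: x * f_X(x), -np.inf, np.inf)
    print(EX)   # ≈ 2.0, the mean of the chosen normal distribution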

In a more advanced class the expected value of X is defined as

E(X) = \int_\Omega X \,dP

where (Ω, F, P) is a probability space. I’ll call this the measure-theoretic definition. It’s not obvious that these two definitions are equivalent. They may even seem contradictory unless you look closely: they’re integrating different functions over different spaces.
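
To make the measure-theoretic definition concrete, here is a small simulation sketch under toy assumptions of my own (not from the post): take Ω = [0, 1) with P the uniform (Lebesgue) measure and X(ω) = -log(1 - ω), so X is exponentially distributed with density f_X(x) = e^(-x) and E(X) = 1. Averaging X(ω) over draws of ω approximates the integral of X over Ω with respect to P, while numerically integrating x f_X(x) gives the analytical definition; both land near 1.

    import numpy as np
    from scipy.integrate import quad

    rng = np.random.default_rng(0)

    # Measure-theoretic side: draw ω from Ω = [0, 1) under the uniform measure P
    # and average X(ω) = -log(1 - ω); this approximates the integral of X over Ω.
    omegas = rng.random(1_000_000)
    E_measure = np.mean(-np.log1p(-omegas))   # log1p(-ω) = log(1 - ω)

    # Analytical side: integral of x * f_X(x) with f_X(x) = exp(-x) on [0, ∞).
    E_analytic, _ = quad(lambda x: x * np.exp(-x), 0, np.inf)

    print(E_measure, E_analytic)   # both ≈ 1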

If for some odd reason you learn the measure-theoretic definition first, you can see the analytical definition as a theorem. But if, like most people, you learn the analytical definition first, the measure-theoretic version is quite mysterious. When you take an advanced course and look at the details previously swept under the rug, probability looks like an entirely different subject, unrelated to your introductory course. The definition of expectation is just one of many concepts that take some work to reconcile.

I’ve written a couple pages of notes that bridge the gap between the two definitions of expectation and show that they are equivalent.
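
The key step in that bridge is the standard change of variables to the distribution of X; the following is a sketch of that standard argument, not a summary of the notes. Writing P_X = P ∘ X⁻¹ for the distribution (law) of X,

E(X) = \int_\Omega X \,dP = \int_{-\infty}^\infty x \,dP_X(x) = \int_{-\infty}^\infty x\, f_X(x) \,dx

where the middle equality is the change of variables to the law of X, and the last equality holds when P_X has a density f_X with respect to Lebesgue measure.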

9 thoughts on “Two definitions of expectation”

  1. Pingback: El teorema del estadístico inconsciente « Apuntes de Estadística
  2. Any recommendations for a book that builds measure-theoretic probability theory from the ground up?

  3. I’ve seen “A Probability Path” on several bookshelves. I’ve thumbed through it and it looks OK, but I can’t say I’ve read it.

    Unfortunately I can’t think of anything off-hand that I’d endorse enthusiastically. So many books on this subject get bogged down in minutiae and don’t relate the measure theory to the intuitive ideas of probability.

  4. “A First Look at Rigorous Probability Theory” by Rosenthal is very good. It avoids measure theory unless it is absolutely needed (as when bridging the two definitions of expectation mentioned by John), but remains very rigorous and builds from the ground up. The first chapter, for example, introduces the extension theorem. Definitely worth a look.

  5. I liked the way you said “Perhaps the biggest source of confusion in theoretical probability is failure to distinguish X and f sub X”. I think that’s a big confusion in “elementary” probability as well.

  6. If all observations on X fall in the interval between zero and one, and E(X) is thus a proportion, are these definitions equivalent? Something (and I’m sure it’s ignorance) is bothering me.

  7. Try “Lectures on Measure and Integration” by Harold Widom. It starts with set theory. I haven’t gotten far, but I have gotten further.

  8. I learned Lebesgue integrals years ago from Cramér, Mathematical Methods of Statistics, a book I still use. But it is not for everyone.
