But in any case, you know that asymptotically the difference between the two distribution functions is O(1/x^{2p}). The reason moments tell you more in the tails than in the middle is that for sufficiently large values of x, only the leading term in the polynomial matters.
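A quick numerical sketch of that last point, using a made-up example polynomial P(x) of degree 2p = 6 (the specific coefficients are my own illustration, not from the paper): for large x the bound 1/P(x) is indistinguishable from 1/x^{2p}, so only the leading term matters in the tail.

```python
def P(x):
    """Hypothetical example polynomial of degree 6 (so p = 3):
    x^6 - 3x^4 + 2x^2 + 5."""
    return x**6 - 3 * x**4 + 2 * x**2 + 5

# Compare the full bound 1/P(x) with its leading-term approximation 1/x^6.
for x in (2.0, 10.0, 100.0):
    bound = 1.0 / P(x)
    leading = 1.0 / x**6
    # The ratio tends to 1 as x grows: lower-order terms become negligible.
    print(f"x = {x:6.0f}   1/P(x) = {bound:.3e}   1/x^6 = {leading:.3e}   "
          f"ratio = {bound / leading:.4f}")
```

At x = 2 the lower-order terms still shift the bound noticeably, but by x = 100 the ratio is already within about 0.03% of 1.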

On the one hand, the title of the paper is “Moments determine the tail of a distribution (but not much else).”

On the other hand, the fact that there is a polynomial P(x) where |F(x) − G(x)| ≤ 1/P(x) doesn’t tell us much if we don’t know what P(x) looks like. Moreover, this is an upper bound, which limits the *dissimilarity* of the distributions.

In other words, the title of the paper implies that moments don’t determine a distribution very well, whereas the result lets you conclude nothing more than that moments *might* match a distribution quite well.

What’s the takeaway, then?

I updated the post. Thanks for pointing out the error.

The second link shows a connection with orthogonal polynomials.

http://en.wikipedia.org/wiki/Moment_problem

http://en.wikipedia.org/wiki/Chebyshev%E2%80%93Markov%E2%80%93Stieltjes_inequalities