The **arithmetic** mean of two numbers *a* and *b* is (*a* + *b*)/2.

The **geometric** mean of *a* and *b* is √(*ab*).

The **harmonic** mean of *a* and *b* is 2/(1/*a* + 1/*b*).

This post will generalize these definitions of means and state a general inequality relating the generalized means.

Let **x** = (*x*_{1}, *x*_{2}, …, *x*_{n}) be a vector of non-negative real numbers. Define M_{r}(**x**) to be

M_{r}(**x**) = ((*x*_{1}^{r} + *x*_{2}^{r} + … + *x*_{n}^{r})/*n*)^{1/r}

unless *r* = 0 or *r* is negative and one of the *x*_{i} is zero. If *r* = 0, define M_{r}(**x**) to be the limit of M_{r}(**x**) as *r* decreases to 0. And if *r* is negative and one of the *x*_{i} is zero, define M_{r}(**x**) to be zero. The arithmetic, geometric, and harmonic means correspond to M_{1}, M_{0}, and M_{-1} respectively.
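As a sketch of the definition above (the function name `power_mean` is my own), including the fine print about *r* = 0 and negative *r* with a zero entry:

```python
import math

def power_mean(x, r):
    """Generalized mean M_r of a list of non-negative numbers."""
    n = len(x)
    if r == 0:
        # Limit as r -> 0: the geometric mean (0 if any entry is 0)
        if any(xi == 0 for xi in x):
            return 0.0
        return math.exp(sum(math.log(xi) for xi in x) / n)
    if r < 0 and any(xi == 0 for xi in x):
        # Negative r with a zero entry is defined to be zero
        return 0.0
    return (sum(xi ** r for xi in x) / n) ** (1 / r)

x = [1.0, 2.0, 4.0]
print(power_mean(x, 1))   # arithmetic mean: 7/3
print(power_mean(x, 0))   # geometric mean: 2
print(power_mean(x, -1))  # harmonic mean: 12/7
```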

Define M_{∞}(**x**) to be the limit of M_{r}(**x**) as *r* goes to ∞. Similarly, define M_{-∞}(**x**) to be the limit of M_{r}(**x**) as *r* goes to –∞. Then M_{∞}(**x**) equals max(*x*_{1}, *x*_{2}, …, *x*_{n}) and M_{-∞}(**x**) equals min(*x*_{1}, *x*_{2}, …, *x*_{n}).
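These limits are easy to see numerically: for large |*r*| the largest (or smallest) entry dominates the sum. A quick sketch, leaving out the *r* = 0 fine print since we only use large |*r*| here:

```python
# For large |r|, M_r approaches the max (r -> +inf) and min (r -> -inf).
def power_mean(x, r):
    n = len(x)
    return (sum(xi ** r for xi in x) / n) ** (1 / r)

x = [1.0, 2.0, 4.0]
print(power_mean(x, 100))    # close to max(x) = 4
print(power_mean(x, -100))   # close to min(x) = 1
```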

In summary, the **minimum**, **harmonic mean**, **geometric mean**, **arithmetic mean**, and **maximum** are all special cases of M_{r}(**x**), corresponding to *r* = –∞, –1, 0, 1, and ∞ respectively. Of course other values of *r* are possible; these five are just the most familiar. Another common example is the root-mean-square (RMS), corresponding to *r* = 2.
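For instance, the RMS case *r* = 2 is just:

```python
import math

def rms(x):
    """Root-mean-square: the generalized mean M_2."""
    return math.sqrt(sum(xi * xi for xi in x) / len(x))

print(rms([3.0, 4.0]))  # sqrt((9 + 16)/2) = sqrt(12.5)
```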

A famous theorem says that the geometric mean is never greater than the arithmetic mean. This is a very special case of the following theorem.

If *r* ≤ *s*, then M_{r}(**x**) ≤ M_{s}(**x**).

In fact we can say a little more. If *r* < *s*, then M_{r}(**x**) < M_{s}(**x**) unless *x*_{1} = *x*_{2} = … = *x*_{n}, or *s* ≤ 0 and one of the *x*_{i} is zero.
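A quick numerical check of this monotonicity (not a proof, just a sanity check on one vector with unequal entries, where the inequality should be strict):

```python
import math

def power_mean(x, r):
    # Generalized mean; r = 0 handled as the geometric-mean limit.
    n = len(x)
    if r == 0:
        return math.exp(sum(math.log(xi) for xi in x) / n)
    return (sum(xi ** r for xi in x) / n) ** (1 / r)

x = [1.0, 2.0, 4.0]
means = [power_mean(x, r) for r in (-2, -1, 0, 1, 2)]
# M_r should be strictly increasing in r since the x_i are not all equal
assert all(a < b for a, b in zip(means, means[1:]))
```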

We could generalize the means M_{r} a bit more by introducing positive weights *p*_{1}, *p*_{2}, …, *p*_{n} such that *p*_{1} + *p*_{2} + … + *p*_{n} = 1. We could then define M_{r}(**x**) as

M_{r}(**x**) = (*p*_{1} *x*_{1}^{r} + *p*_{2} *x*_{2}^{r} + … + *p*_{n} *x*_{n}^{r})^{1/r}

with the same fine print as in the previous definition. The earlier definition reduces to this new definition with *p*_{i} = 1/*n*. The above statements about the means M_{r}(**x**) continue to hold under this more general definition.
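The weighted version is a one-line change; here is a sketch (again with my own function name, and omitting the negative-*r*-with-zero fine print):

```python
import math

def weighted_power_mean(x, p, r):
    """Weighted generalized mean; weights p are positive and sum to 1.

    The unweighted mean is the special case p_i = 1/n.
    """
    if r == 0:
        # Limit as r -> 0: the weighted geometric mean
        return math.exp(sum(pi * math.log(xi) for pi, xi in zip(p, x)))
    return sum(pi * xi ** r for pi, xi in zip(p, x)) ** (1 / r)

x = [1.0, 2.0, 4.0]
print(weighted_power_mean(x, [1/3, 1/3, 1/3], 1))  # equal weights: (1+2+4)/3
print(weighted_power_mean([2.0, 8.0], [0.5, 0.5], 0))  # sqrt(2*8) = 4
```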

For more on means and inequalities, see *Inequalities* by Hardy, Littlewood, and Pólya.

**Update**: Analogous results hold for means of functions, replacing sums with integrals. There are also physical examples of the harmonic mean with springs and resistors.

Related post: Old math books

Of all of these, the one I find most counter-intuitive is M_{0}.

This is a great post. Keep up with the good work.

These are just generalised p-norms, right? What is the “famous theorem” you refer to? Is it a consequence of Hölder’s inequality?

On the topic of inequalities, if you haven’t got it already I strongly recommend Steele’s The Cauchy-Schwarz Master Class. It’s a wonderfully readable tour through inequalities and their history.

Mark,

These means correspond to p-norms if r ≥ 1, but not for smaller values of r.

The famous theorem I refer to is the geometric mean – arithmetic mean inequality.

I agree about Steele’s book. It’s one of my favorites.

Actually, all p-norms require an absolute value: $latex \left(\sum_{i=1}^n |x_i|^r\right)^{1/r}$

That’s pretty cool about r = 0. I had to plot it to convince myself it was true. The limit comes in from both directions, too. Now I’m trying to prove it for fun.

By “generalised” I meant exactly those cases for when r < 1. The wikipedia article on L_p spaces talks about these generalisations for 0 ≤ r < 1. I hadn’t seen the case of r < 0 before, however.

Another useful generalization is the concept of Chisini mean: Chisini was a less-known Italian mathematician. You can read the idea here:

http://en.wikipedia.org/wiki/Chisini_mean

A really excellent source of these types of inequalities is Chapter 2 of Béla Bollobás's *Linear Analysis*. It summarizes quite a lot of Hardy, Littlewood and Pólya, but with rather more up-to-date notation.