The previous post looked at the best approximation to a normal density by a normal density with a different mean. Dan Piponi suggested in the comments that it would be good to look at the Kullback-Leibler (KL) divergence.

The previous post looked at the difference between two densities from an analytic perspective, solving the problem that an analyst would find natural. This post takes an information theoretic perspective. Just as *p*-norms are natural in analysis, KL divergence is natural in information theory.

The Kullback-Leibler divergence between two random variables *X* and *Y*, with density functions f_{X} and f_{Y}, is defined as

*KL*(*X* || *Y*) = ∫ f_{X}(x) log( f_{X}(x) / f_{Y}(x) ) dx

There are many ways to interpret *KL*(*X* || *Y*), such as the average surprise in seeing *Y* when you expected *X*.

Unlike the *p*-norm distance, the KL divergence between two normal random variables can be computed in closed form.

Let *X* be a normal random variable with mean μ_{X} and variance σ²_{X} and *Y* a normal random variable with mean μ_{Y} and variance σ²_{Y}. Then

*KL*(*X* || *Y*) = log(σ_{Y}/σ_{X}) + (σ²_{X} + (μ_{X} − μ_{Y})²) / (2σ²_{Y}) − 1/2

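As a sanity check on the closed form above, here is a small sketch (standard library only; the function names are mine) that compares it against a direct numerical integration of the defining integral ∫ f_{X} log(f_{X}/f_{Y}) dx:

```python
from math import log, pi, exp, sqrt

def kl_normal(mu_x, var_x, mu_y, var_y):
    """Closed-form KL(X || Y) for X ~ N(mu_x, var_x), Y ~ N(mu_y, var_y)."""
    return (0.5 * log(var_y / var_x)
            + (var_x + (mu_x - mu_y) ** 2) / (2 * var_y)
            - 0.5)

def kl_numeric(mu_x, var_x, mu_y, var_y, lo=-40.0, hi=40.0, n=200_000):
    """Midpoint-rule integration of f_X(x) log(f_X(x) / f_Y(x))."""
    def pdf(x, mu, var):
        return exp(-(x - mu) ** 2 / (2 * var)) / sqrt(2 * pi * var)
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        fx = pdf(x, mu_x, var_x)
        if fx > 0.0:  # skip points where the integrand has underflowed to 0
            total += fx * log(fx / pdf(x, mu_y, var_y)) * dx
    return total
```

For example, `kl_normal(0, 1, 1, 2)` and `kl_numeric(0, 1, 1, 2)` agree to several decimal places.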
If μ_{X} = 0 and σ_{X} = 1, then for fixed μ_{Y} the value of σ²_{Y} that minimizes *KL*(*X* || *Y*) is

σ²_{Y} = 1 + μ²_{Y}
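We can confirm this minimizer numerically. The sketch below (names are mine, not from the post) specializes the closed form to μ_{X} = 0, σ_{X} = 1 and minimizes over a grid of variances:

```python
from math import log

def kl_std_to(mu_y, var_y):
    # KL(X || Y) with X standard normal:
    # (1/2) log(var_y) + (1 + mu_y**2) / (2 * var_y) - 1/2
    return 0.5 * log(var_y) + (1 + mu_y ** 2) / (2 * var_y) - 0.5

mu = 1.5
grid = [v / 1000 for v in range(1, 20001)]        # candidate var_y in (0, 20]
best = min(grid, key=lambda v: kl_std_to(mu, v))  # numerical minimizer
# compare against the closed-form prediction 1 + mu**2
```

With `mu = 1.5` the grid search lands on 3.25, matching 1 + μ²_{Y}.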

KL divergence is not symmetric, hence we say divergence rather than distance. More on that here. If we want to solve the opposite problem, minimizing *KL*(*Y* || *X*), the optimal value of σ²_{Y} is simply 1.
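The asymmetry is easy to see numerically. Reversing the roles in the closed form gives KL(*Y* || *X*) = −(1/2) log σ²_{Y} + (σ²_{Y} + μ²_{Y})/2 − 1/2, and a grid search (a sketch with names of my choosing) finds the minimizing variance is 1 no matter what μ_{Y} is:

```python
from math import log

def kl_to_std(mu_y, var_y):
    # KL(Y || X) with X standard normal:
    # -(1/2) log(var_y) + (var_y + mu_y**2) / 2 - 1/2
    return -0.5 * log(var_y) + (var_y + mu_y ** 2) / 2 - 0.5

grid = [v / 1000 for v in range(1, 20001)]  # candidate var_y in (0, 20]
# the minimizing var_y should be 1 for every choice of mu_y
minimizers = [min(grid, key=lambda v: kl_to_std(mu, v)) for mu in (0.0, 1.5, 3.0)]
```

Each entry of `minimizers` comes out to 1, independent of the mean, in contrast with the 1 + μ²_{Y} result for the other direction.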