Why is Kullback-Leibler divergence not a distance?

The Kullback-Leibler divergence between two probability distributions measures how different the two distributions are. It is sometimes called a distance, but it is not a distance in the usual sense because it is not symmetric: the divergence from P to Q generally differs from the divergence from Q to P. At first this asymmetry may seem like a bug, but it's a feature. We'll explain why it's useful to measure …
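A quick numerical check makes the asymmetry concrete. The sketch below (the distributions `p` and `q` are made up for illustration) computes the discrete KL divergence in both directions and shows the two values differ:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    Assumes p and q are strictly positive and sum to 1.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Two example distributions over three outcomes (arbitrary values)
p = np.array([0.5, 0.4, 0.1])
q = np.array([0.3, 0.3, 0.4])

print(kl_divergence(p, q))  # D(P || Q) ≈ 0.2319
print(kl_divergence(q, p))  # D(Q || P) ≈ 0.3150 — not equal, so KL is not symmetric
```

Note that the divergence of a distribution from itself is zero, as a distance would require, but symmetry fails, so KL divergence is not a metric.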

