@@ -20,24 +20,76 @@ kernelspec:
 </div>
 ```
 
-# Measuring Distance Between Distributions
+# Statistical Divergence Measures
 
 ```{contents} Contents
 :depth: 2
 ```
 
 ## Overview
 
-Divergence measures quantify the "distance" or dissimilarity between probability distributions.
+A statistical divergence is a function that quantifies the discrepancy between two distinct
+probability distributions that can be challenging to distinguish, for the following reason:
+
+* every event that has positive probability under one of the distributions also has positive probability under the other distribution
 
-It plays a fundamental role in statistics, information theory, and machine learning.
+* thus, there is no "smoking gun" event whose occurrence tells a statistician which of the two distributions surely governs the data, as the sketch below illustrates
 
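+To make the point concrete, here is a minimal numerical sketch; the two
+Poisson distributions and their rates below are illustrative choices, not
+taken from this lecture.
+
+```{code-cell} ipython3
+import numpy as np
+from scipy.stats import poisson
+
+# two pmfs with common support (the nonnegative integers);
+# the rates 3.0 and 5.0 are hypothetical values chosen for illustration
+f = poisson(3.0)
+g = poisson(5.0)
+
+x = np.arange(15)
+ratio = f.pmf(x) / g.pmf(x)   # likelihood ratio at each outcome
+
+# every outcome has positive probability under both distributions, so
+# every likelihood ratio is finite: no single draw is a "smoking gun"
+print(np.all(f.pmf(x) > 0) and np.all(g.pmf(x) > 0))
+print(ratio.min(), ratio.max())
+```
+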
-This lecture explores three fundamental divergence measures and their connections to later lectures:
+Statistical divergence functions play important roles in statistics, information theory, and what many people now call "machine learning".
+
+This lecture describes three divergence measures:
 
 * **Kullback–Leibler (KL) divergence**
 * **Jensen–Shannon (JS) divergence**
 * **Chernoff entropy**
 
+These will appear in several quantecon lectures.
+
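+As a quick, self-contained preview, here is a sketch that computes each of
+the three measures for a pair of illustrative pmfs; the numbers below are
+hypothetical, and the lecture develops each measure in detail.
+
+```{code-cell} ipython3
+import numpy as np
+from scipy.stats import entropy
+from scipy.spatial.distance import jensenshannon
+
+# two illustrative pmfs on four points (hypothetical values)
+f = np.array([0.4, 0.3, 0.2, 0.1])
+g = np.array([0.1, 0.2, 0.3, 0.4])
+
+kl = entropy(f, g)             # KL divergence D(f || g)
+js = jensenshannon(f, g)**2    # JS divergence (scipy returns its square root)
+
+# Chernoff entropy: -min over phi in (0, 1) of log(sum_x f(x)^phi g(x)^(1-phi))
+phis = np.linspace(0.01, 0.99, 99)
+chernoff = -np.min([np.log(np.sum(f**phi * g**(1 - phi))) for phi in phis])
+
+print(kl, js, chernoff)
+```
+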
 Let's start by importing the necessary Python tools.
 
 ```{code-cell} ipython3
@@ -103,7 +155,7 @@ plt.show()
 (rel_entropy)=
 ## Kullback–Leibler divergence
 
-The first measure is the **Kullback–Leibler (KL) divergence**.
+Our first divergence function is the **Kullback–Leibler (KL) divergence**.
 
 For probability densities (or pmfs) $f$ and $g$ it is defined by
 