Among things that we'll learn are
* A peculiar property of likelihood ratio processes
* How a likelihood ratio process is a key ingredient in frequentist hypothesis testing
* How a **receiver operating characteristic curve** summarizes information about a false alarm probability and power in frequentist hypothesis testing
* How a Bayesian statistician combines frequentist probabilities of type I and type II errors to form posterior probabilities of erroneous model selection or misclassification of individuals
* How during World War II the United States Navy devised a decision rule that Captain Garret L. Schyler challenged, a topic to be studied in {doc}`this lecture <wald_friedman>`

We now describe how a Bayesian statistician can combine frequentist probabilities of type I and type II errors in order to

* compute a posterior probability of selecting a wrong model
* compute an anticipated error rate in a classification problem

We consider a situation in which nature generates data by mixing known densities $f$ and $g$ with known mixing parameter $\pi_{-1} \in (0,1)$ so that the random variable $w$ is drawn from the density

$$
h(w) = \pi_{-1} f(w) + (1-\pi_{-1}) g(w)
$$

We'll often set $\pi_{-1} = .5$.

We assume that $f$ and $g$ both put positive probabilities on the same intervals of possible realizations of the random variable $W$.
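
To make the mixture concrete, here is a minimal sketch in Python; the particular Beta densities and their parameters are illustrative assumptions, not values fixed by this passage.

```python
# A minimal sketch of the mixture density h(w).
# The Beta parameters below are illustrative assumptions.
from scipy.stats import beta
from scipy.integrate import quad

F_a, F_b = 1, 1        # parameters of f (assumed)
G_a, G_b = 3, 1.2      # parameters of g (assumed)
π_minus1 = 0.5         # mixing parameter

f = lambda w: beta.pdf(w, F_a, F_b)
g = lambda w: beta.pdf(w, G_a, G_b)
h = lambda w: π_minus1 * f(w) + (1 - π_minus1) * g(w)

# h is itself a proper density: it integrates to one
print(quad(h, 0, 1)[0])   # ≈ 1.0
```
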
We consider two alternative timing protocols.

**Protocol 1:** Nature flips a coin once at time $t=-1$ and with probability $\pi_{-1}$ generates a sequence $\{w_t\}_{t=1}^T$ of IID draws from $f$ and with probability $1-\pi_{-1}$ generates a sequence $\{w_t\}_{t=1}^T$ of IID draws from $g$.

**Protocol 2:** At each time $t \geq 0$, nature flips a coin and with probability $\pi_{-1}$ draws $w_t$ from $f$ and with probability $1-\pi_{-1}$ draws $w_t$ from $g$.

**Remark:** Under protocol 2, $\{w_t\}_{t=1}^T$ is a sequence of IID draws from $h(w)$. Under protocol 1, $\{w_t\}_{t=1}^T$ is not IID. It is **conditionally IID** -- meaning that with probability $\pi_{-1}$ it is a sequence of IID draws from $f(w)$ and with probability $1-\pi_{-1}$ it is a sequence of IID draws from $g(w)$. For more about this, see {doc}`this lecture about exchangeability <exchangeable>`.
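
The following minimal simulation sketch contrasts the two protocols; the Beta densities, seed, and sample sizes are illustrative assumptions carried over from the sketch above.

```python
# Sketch of the two timing protocols.
import numpy as np

rng = np.random.default_rng(1234)
F_a, F_b = 1, 1        # parameters of f (assumed)
G_a, G_b = 3, 1.2      # parameters of g (assumed)
π_minus1, T, N = 0.5, 50, 1000

# Protocol 1: one coin flip at t = -1 selects f or g for a whole path
nature_picks_f = rng.random(N) < π_minus1
paths_1 = np.where(nature_picks_f[:, None],
                   rng.beta(F_a, F_b, (N, T)),
                   rng.beta(G_a, G_b, (N, T)))

# Protocol 2: a fresh coin flip at every t, so each w_t is IID from h
coin_flips = rng.random((N, T)) < π_minus1
paths_2 = np.where(coin_flips,
                   rng.beta(F_a, F_b, (N, T)),
                   rng.beta(G_a, G_b, (N, T)))

# Unconditional moments match, but protocol 1 paths are only
# conditionally IID: each path came entirely from f or entirely from g
print(paths_1.mean(), paths_2.mean())
```
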
We again deploy a **likelihood ratio process** with time $t$ component being the likelihood ratio

$$
\ell(w_t) = \frac{f(w_t)}{g(w_t)}, \quad t \geq 1,
$$

so that, as before, $L(w^t) = \prod_{i=1}^{t} \ell(w_i)$.
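
Here is a sketch of computing one realization of the likelihood ratio process, again under the assumed Beta densities:

```python
# Sketch: likelihood ratio process L(w^t) along one path drawn from f.
# Beta parameters are illustrative assumptions.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1234)
F_a, F_b = 1, 1
G_a, G_b = 3, 1.2
T = 50

w = rng.beta(F_a, F_b, T)                            # one path from f
ell = beta.pdf(w, F_a, F_b) / beta.pdf(w, G_a, G_b)  # ℓ(w_t)
L = np.cumprod(ell)                                  # L(w^t) = ∏ ℓ(w_i)
print(L[-1])   # typically large when f generated the data
```
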
We can form a Bayesian posterior probability that the likelihood ratio process selects the wrong model by assigning prior probability $\pi_{-1} = .5$ to each model and then averaging $p_f$ and $p_g$, the frequentist probabilities of selecting the wrong model when the data are actually generated by $f$ and by $g$, respectively. The Bayesian posterior probability of a detection error is then

$$
\frac{1}{2} \left( p_f + p_g \right).
$$
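
The following Monte Carlo sketch estimates this posterior detection-error probability; the densities, sample sizes, and the selection rule that picks $f$ whenever $L(w^T) > 1$ are illustrative assumptions:

```python
# Monte Carlo sketch of the posterior detection-error probability
# 0.5 * (p_f + p_g).  Densities, sample sizes, and the unit cutoff
# for the selection rule are all illustrative assumptions.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(1234)
F_a, F_b = 1, 1
G_a, G_b = 3, 1.2
T, N = 50, 10_000

def log_L(w):
    """log L(w^T) for a batch of paths w with shape (N, T)."""
    return (beta.logpdf(w, F_a, F_b) - beta.logpdf(w, G_a, G_b)).sum(axis=1)

w_f = rng.beta(F_a, F_b, (N, T))   # paths actually generated by f
w_g = rng.beta(G_a, G_b, (N, T))   # paths actually generated by g

p_f = np.mean(log_L(w_f) < 0)      # wrongly select g when f is true
p_g = np.mean(log_L(w_g) > 0)      # wrongly select f when g is true

print(0.5 * (p_f + p_g))           # posterior detection-error probability
```
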
Likelihood processes play an important role in Bayesian learning, as described in {doc}`this lecture <likelihood_bayes>` and as applied in {doc}`this lecture <odu>`.

Likelihood ratio processes appear again in [this lecture](https://python-advanced.quantecon.org/additive_functionals.html), which contains another illustration of the **peculiar property** of likelihood ratio processes described above.