Commit 3e7ef36 (1 parent: 4c4357b)

Commit message: update lecrture

6 files changed: 161 additions & 97 deletions

File tree

Two binary files changed (contents not shown): -1.56 KB and -2.24 KB.

lectures/_static/lecture_specific/wald_friedman/wald_dec_rule.tex

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@
 \node[below, outer sep=5pt] at (a1){$B$};
 \node[circle, draw, thin, blue, fill=white!10, scale=0.45] at (a2){};
 \node[below, outer sep=5pt] at (a2){$A$};
-\node[below, outer sep=25pt] at (1.5, 0){values of $\log(L_m)$};
+\node[below, outer sep=25pt] at (1.5, 0){value of $L_m$};
 \end{tikzpicture}

 \end{document}

lectures/likelihood_ratio_process.md

Lines changed: 1 addition & 0 deletions
@@ -544,6 +544,7 @@ control tests during World War II.
 A Navy Captain who had been ordered to perform tests of this kind had doubts about it that he
 presented to Milton Friedman, as we describe in {doc}`this lecture <wald_friedman>`.

+(rel_entropy)=
 ## Kullback–Leibler Divergence

 Now let’s consider a case in which neither $g$ nor $f$
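The added `(rel_entropy)=` target labels the lecture's Kullback–Leibler divergence section. As a quick reminder of the quantity that section studies, the KL divergence of density $g$ from density $f$ is $D(f \| g) = \int f(x) \log\bigl(f(x)/g(x)\bigr)\,dx$. A minimal numerical sketch (not the lecture's own code; the Beta-density choice and the `kl_divergence` helper here are illustrative assumptions):

```python
# Numerically approximate the Kullback–Leibler divergence
#   D(f || g) = ∫ f(x) log(f(x) / g(x)) dx
# between two densities on (0, 1).  The Beta(1, 1) and Beta(3, 1.2)
# parameters below are illustrative, not taken from the commit itself.
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

def kl_divergence(f, g):
    """Approximate D(f || g) for two densities supported on (0, 1)."""
    integrand = lambda x: f(x) * np.log(f(x) / g(x))
    # Trim the endpoints slightly to avoid evaluating log at a zero density
    value, _ = quad(integrand, 1e-8, 1 - 1e-8)
    return value

f = beta(1, 1).pdf      # uniform density on (0, 1)
g = beta(3, 1.2).pdf

print(kl_divergence(f, g))   # strictly positive, since f differs from g
print(kl_divergence(f, f))   # essentially zero: D(f || f) = 0
```

KL divergence is always nonnegative and equals zero only when the two densities agree almost everywhere, which is why it serves as the discrepancy measure in the lecture's analysis of likelihood ratio processes.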

0 commit comments
