Commit ab95621b authored by Gerwin Klein

merge from afp-2021

@@ -10240,3 +10240,29 @@ abstract =
modeling the behavior of perfect logicians and formalize a solution of
the puzzle.
 
[Laws_of_Large_Numbers]
title = The Laws of Large Numbers
author = Manuel Eberl <https://www21.in.tum.de/~eberlm>
topic = Mathematics/Probability theory
date = 2021-02-10
notify = eberlm@in.tum.de
abstract =
<p>The Law of Large Numbers states that, informally, if one
performs a random experiment $X$ many times and takes the average of
the results, that average will be very close to the expected value
$E[X]$.</p> <p> More formally, let
$(X_i)_{i\in\mathbb{N}}$ be a sequence of independent and identically
distributed random variables whose expected value $E[X_1]$ exists.
Denote the running average of $X_1, \ldots, X_n$ as $\overline{X}_n$.
Then:</p> <ul> <li>The Weak Law of Large Numbers
states that $\overline{X}_{n} \longrightarrow E[X_1]$ in probability
for $n\to\infty$, i.e. $\mathcal{P}(|\overline{X}_{n} - E[X_1]| >
\varepsilon) \longrightarrow 0$ as $n\to\infty$ for any $\varepsilon
> 0$.</li> <li>The Strong Law of Large Numbers states
that $\overline{X}_{n} \longrightarrow E[X_1]$ almost surely for
$n\to\infty$, i.e. $\mathcal{P}(\overline{X}_{n} \longrightarrow
E[X_1]) = 1$.</li> </ul> <p>In this entry, I
formally prove the strong law and from it the weak law. The approach
used for the proof of the strong law is a particularly quick and slick
one based on ergodic theory, which was formalised by Gouëzel in
another AFP entry.</p>
(*
File: Laws_of_Large_Numbers.thy
Author: Manuel Eberl, TU München
*)
section \<open>The Laws of Large Numbers\<close>
theory Laws_of_Large_Numbers
imports Ergodic_Theory.Shift_Operator
begin
text \<open>
We prove the strong law of large numbers in the following form: Let $(X_i)_{i\in\mathbb{N}}$
be a sequence of i.i.d. random variables over a probability space \<open>M\<close>. Further assume that
the expected value $E[X_0]$ of $X_0$ exists. Then the sequence of random variables
\[\overline{X}_n = \frac{1}{n} \sum_{i=0}^{n-1} X_i\]
of running averages almost surely converges to $E[X_0]$.
This means that
\[\mathcal{P}[\overline{X}_n \longrightarrow E[X_0]] = 1\ .\]
We start with the strong law.
\<close>
subsection \<open>The strong law\<close>
text \<open>
The proof uses Birkhoff's Theorem from Gouëzel's formalisation of ergodic theory~\cite{gouezel}
and the fact that the shift operator $T(x_1, x_2, x_3, \ldots) = (x_2, x_3, \ldots)$ is ergodic.
This proof can be found in various textbooks on probability theory/ergodic
theory, e.g. the ones by Krengel~\cite[p.~24]{krengel} and
Simonnet~\cite[Chapter 15, pp.~311--325]{Simonnet1996}.
\<close>
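(* Informal sketch only; the formal argument is carried out in the proof of the theorem below. *)
text \<open>
  Concretely, Birkhoff's pointwise ergodic theorem states that for an ergodic
  measure-preserving map $T$ on a probability space and any integrable $f$,
  \[\frac{1}{n} \sum_{k=0}^{n-1} f(T^k \omega) \longrightarrow E[f]
    \quad\text{almost surely as } n \to \infty.\]
  We instantiate this with the countable product of copies of the distribution of $X_0$,
  the shift operator $T$, and the projection $f(\omega) = \omega_0$. The Birkhoff sum then
  becomes $\sum_{k<n} \omega_k$, so the running averages converge almost surely to
  $E[\omega_0] = E[X_0]$, which is precisely the strong law.
\<close>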
theorem (in prob_space) strong_law_of_large_numbers_iid:
fixes X :: "nat \<Rightarrow> 'a \<Rightarrow> real"
assumes indep: "indep_vars (\<lambda>_. borel) X UNIV"
assumes distr: "\<And>i. distr M borel (X i) = distr M borel (X 0)"
assumes L1: "integrable M (X 0)"
shows "AE x in M. (\<lambda>n. (\<Sum>i<n. X i x) / n) \<longlonglongrightarrow> expectation (X 0)"
proof -
text \<open>
We adopt a more explicit view of \<^term>\<open>M\<close> as a countably infinite product of i.i.d.
random variables, indexed by the natural numbers:
\<close>
define M' :: "(nat \<Rightarrow> real) measure" where "M' = Pi\<^sub>M UNIV (\<lambda>i. distr M borel (X i))"
have [measurable]: "random_variable borel (X i)" for i
using indep by (auto simp: indep_vars_def)
have M'_eq: "M' = distr M (Pi\<^sub>M UNIV (\<lambda>i. borel)) (\<lambda>x. \<lambda>i\<in>UNIV. X i x)"
using indep unfolding M'_def by (subst (asm) indep_vars_iff_distr_eq_PiM) auto
have space_M': "space M' = UNIV"
by (simp add: M'_def space_PiM)
have sets_M' [measurable_cong]: "sets M' = sets (Pi\<^sub>M UNIV (\<lambda>i. borel))"
by (simp add: M'_eq)
interpret M': prob_space M'
unfolding M'_eq by (intro prob_space_distr) auto
text \<open>We introduce a shift operator that forgets the first variable in the sequence.\<close>
define T :: "(nat \<Rightarrow> real) \<Rightarrow> (nat \<Rightarrow> real)" where
"T = (\<lambda>f. f \<circ> Suc)"
have funpow_T: "(T ^^ i) = (\<lambda>f. f \<circ> (\<lambda>n. n + i))" for i
by (induction i) (auto simp: T_def)
interpret T: shift_operator_ergodic "distr M borel (X 0)" T M'
proof -
interpret X0: prob_space "distr M borel (X 0)"
by (rule prob_space_distr) auto
show "shift_operator_ergodic (distr M borel (X 0))"
by unfold_locales
show "M' \<equiv> Pi\<^sub>M UNIV (\<lambda>_. distr M borel (X 0)) "
unfolding M'_def by (subst distr)
qed (simp_all add: T_def)
have [intro]: "integrable M' (\<lambda>f. f 0)"
unfolding M'_eq by (subst integrable_distr_eq) (use L1 in auto)
have "AE f in M'. (\<lambda>n. T.birkhoff_sum (\<lambda>f. f 0) n f / real n)
\<longlonglongrightarrow> real_cond_exp M' T.Invariants (\<lambda>f. f 0) f"
by (rule T.birkhoff_theorem_AE_nonergodic) auto
moreover have "AE x in M'. real_cond_exp M' T.Invariants (\<lambda>f. f 0) x =
M'.expectation (\<lambda>f. f 0) / M'.prob (space M')"
by (intro T.Invariants_cond_exp_is_integral_fmpt) auto
ultimately have "AE f in M'. (\<lambda>n. T.birkhoff_sum (\<lambda>f. f 0) n f / real n)
\<longlonglongrightarrow> M'.expectation (\<lambda>f. f 0)"
by eventually_elim (simp_all add: M'.prob_space)
also have "M'.expectation (\<lambda>f. f 0) = expectation (X 0)"
unfolding M'_eq by (subst integral_distr) simp_all
also have "T.birkhoff_sum (\<lambda>f. f 0) = (\<lambda>n f. sum f {..<n})"
by (intro ext) (simp_all add: T.birkhoff_sum_def funpow_T)
finally show ?thesis
unfolding M'_eq by (subst (asm) AE_distr_iff) simp_all
qed
subsection \<open>The weak law\<close>
text \<open>
To go from the strong law to the weak one, we need the fact that almost sure convergence
implies convergence in probability. We prove this for sequences of random variables here.
\<close>
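(* Informal sketch only; the sets A and B below are exactly those used in the formal proof. *)
text \<open>
  The argument is the standard one: let $A_i = \{x \mid |X_i(x) - Y(x)| > \varepsilon\}$
  and $B_n = \bigcup_{i \geq n} A_i$. Almost-sure convergence means that almost every $x$
  lies in only finitely many of the $A_i$, i.e.\ $\mathcal{P}\bigl(\bigcap_n B_n\bigr) = 0$.
  Since the $B_n$ are decreasing, continuity from above gives
  $\mathcal{P}(B_n) \longrightarrow 0$, and $\mathcal{P}(A_n) \leq \mathcal{P}(B_n)$
  yields the claim.
\<close>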
lemma (in prob_space) AE_convergence_imp_convergence_in_prob:
assumes [measurable]: "\<And>i. random_variable borel (X i)" "random_variable borel Y"
assumes AE: "AE x in M. (\<lambda>i. X i x) \<longlonglongrightarrow> Y x"
assumes "\<epsilon> > (0 :: real)"
shows "(\<lambda>i. prob {x\<in>space M. \<bar>X i x - Y x\<bar> > \<epsilon>}) \<longlonglongrightarrow> 0"
proof -
define A where "A = (\<lambda>i. {x\<in>space M. \<bar>X i x - Y x\<bar> > \<epsilon>})"
define B where "B = (\<lambda>n. (\<Union>i\<in>{n..}. A i))"
have [measurable]: "A i \<in> sets M" "B i \<in> sets M" for i
unfolding A_def B_def by measurable
have "AE x in M. x \<notin> (\<Inter>i. B i)"
using AE unfolding B_def A_def
by eventually_elim
(use \<open>\<epsilon> > 0\<close> in \<open>fastforce simp: tendsto_iff dist_norm eventually_at_top_linorder\<close>)
hence "(\<Inter>i. B i) \<in> null_sets M"
by (subst AE_iff_null_sets) auto
show "(\<lambda>i. prob (A i)) \<longlonglongrightarrow> 0"
proof (rule Lim_null_comparison)
have "(\<lambda>i. prob (B i)) \<longlonglongrightarrow> prob (\<Inter>i. B i)"
proof (rule finite_Lim_measure_decseq)
show "decseq B"
by (rule decseq_SucI) (force simp: B_def)
qed auto
also have "prob (\<Inter>i. B i) = 0"
using \<open>(\<Inter>i. B i) \<in> null_sets M\<close> by (simp add: measure_eq_0_null_sets)
finally show "(\<lambda>i. prob (B i)) \<longlonglongrightarrow> 0" .
next
have "prob (A n) \<le> prob (B n)" for n
unfolding B_def by (intro finite_measure_mono) auto
thus "\<forall>\<^sub>F n in at_top. norm (prob (A n)) \<le> prob (B n)"
by (intro always_eventually) auto
qed
qed
text \<open>
The weak law is now a simple corollary: we again have the same setting as before. The weak
law now states that $\overline{X}_n$ converges to $E[X_0]$ in probability. This means that
for any \<open>\<epsilon> > 0\<close>, the probability that $|\overline{X}_n - E[X_0]| > \varepsilon$ vanishes as
\<open>n \<rightarrow> \<infinity>\<close>.
\<close>
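(* Informal sketch only; it merely spells out how the corollary below instantiates the
   preceding lemma. *)
text \<open>
  Formally, we apply the preceding lemma with the running averages $\overline{X}_n$ as the
  sequence and the constant random variable $E[X_0]$ as the limit: the strong law provides
  the required almost-sure convergence, and the lemma turns it into
  $\mathcal{P}(|\overline{X}_n - E[X_0]| > \varepsilon) \longrightarrow 0$.
\<close>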
corollary (in prob_space) weak_law_of_large_numbers_iid:
fixes X :: "nat \<Rightarrow> 'a \<Rightarrow> real" and \<epsilon> :: real
assumes indep: "indep_vars (\<lambda>_. borel) X UNIV"
assumes distr: "\<And>i. distr M borel (X i) = distr M borel (X 0)"
assumes L1: "integrable M (X 0)"
assumes "\<epsilon> > 0"
shows "(\<lambda>n. prob {x\<in>space M. \<bar>(\<Sum>i<n. X i x) / n - expectation (X 0)\<bar> > \<epsilon>}) \<longlonglongrightarrow> 0"
proof (rule AE_convergence_imp_convergence_in_prob)
show "AE x in M. (\<lambda>n. (\<Sum>i<n. X i x) / n) \<longlonglongrightarrow> expectation (X 0)"
by (rule strong_law_of_large_numbers_iid) fact+
next
have [measurable]: "random_variable borel (X i)" for i
using indep by (auto simp: indep_vars_def)
show "random_variable borel (\<lambda>x. (\<Sum>i<n. X i x) / real n)" for n
by measurable
qed (use \<open>\<epsilon> > 0\<close> in simp_all)
end
\ No newline at end of file
(*
File: Laws_of_Large_Numbers.thy
Author: Manuel Eberl, TU München
*)
subsection \<open>Example\<close>
theory Laws_of_Large_Numbers_Example
imports Laws_of_Large_Numbers
begin
text \<open>
As an example, we apply the strong law to the proportion of successes in an independent sequence
of coin flips with success probability \<open>p\<close>. We will show that the proportion of successful coin
flips among the first \<open>n\<close> attempts almost surely converges to \<open>p\<close> as \<open>n \<rightarrow> \<infinity>\<close>.
\<close>
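(* Informal side calculation only; it refers to the definitions of M and X given further below. *)
text \<open>
  With $X_i(f) = 1$ if the $i$-th flip $f\,i$ succeeds and $X_i(f) = 0$ otherwise, we have
  $E[X_0] = p \cdot 1 + (1 - p) \cdot 0 = p$ and
  $\sum_{i<n} X_i(f) = |\{i < n \mid f\,i\}|$, so the strong law specialises exactly to the
  statement about the proportion of successes.
\<close>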
(* TODO: Move *)
lemma (in prob_space) indep_vars_iff_distr_eq_PiM':
fixes I :: "'i set" and X :: "'i \<Rightarrow> 'a \<Rightarrow> 'b"
assumes "I \<noteq> {}"
assumes rv: "\<And>i. i \<in> I \<Longrightarrow> random_variable (M' i) (X i)"
shows "indep_vars M' X I \<longleftrightarrow>
distr M (\<Pi>\<^sub>M i\<in>I. M' i) (\<lambda>x. \<lambda>i\<in>I. X i x) = (\<Pi>\<^sub>M i\<in>I. distr M (M' i) (X i))"
proof -
from assms obtain j where j: "j \<in> I"
by auto
define N' where "N' = (\<lambda>i. if i \<in> I then M' i else M' j)"
define Y where "Y = (\<lambda>i. if i \<in> I then X i else X j)"
have rv: "random_variable (N' i) (Y i)" for i
using j by (auto simp: N'_def Y_def intro: assms)
have "indep_vars M' X I = indep_vars N' Y I"
by (intro indep_vars_cong) (auto simp: N'_def Y_def)
also have "\<dots> \<longleftrightarrow> distr M (\<Pi>\<^sub>M i\<in>I. N' i) (\<lambda>x. \<lambda>i\<in>I. Y i x) = (\<Pi>\<^sub>M i\<in>I. distr M (N' i) (Y i))"
by (intro indep_vars_iff_distr_eq_PiM rv assms)
also have "(\<Pi>\<^sub>M i\<in>I. N' i) = (\<Pi>\<^sub>M i\<in>I. M' i)"
by (intro PiM_cong) (simp_all add: N'_def)
also have "(\<lambda>x. \<lambda>i\<in>I. Y i x) = (\<lambda>x. \<lambda>i\<in>I. X i x)"
by (simp_all add: Y_def fun_eq_iff)
also have "(\<Pi>\<^sub>M i\<in>I. distr M (N' i) (Y i)) = (\<Pi>\<^sub>M i\<in>I. distr M (M' i) (X i))"
by (intro PiM_cong distr_cong) (simp_all add: N'_def Y_def)
finally show ?thesis .
qed
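(* Informal reading of the preceding lemma, restated in textbook notation. *)
text \<open>
  In textbook terms: for a nonempty index set $I$, the family $(X_i)_{i\in I}$ is
  independent if and only if its joint distribution, i.e.\ the pushforward of
  $\mathcal{P}$ under $x \mapsto (X_i(x))_{i\in I}$, equals the product
  $\bigotimes_{i\in I} \mathcal{P} \circ X_i^{-1}$ of the marginal distributions.
\<close>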
(* TODO: Move *)
lemma indep_vars_PiM_components:
assumes "\<And>i. i \<in> A \<Longrightarrow> prob_space (M i)"
shows "prob_space.indep_vars (PiM A M) M (\<lambda>i f. f i) A"
proof (cases "A = {}")
case False
have "distr (Pi\<^sub>M A M) (Pi\<^sub>M A M) (\<lambda>x. restrict x A) = distr (Pi\<^sub>M A M) (Pi\<^sub>M A M) (\<lambda>x. x)"
by (intro distr_cong) (auto simp: restrict_def space_PiM PiE_def extensional_def Pi_def)
also have "\<dots> = Pi\<^sub>M A M"
by simp
also have "\<dots> = Pi\<^sub>M A (\<lambda>i. distr (Pi\<^sub>M A M) (M i) (\<lambda>f. f i))"
by (intro PiM_cong refl, subst distr_PiM_component) (auto simp: assms)
finally show ?thesis
by (subst prob_space.indep_vars_iff_distr_eq_PiM') (simp_all add: prob_space_PiM assms False)
next
case True
interpret prob_space "PiM A M"
by (intro prob_space_PiM assms)
show ?thesis
unfolding indep_vars_def indep_sets_def by (auto simp: True)
qed
(* TODO: Move *)
lemma indep_vars_PiM_components':
assumes "\<And>i. i \<in> A \<Longrightarrow> prob_space (M i)"
assumes "\<And>i. i \<in> A \<Longrightarrow> g i \<in> M i \<rightarrow>\<^sub>M N i"
shows "prob_space.indep_vars (PiM A M) N (\<lambda>i f. g i (f i)) A"
by (rule prob_space.indep_vars_compose2[OF prob_space_PiM indep_vars_PiM_components])
(use assms in simp_all)
(* TODO: Move *)
lemma integrable_bernoulli_pmf [intro]:
fixes f :: "bool \<Rightarrow> 'a :: {banach, second_countable_topology}"
shows "integrable (bernoulli_pmf p) f"
by (rule integrable_measure_pmf_finite) auto
(* TODO: Move *)
lemma expectation_bernoulli_pmf:
fixes f :: "bool \<Rightarrow> 'a :: {banach, second_countable_topology}"
assumes p: "p \<in> {0..1}"
shows "measure_pmf.expectation (bernoulli_pmf p) f = p *\<^sub>R f True + (1 - p) *\<^sub>R f False"
using p by (subst integral_measure_pmf[of UNIV]) (auto simp: UNIV_bool)
experiment
fixes p :: real
assumes p: "p \<in> {0..1}"
begin
definition M :: "(nat \<Rightarrow> bool) measure"
where "M = (\<Pi>\<^sub>M i\<in>(UNIV :: nat set). measure_pmf (bernoulli_pmf p))"
definition X :: "nat \<Rightarrow> (nat \<Rightarrow> bool) \<Rightarrow> real"
where "X = (\<lambda>i f. if f i then 1 else 0)"
interpretation prob_space M
unfolding M_def by (intro prob_space_PiM measure_pmf.prob_space_axioms)
lemma random_variable_component: "random_variable (count_space UNIV) (\<lambda>f. f i)"
unfolding X_def M_def by measurable
lemma random_variable_X [measurable]: "random_variable borel (X i)"
unfolding X_def M_def by measurable
lemma distr_M_component: "distr M (count_space UNIV) (\<lambda>f. f i) = measure_pmf (bernoulli_pmf p)"
proof -
have "distr M (count_space UNIV) (\<lambda>f. f i) = distr M (measure_pmf (bernoulli_pmf p)) (\<lambda>f. f i)"
by (rule distr_cong) auto
also have "\<dots> = measure_pmf (bernoulli_pmf p)"
unfolding M_def by (subst distr_PiM_component) (simp_all add: measure_pmf.prob_space_axioms)
finally show ?thesis .
qed
lemma distr_M_X:
"distr M borel (X i) = distr (measure_pmf (bernoulli_pmf p)) borel (\<lambda>b. if b then 1 else 0)"
proof -
have "distr M borel (X i) = distr (distr M (count_space UNIV) (\<lambda>f. f i))
borel (\<lambda>b. if b then 1 else 0 :: real)"
by (subst distr_distr) (auto simp: M_def X_def o_def)
also note distr_M_component[of i]
finally show ?thesis
by simp
qed
lemma X_has_expectation: "integrable M (X 0)"
proof -
have "integrable (bernoulli_pmf p) (\<lambda>b. if b then 1 else 0 :: real)"
by auto
also have "measure_pmf (bernoulli_pmf p) = distr M (count_space UNIV) (\<lambda>f. f 0)"
by (simp add: distr_M_component)
also have "integrable \<dots> (\<lambda>b. if b then 1 else 0 :: real) = integrable M (X 0)"
unfolding X_def using random_variable_component by (subst integrable_distr_eq) auto
finally show ?thesis .
qed
lemma indep: "indep_vars (\<lambda>_. borel) X UNIV"
unfolding M_def X_def
by (rule indep_vars_PiM_components') (simp_all add: measure_pmf.prob_space_axioms)
lemma expectation_X: "expectation (X i) = p"
proof -
have "expectation (X i) =
lebesgue_integral (distr M (count_space UNIV) (\<lambda>f. f i)) (\<lambda>b. if b then 1 else 0 :: real)"
by (subst integral_distr) (simp_all add: random_variable_component X_def)
also have "distr M (count_space UNIV) (\<lambda>x. x i) = measure_pmf (bernoulli_pmf p)"
by (rule distr_M_component)
also have "measure_pmf.expectation (bernoulli_pmf p) (\<lambda>b. if b then 1 else 0 :: real) = p"
using p by (subst integral_bernoulli_pmf) auto
finally show ?thesis .
qed
theorem "AE f in M. (\<lambda>n. card {i. i < n \<and> f i} / n) \<longlonglongrightarrow> p"
proof -
have "AE f in M. (\<lambda>n. (\<Sum>i<n. X i f) / real n) \<longlonglongrightarrow> expectation (X 0)"
by (rule strong_law_of_large_numbers_iid)
(use indep X_has_expectation in \<open>simp_all add: distr_M_X\<close>)
also have "expectation (X 0) = p"
by (simp add: expectation_X)
also have "(\<lambda>x n. \<Sum>i<n. X i x) = (\<lambda>x n. \<Sum>i\<in>{i\<in>{..<n}. x i}. 1)"
by (intro ext sum.mono_neutral_cong_right) (auto simp: X_def)
also have "\<dots> = (\<lambda>x n. real (card {i. i < n \<and> x i}))"
by simp
finally show ?thesis .
qed
end
end
\ No newline at end of file
chapter AFP
session "Laws_of_Large_Numbers" (AFP) = "Ergodic_Theory" +
options [timeout = 600]
theories
Laws_of_Large_Numbers
Laws_of_Large_Numbers_Example
document_files
"root.tex"
"root.bib"
@book{Simonnet1996,
author="Simonnet, Michel",
title="Measures and Probabilities",
year="1996",
publisher="Springer New York",
address="New York, NY",
isbn="978-1-4612-4012-9",
doi="10.1007/978-1-4612-4012-9_15",
}
@article{gouezel,
author = {Sébastien Gouëzel},
title = {Ergodic Theory},
journal = {Archive of Formal Proofs},
month = dec,
year = 2015,
note = {\url{https://isa-afp.org/entries/Ergodic_Theory.html},
Formal proof development},
ISSN = {2150-914x},
}
@book{krengel,
doi = {10.1515/9783110844641},
year = {1985},
month = jan,
publisher = {De Gruyter},
author = {Ulrich Krengel},
title = {Ergodic Theorems},
pages = {24}
}
\documentclass[11pt,a4paper]{article}
\usepackage{isabelle,isabellesym}
\usepackage{mathtools}
\usepackage{amssymb}
\usepackage{stmaryrd}
\usepackage[numbers]{natbib}
% this should be the last package used
\usepackage{pdfsetup}
\usepackage{doi}
% urls in roman style, theory text in math-similar italics
\urlstyle{rm}
\isabellestyle{it}
\DeclarePairedDelimiter{\norm}{\lVert}{\rVert}
\begin{document}
\nocite{Simonnet1996}
\nocite{krengel}
\title{The Laws of Large Numbers}
\author{Manuel Eberl}
\date{}
\maketitle
\begin{abstract}
The Law of Large Numbers states that, informally, if one performs a random experiment $X$ many times and takes the average of the results, that average will be very close to the expected value $E[X]$.
More formally, let $(X_i)_{i\in\mathbb{N}}$ be a sequence of independent and identically distributed random variables whose expected value $E[X_1]$ exists. Denote the running average of $X_1, \ldots, X_n$ by $\overline{X}_n$. Then:
\begin{itemize}
\item The Weak Law of Large Numbers states that $\overline{X}_{\!n} \longrightarrow E[X_1]$ in probability for $n\to\infty$, i.e. $\mathcal{P}(|\overline{X}_{\!n} - E[X_1]| > \varepsilon) \longrightarrow 0$ as $n\to\infty$ for any $\varepsilon > 0$.
\item The Strong Law of Large Numbers states that $\overline{X}_{\!n} \longrightarrow E[X_1]$ almost surely for $n\to\infty$, i.e. $\mathcal{P}(\overline{X}_{\!n} \longrightarrow E[X_1]) = 1$.
\end{itemize}
In this entry, I formally prove the strong law and from it the weak law. The approach used for the proof of the strong law is a particularly quick and slick one based on ergodic theory, which was formalised by Gou\"ezel in another AFP entry.
\end{abstract}
\tableofcontents
% sane default for proof documents
\parindent 0pt\parskip 0.5ex
% generated text of all theories
\input{session}
\vspace{2em}
\textbf{Acknowledgements.} I thank Sébastien Gouëzel for providing advice and context about the
law of large numbers and ergodic theory. I do not actually know any ergodic theory and without him,
I would probably have shied away from formalising this.
% optional bibliography
{\raggedright
\bibliographystyle{plainnat}
\bibliography{root}
}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:
@@ -297,6 +297,7 @@ Laplace_Transform
Latin_Square
LatticeProperties
Launchbury
Laws_of_Large_Numbers
Lazy-Lists-II
Lazy_Case
Lehmer
......
@@ -97,6 +97,14 @@ MathJax = {
</td>
</tr>
<tr>
<td class="datahead">
Contributor:
</td>
<td class="data">
<a href="https://www21.in.tum.de/~eberlm">Manuel Eberl</a>
</td>
</tr>
<tr>
@@ -132,7 +140,7 @@ MathJax = {
<tr><td class="datahead">Used by:</td>
<td class="data"><a href="Gromov_Hyperbolicity.html">Gromov_Hyperbolicity</a>, <a href="Lp.html">Lp</a> </td></tr>
<td class="data"><a href="Gromov_Hyperbolicity.html">Gromov_Hyperbolicity</a>, <a href="Laws_of_Large_Numbers.html">Laws_of_Large_Numbers</a>, <a href="Lp.html">Lp</a> </td></tr>
......
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>The Laws of Large Numbers - Archive of Formal Proofs
</title>
<link rel="stylesheet" type="text/css" href="../front.css">
<link rel="icon" href="../images/favicon.ico" type="image/icon">
<link rel="alternate" type="application/rss+xml" title="RSS" href="../rss.xml">
<!-- MathJax for LaTeX support in abstracts -->
<script>
MathJax = {
tex: {
inlineMath: [['$', '$'], ['\\(', '\\)']]
},
processEscapes: true,
svg: {
fontCache: 'global'
}
};
</script>
<script id="MathJax-script" async src="../components/mathjax/es5/tex-mml-chtml.js"></script>
</head>
<body class="mathjax_ignore">
<table width="100%">
<tbody>
<tr>
<!-- Navigation -->
<td width="20%" align="center" valign="top">
<p>&nbsp;</p>
<a href="https://www.isa-afp.org/">
<img src="../images/isabelle.png" width="100" height="88" border=0>
</a>
<p>&nbsp;</p>
<p>&nbsp;</p>
<table class="nav" width="80%">
<tr>
<td class="nav" width="100%"><a href="../index.html">Home</a></td>
</tr>
<tr>
<td class="nav"><a href="../about.html">About</a></td>
</tr>
<tr>
<td class="nav"><a href="../submitting.html">Submission</a></td>
</tr>
<tr>
<td class="nav"><a href="../updating.html">Updating Entries</a></td>
</tr>
<tr>
<td class="nav"><a href="../using.html">Using Entries</a></td>
</tr>
<tr>
<td class="nav"><a href="../search.html">Search</a></td>
</tr>
<tr>
<td class="nav"><a href="../statistics.html">Statistics</a></td>
</tr>
<tr>
<td class="nav"><a href="../topics.html">Index</a></td>
</tr>
<tr>
<td class="nav"><a href="../download.html">Download</a></td>
</tr>
</table>
<p>&nbsp;</p>
<p>&nbsp;</p>
</td>
<!-- Content -->
<td width="80%" valign="top">
<div align="center">
<p>&nbsp;</p>
<h1> <font class="first">T</font>he
<font class="first">L</font>aws
of
<font class="first">L</font>arge
<font class="first">N</font>umbers
</h1>
<p>&nbsp;</p>
<table width="80%" class="data">
<tbody>
<tr>
<td class="datahead" width="20%">Title:</td>
<td class="data" width="80%">The Laws of Large Numbers</td>
</tr>
<tr>
<td class="datahead">
Author:
</td>
<td class="data">
<a href="https://www21.in.tum.de/~eberlm">Manuel Eberl</a>
</td>
</tr>
<tr>
<td class="datahead">Submission date:</td>
<td class="data">2021-02-10</td>
</tr>
<tr>
<td class="datahead" valign="top">Abstract:</td>