# Law of Large Numbers

This section covers the Law of Large Numbers (LLN), a theorem that describes what happens when the same experiment is repeated a large number of times. It states that the average of the results obtained from a large number of independent trials will be close to the expected value, and will tend to get closer as more trials are performed.
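
As a concrete illustration, here is a minimal simulation sketch (the die-roll setup, NumPy usage, and all variable names are illustrative assumptions, not part of the original text): it tracks how the running sample mean of fair die rolls approaches the expected value of 3.5 as the number of trials grows.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Roll a fair six-sided die many times; the expected value of one roll is 3.5.
expected_value = 3.5
num_trials = 100_000
rolls = rng.integers(low=1, high=7, size=num_trials)

# Running sample mean after 1, 2, ..., num_trials rolls.
running_mean = np.cumsum(rolls) / np.arange(1, num_trials + 1)

# The gap between the sample mean and the expected value shrinks as n grows.
for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n = {n:>7}: sample mean = {running_mean[n - 1]:.4f}, "
          f"|error| = {abs(running_mean[n - 1] - expected_value):.4f}")
```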

We will discuss two versions of the law: the weak law and the strong law. We will also introduce two forms of convergence: convergence in probability and almost sure convergence [Chan, 2021].
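
For reference, the two versions can be stated compactly as follows, where $X_1, X_2, \dots$ are i.i.d. random variables with mean $\mu$ and $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ is the sample mean (the notation here is standard, but not taken verbatim from the text):

$$
\text{Weak law:} \quad \lim_{n \to \infty} \mathbb{P}\left(\left|\bar{X}_n - \mu\right| > \epsilon\right) = 0 \quad \text{for every } \epsilon > 0,
$$

$$
\text{Strong law:} \quad \mathbb{P}\left(\lim_{n \to \infty} \bar{X}_n = \mu\right) = 1.
$$

The weak law asserts convergence in probability of $\bar{X}_n$ to $\mu$, while the strong law asserts the stronger notion of almost sure convergence.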

## Why is it Useful in the Context of Machine Learning?

We will soon see that the Law of Large Numbers, together with the probability bounds, builds toward answering the most fundamental question in Machine Learning: is learning possible?

See the Learning Theory section.

## Real Analysis

We will not prove every result in this section in full rigor, as doing so requires some background in Real Analysis.