Here is a list of a few probabilistic results.

Markov’s inequality

If \(X\) is a nonnegative random variable and \(a>0\), then the probability that \(X\) is at least \(a\) is at most the expectation of \(X\) divided by \(a\): $$P(X\geq a) \leq \frac{E(X)}{a}.$$

Proof

$$ \begin{aligned} E(X) &= \int_{-\infty}^\infty f(x)x\,dx = \int_0^\infty f(x)x\,dx\\ &= \int_0^a f(x)x\,dx + \int_a^\infty f(x)x\,dx\\ &\geq \int_a^\infty f(x)x\,dx \\ &\geq \int_a^\infty f(x)a\,dx \\ &= a\int_a^\infty f(x)\,dx \\ &= aP(X\geq a), \end{aligned} $$ which, as you can see, is quite a weak bound, and the proof itself is quite a troll. (This argument assumes \(X\) has a density \(f\); the discrete case is analogous.)
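As a quick sanity check (not part of the proof), here is a small Python simulation; the distribution \(\mathrm{Exp}(1)\) and the threshold \(a = 3\) are illustrative choices:

```python
import random

# Numerical sanity check of Markov's inequality for X ~ Exp(1),
# where E(X) = 1. The threshold a is an illustrative choice.
random.seed(0)
n, a = 100_000, 3.0
samples = [random.expovariate(1.0) for _ in range(n)]
empirical = sum(x >= a for x in samples) / n   # true value is e^-3 ~ 0.0498
bound = 1.0 / a                                # E(X) / a = 1/3
print(empirical, bound)
```

The gap between the empirical tail (about 0.05) and the bound (about 0.33) illustrates just how loose Markov's inequality can be.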


Chebyshev’s inequality

Let \(X\) be a random variable with finite expected value \(\mu\) and finite non-zero variance \(\sigma^2\). Then for any real number \(k>0\), $$\text{Pr}(\vert X-\mu \vert \geq k\sigma) \leq \frac{1}{k^2}.$$

Proof

$$ \begin{aligned} \text{Pr}(\vert X-\mu\vert \geq k\sigma) &= \text{Pr}((X-\mu)^2\geq k^2\sigma^2) \\ &\leq \frac{E[(X-\mu)^2]}{k^2\sigma^2} = \frac{D(X)}{k^2\sigma^2} = \frac{\sigma^2}{k^2\sigma^2} \\ &= \frac{1}{k^2}, \end{aligned} $$ where we applied Markov's inequality to the nonnegative random variable \((X-\mu)^2\).
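A quick numerical check of the bound; the normal distribution and the parameters \(\mu\), \(\sigma\), \(k\) below are illustrative choices:

```python
import random

# Numerical check of Chebyshev's inequality for X ~ N(mu, sigma^2);
# mu, sigma, and k are illustrative choices.
random.seed(0)
mu, sigma, k = 5.0, 2.0, 2.0
n = 100_000
samples = [random.gauss(mu, sigma) for _ in range(n)]
empirical = sum(abs(x - mu) >= k * sigma for x in samples) / n
bound = 1.0 / k**2   # Chebyshev gives 0.25; the true normal tail is ~0.0455
print(empirical, bound)
```

As with Markov, the bound holds for every distribution with finite variance, at the price of being far from tight for well-behaved ones like the normal.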


Law of large numbers

Let \(X_1,X_2,\ldots\) be independent random variables with expected value \(EX_i = \mu\) and variance \(DX_i = \sigma^2<\infty\), and let \(S_n = X_1 + \cdots + X_n\). Then, $$\frac{S_n}{n}\overset P \rightarrow \mu.$$

Proof

Using Chebyshev’s inequality, we get that $$ \begin{aligned} P\Big(\Big\vert \frac{S_n}{n} - \mu \Big\vert \geq \varepsilon\Big) &\leq \frac{D\big(\frac{S_n}{n}\big)}{\varepsilon^2}\\ &=\frac{n\sigma^2}{n^2\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2}. \end{aligned} $$ For any fixed \(\varepsilon > 0\), the right-hand side tends to \(0\) as \(n\rightarrow \infty\), so \(\frac{S_n}{n}\overset P \rightarrow \mu\).
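The convergence is easy to watch in a simulation; uniform(0, 1) variables (so \(\mu = 0.5\)) and the sample sizes below are illustrative choices:

```python
import random

# Simulation of the weak law of large numbers for i.i.d. uniform(0, 1)
# draws, where mu = 0.5. The sample mean approaches mu as n grows.
random.seed(0)
for n in (10, 1_000, 100_000):
    mean = sum(random.random() for _ in range(n)) / n
    print(n, mean)
# The deviation |S_n / n - mu| shrinks on the order of sigma / sqrt(n).
```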


Jensen’s inequality

If \(g\) is a convex function and \(X\) is a random variable, $$E[g(X)] \geq g(E[X]).$$
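A minimal numerical illustration, using the convex function \(g(x) = x^2\) and \(X\sim \mathrm{Exp}(1)\) as illustrative choices (in this special case the inequality just says \(D(X) \geq 0\)):

```python
import random

# Numerical check of Jensen's inequality for the convex function
# g(x) = x^2 with X ~ Exp(1): E[X^2] >= (E[X])^2.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
mean = sum(samples) / n                      # approximates E[X] = 1
mean_sq = sum(x * x for x in samples) / n    # approximates E[X^2] = 2
print(mean_sq, mean ** 2)
assert mean_sq >= mean ** 2                  # E[g(X)] >= g(E[X])
```

For a concave \(g\) the inequality reverses.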

De Moivre-Laplace theorem

Let \(S_n\) be a binomially distributed random variable, \(S_n\sim B(n,p)\) with \(0<p<1\). Then for all \(t\in\mathbb R\), $$P\Big[\frac{S_n-np}{\sqrt{np(1-p)}}\leq t\Big]\to \Phi(t)$$ as \(n\to \infty\), where \(\Phi\) is the standard normal CDF.
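An empirical check of the convergence; the parameters \(n\), \(p\), \(t\) and the number of trials are illustrative choices:

```python
import math
import random

# Empirical check of the De Moivre-Laplace theorem: the CDF of the
# standardized binomial approaches Phi(t).
random.seed(0)
n, p, t, trials = 500, 0.3, 1.0, 5_000

def binomial_sample(n, p):
    """Draw one B(n, p) sample by summing Bernoulli(p) indicators."""
    return sum(random.random() < p for _ in range(n))

std = math.sqrt(n * p * (1 - p))
hits = sum((binomial_sample(n, p) - n * p) / std <= t
           for _ in range(trials))
empirical = hits / trials
phi_t = 0.5 * (1 + math.erf(t / math.sqrt(2)))  # Phi(1) ~ 0.8413
print(empirical, phi_t)
```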

Poisson limit theorem

Let \(p_n\) be a sequence of real numbers in \([0,1]\) such that the sequence \(np_n\) converges to a finite limit \(\lambda>0\), and let \(X_n\sim B(n,p_n)\). Then for each fixed \(k\): $$\lim_{n\to \infty} P(X_n = k) =\lim_{n\to \infty}\binom{n}{k}p_n^k(1-p_n)^{n-k} = \frac{\lambda^k}{k!}e^{-\lambda}.$$
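The limit can be checked directly on the exact pmf values; \(\lambda = 4\) and \(k = 2\) are illustrative choices:

```python
import math

# Numerical check of the Poisson limit theorem: the B(n, lambda / n)
# pmf at a fixed k approaches the Poisson(lambda) pmf as n grows.
lam, k = 4.0, 2
poisson = lam**k / math.factorial(k) * math.exp(-lam)
for n in (10, 100, 10_000):
    p = lam / n
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(n, binom, poisson)
```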