Expectations

Given a random variable \(X\geq 0\) on \((\Omega,\mathcal{F},P)\), the expectation or expected value \(EX\) of \(X\) is defined to be 

\[EX=\int X dP.\]

If \(X\) can be negative, let \(EX = EX^+ - EX^-\), where \(X^+ = \max(X,0)\) and \(X^- = \max(-X,0)\) are the positive and negative parts, provided at least one of \(EX^+\), \(EX^-\) is finite. \(EX\) is often called the mean of \(X\). Trivially, for any real number \(b\) we have \(Eb = b\). Furthermore, when \(X,Y\geq 0\), or when \(E|X|\) and \(E|Y|\) are finite, we have

  • \(E(X+Y) = EX + EY\).
  • \(E(aX+b) = a E(X) + b\) for \(a,b\) real.
  • If \(X\geq Y\) then \(EX\geq EY\).
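The linearity properties can be checked by exact enumeration on a finite sample space. The uniform die distribution and the choice \(Y = 2X + 3\) below are illustrative assumptions, not part of the notes.

```python
# Check E(X+Y) = EX + EY and E(aX+b) = a*EX + b by exact enumeration
# over a finite sample space: X uniform on {1,...,6}, Y = 2X + 3.
omega = [1, 2, 3, 4, 5, 6]      # equally likely outcomes
p = 1 / len(omega)

def E(f):
    """Expectation of f(w) under the uniform measure on omega."""
    return sum(f(w) * p for w in omega)

EX = E(lambda w: w)                   # mean of a fair die, 3.5
EY = E(lambda w: 2 * w + 3)
E_sum = E(lambda w: w + (2 * w + 3))

assert abs(E_sum - (EX + EY)) < 1e-12     # E(X+Y) = EX + EY
assert abs(EY - (2 * EX + 3)) < 1e-12     # E(aX+b) = a EX + b
```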

If we integrate only over a measurable set \(A\in\mathcal{F}\), we write

\[E(X;A) = \int_A XdP.\]

The variance of \(X\) is \(\mathrm{Var}(X) = E(X-EX)^2 = EX^2 - (EX)^2\), where \(EX^2\) denotes \(E(X^2)\). More generally, the \(n\)-th moment of \(X\) is \(EX^n\).
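The identity \(E(X-EX)^2 = EX^2 - (EX)^2\) can be verified numerically; the uniform die distribution below is again an illustrative choice.

```python
# Check Var X = E(X - EX)^2 = EX^2 - (EX)^2 on a fair die
# (illustrative distribution, not from the notes).
omega = [1, 2, 3, 4, 5, 6]
p = 1 / len(omega)

EX = sum(w * p for w in omega)                      # first moment
EX2 = sum(w**2 * p for w in omega)                  # second moment
var_centered = sum((w - EX) ** 2 * p for w in omega)
var_moments = EX2 - EX**2

assert abs(var_centered - var_moments) < 1e-12      # both equal 35/12
```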

Inequalities

Jensen's inequality

If \(\phi\) is convex, namely \(\sum_i a_i \phi(x_i) \geq \phi(\sum_i a_i x_i)\) for all \(a_i\in (0,1)\) with \(\sum_i a_i =1\) and all \(x_i\in \mathbb{R}\), then

\[E(\phi(X)) \geq \phi(EX)\]

given that both expectations exist.

Special cases: 

  • \(|EX|\leq E|X|\) (the absolute value function is convex)
  • \((EX)^2 \leq E(X^2)\) (\((\cdot)^2\) is convex). 
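Both special cases hold exactly for the empirical measure of any sample, since Jensen's inequality applies to that measure as well. A Monte Carlo check (the normal distribution here is an arbitrary assumption):

```python
# Empirical check of |EX| <= E|X| and (EX)^2 <= E(X^2).
# Both inequalities hold for any sample, so the assertions cannot fail.
import random

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
n = len(xs)

EX = sum(xs) / n
E_abs = sum(abs(x) for x in xs) / n
E_sq = sum(x * x for x in xs) / n

assert abs(EX) <= E_abs       # |EX| <= E|X|
assert EX**2 <= E_sq          # (EX)^2 <= E(X^2)
```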

Hölder's inequality

If \(p,q\in[1,\infty]\) and \(1/p+ 1/q =1\), then

\[E|XY|\leq ||X||_p ||Y||_q\]

where \(||X||_r = (E|X|^r)^{1/r}\) for \(r\in [1,\infty)\) and \(||X||_\infty = \inf\{M:P(|X|>M)=0\}\). When \(p=q=2\) this reduces to the Cauchy–Schwarz inequality

\[E|XY|\leq(EX^2EY^2)^{1/2}.\]
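The Cauchy–Schwarz case can be checked empirically; since \(\sum_i |x_i y_i| \leq (\sum_i x_i^2)^{1/2}(\sum_i y_i^2)^{1/2}\) holds for any vectors, the sample version of the inequality never fails. The standard normal pairs below are an arbitrary choice.

```python
# Empirical check of Cauchy-Schwarz: E|XY| <= (E(X^2) E(Y^2))^{1/2}.
# The inequality holds for the empirical measure of any sample,
# so the assertion is valid regardless of the seed.
import random

random.seed(2)
pairs = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10_000)]
n = len(pairs)

E_absXY = sum(abs(x * y) for x, y in pairs) / n
EX2 = sum(x * x for x, _ in pairs) / n
EY2 = sum(y * y for _, y in pairs) / n

assert E_absXY <= (EX2 * EY2) ** 0.5
```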

Chebyshev's inequality

Suppose \(\phi:\mathbb{R}\to\mathbb{R}\) satisfies \(\phi\geq 0\), and let \(A\in\mathcal{B}(\mathbb{R})\). Setting \(i_A = \inf\{\phi(y):y\in A\}\), we have

\[i_A P(X\in A)\leq E(\phi(X);X\in A) \leq E\phi(X).\]
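Taking \(\phi(x) = x^2\) and \(A = \{x : |x| \geq a\}\) recovers the classical bound \(P(|X|\geq a) \leq EX^2/a^2\). A sketch of both sides of the chain on a small finite distribution (the outcomes below are illustrative assumptions):

```python
# Check i_A * P(X in A) <= E(phi(X); X in A) <= E(phi(X))
# with phi(x) = x^2 and A = {|x| >= a}, on a finite uniform distribution.
omega = [-3, -1, 0, 2, 4]        # illustrative outcomes, equally likely
p = 1 / len(omega)
a = 2.0
phi = lambda x: x * x

i_A = a * a                      # inf of phi on {|x| >= a} is a^2
P_A = sum(p for w in omega if abs(w) >= a)
E_restricted = sum(phi(w) * p for w in omega if abs(w) >= a)
E_phi = sum(phi(w) * p for w in omega)

assert i_A * P_A <= E_restricted <= E_phi
```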