Independence
With increasing generality:
- Two events \(A\) and \(B\) are independent if \(P(A\cap B) = P(A)P(B)\).
- Two random variables \(X\) and \(Y\) are independent if for all \(C,D\in\mathcal{B}(\mathbb{R})\) we have \(P(X\in C, Y\in D) = P(X\in C)P(Y\in D)\), i.e., the events \(A=\{X\in C\}\) and \(B=\{Y\in D\}\) are independent.
- Generally, two \(\sigma\)-algebras \(\mathcal{F},\mathcal{G}\) are independent if for all \(A\in\mathcal{F}\) and \(B\in\mathcal{G}\), the events \(A\) and \(B\) are independent.
- Most generally, collections of sets \(\mathcal{A}_i\subset \mathcal{F}\) for \(i=1,\ldots,n\) are said to be independent if whenever \(A_i\in\mathcal{A}_i\) and \(I\subset\{1,\ldots,n\}\) we have \(P(\bigcap_{i\in I}A_i) = \prod_{i\in I}P(A_i)\).
Note that for the independence of \(n\) events, it is not enough to assume \(P(A_i\cap A_j) = P(A_i)P(A_j)\) for all \(i\neq j\), i.e., pairwise independence is different from independence.
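To make this concrete, here is a minimal sketch in Python of the classical counterexample (the two-coin sample space and the events \(A\), \(B\), \(C\) are illustrative choices, not taken from the text): three events that are pairwise independent but not independent.

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent fair coin flips, four equally likely outcomes.
omega = list(product([0, 1], repeat=2))
p = Fraction(1, len(omega))  # each outcome has probability 1/4

def prob(event):
    """Probability of an event given as a predicate on outcomes."""
    return sum((p for w in omega if event(w)), Fraction(0))

A = lambda w: w[0] == 0       # first flip is heads
B = lambda w: w[1] == 0       # second flip is heads
C = lambda w: w[0] == w[1]    # the two flips agree

# Each pair of events is independent ...
for E, F in [(A, B), (A, C), (B, C)]:
    assert prob(lambda w: E(w) and F(w)) == prob(E) * prob(F)

# ... but the triple intersection does not factor:
lhs = prob(lambda w: A(w) and B(w) and C(w))  # = 1/4
rhs = prob(A) * prob(B) * prob(C)             # = 1/8
print(lhs, rhs, lhs == rhs)                   # prints: 1/4 1/8 False
```

Intersecting with \(C\) costs nothing once \(A\cap B\) is known (agreement is forced), which is exactly why the product condition fails for the full triple.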
Sufficient conditions#
A collection of sets \(\mathcal{A}\) is said to be a \(\pi\)-system if it is closed under intersection (if \(A,B\in \mathcal{A}\) then \(A\cap B\in\mathcal{A}\)). There is an independence result for the generated \(\sigma\)-algebras. Recall that \(\sigma(X)\) is the \(\sigma\)-algebra generated by \(X\), i.e., the smallest \(\sigma\)-algebra on \(\Omega\) that makes \(X\) a measurable map.
Theorem. Suppose \(\mathcal{A}_i\) for \(i\in\{1,\ldots,n\}\) are independent and each \(\mathcal{A}_i\) is a \(\pi\)-system. Then the generated \(\sigma\)-algebras \(\sigma(\mathcal{A}_i)\) are independent.
This leads to several corollaries giving sufficient conditions for the independence of random variables: for instance, the sets \(\{X\leq x\}\) for \(x\in\mathbb{R}\) form a \(\pi\)-system generating \(\sigma(X)\).
Theorem. Random variables \(X_i\) for \(i\in\{1,\ldots,n\}\) are independent if for all \(x_i\in(-\infty,\infty]\) we have
\[P(X_1\leq x_1,\ldots,X_n\leq x_n) = \prod_{i=1}^n P(X_i\leq x_i).\]
Theorem. Suppose that \((X_1,\ldots,X_n)\) has density \(f(x_1,\ldots,x_n)\), namely, for all \(A\in\mathcal{B}(\mathbb{R}^n)\),
\[P((X_1,\ldots,X_n)\in A) = \int_A f(x)\,dx.\]
If \(f(x)\) can be written as \(\prod_{i}g_i(x_i)\) where the \(g_i\geq 0\) are measurable (the \(g_i\) are not assumed to be probability densities), then the \(X_i\) are independent.
Theorem. Suppose that \(X_i\) for \(i\in\{1,\ldots,n\}\) are random variables that take values in countable sets \(S_i\). Then for the \(X_i\) to be independent it is sufficient that whenever \(x_i\in S_i\),
\[P(X_1=x_1,\ldots,X_n=x_n) = \prod_i P(X_i = x_i).\]
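As a quick sanity check of the discrete criterion, the following sketch (the two-dice setup and the sets \(C\), \(D\) are illustrative assumptions) enumerates a joint pmf given in product form and verifies that events \(\{X\in C\}\) and \(\{Y\in D\}\) factor:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two fair dice in product form: p(x, y) = g1(x) * g2(y)
# with g1 = g2 = uniform on {1, ..., 6}.
S1 = S2 = range(1, 7)
pmf = {(x, y): Fraction(1, 36) for x, y in product(S1, S2)}

def P(event):
    """Probability that the pair (X, Y) lands in the set `event`."""
    return sum((pmf[w] for w in pmf if w in event), Fraction(0))

# Check the product rule on a pair of arbitrary events:
C = {1, 2, 5}   # event for X
D = {2, 4, 6}   # event for Y
joint = P({(x, y) for x, y in pmf if x in C and y in D})
margX = P({(x, y) for x, y in pmf if x in C})
margY = P({(x, y) for x, y in pmf if y in D})
assert joint == margX * margY
print(joint, margX, margY)  # prints: 1/4 1/2 1/2
```

Since the pmf factors pointwise, summing it over the rectangle \(C\times D\) splits into a product of sums, which is the content of the theorem in the discrete case.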
Functions of disjoint collections of independent random variables are independent: if the \(X_i\) are independent, then \(f(X_1,\ldots,X_k)\) and \(g(X_{k+1},\ldots,X_n)\) are independent, since \(\sigma(f(X_1,\ldots,X_k))\subset\sigma(X_1,\ldots,X_k)\) and likewise for \(g\).
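A small enumeration illustrates this closure property (the three-coin setup and the functions \(U = X_1 + X_2\), \(V = X_3\) are illustrative choices): functions of disjoint sets of independent flips remain independent.

```python
from fractions import Fraction
from itertools import product

# Three independent fair coin flips; 8 equally likely outcomes.
omega = list(product([0, 1], repeat=3))
p = Fraction(1, len(omega))

def prob(event):
    """Probability of an event given as a predicate on outcomes."""
    return sum((p for w in omega if event(w)), Fraction(0))

# U depends only on the first two flips, V only on the third.
U = lambda w: w[0] + w[1]
V = lambda w: w[2]

# Verify P(U = u, V = v) = P(U = u) P(V = v) for every pair of values.
for u, v in product([0, 1, 2], [0, 1]):
    joint = prob(lambda w: U(w) == u and V(w) == v)
    assert joint == prob(lambda w: U(w) == u) * prob(lambda w: V(w) == v)
print("U = X1 + X2 and V = X3 are independent")
```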
Distribution and Expectation for Independent Variables#