The Law of Large Numbers
PROOF. Since X takes only non-negative values,
$$E(X) = \int_0^{+\infty} x f(x)\,dx \ge \int_\alpha^{\infty} x f(x)\,dx \ge \alpha \int_\alpha^{\infty} f(x)\,dx = \alpha P(X \ge \alpha).$$
Therefore $P(X \ge \alpha) \le \dfrac{E(X)}{\alpha}$. □
Corollary 8.1. $P(X < \alpha) \ge 1 - \dfrac{E(X)}{\alpha}$. □

Indeed, since the events $X \ge \alpha$ and $X < \alpha$ are complementary, $P(X \ge \alpha) + P(X < \alpha) = 1$, hence $P(X < \alpha) = 1 - P(X \ge \alpha) \ge 1 - \dfrac{E(X)}{\alpha}$.
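As a quick numerical sanity check, Markov's inequality can be verified by simulation. The choice of distribution (exponential with rate 1, so $E(X) = 1$) and the sample size are illustrative assumptions, not part of the text:

```python
import random

# Sanity check of Markov's inequality for a non-negative random variable:
# P(X >= a) <= E(X) / a.
# X ~ Exp(1) is an illustrative choice; its true mean is 1.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
mean = sum(samples) / n

for a in (1.0, 2.0, 4.0):
    tail = sum(1 for x in samples if x >= a) / n   # empirical P(X >= a)
    bound = mean / a                               # Markov bound E(X)/a
    print(f"a={a}: P(X>=a) ~ {tail:.4f} <= bound {bound:.4f}")
    assert tail <= bound
```

For the exponential distribution the true tail $P(X \ge a) = e^{-a}$ is far below the bound $1/a$, which illustrates that Markov's inequality is crude but universal.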
Theorem 8.2. (Chebyshev's inequality.) If a random variable X has a finite variance, then for any $\varepsilon > 0$ the following inequality holds:
$$P(|X - E(X)| \ge \varepsilon) \le \frac{Var(X)}{\varepsilon^2}. \qquad (8.2)$$
PROOF. On applying Markov's inequality to the random variable $(X - E(X))^2$ and taking $\alpha = \varepsilon^2$ we get
$$P\big((X - E(X))^2 \ge \varepsilon^2\big) \le \frac{E(X - E(X))^2}{\varepsilon^2} = \frac{Var(X)}{\varepsilon^2},$$
since the inequality $(X - E(X))^2 \ge \varepsilon^2$ is equivalent to the inequality $|X - E(X)| \ge \varepsilon$. □
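Chebyshev's inequality (8.2) can likewise be checked numerically. The uniform distribution on $[0, 1]$, with known $E(X) = 1/2$ and $Var(X) = 1/12$, is an illustrative assumption:

```python
import random

# Sanity check of Chebyshev's inequality (8.2):
# P(|X - E(X)| >= eps) <= Var(X) / eps^2.
# X ~ Uniform[0, 1] is an illustrative choice: E(X) = 1/2, Var(X) = 1/12.
random.seed(0)
n = 100_000
samples = [random.random() for _ in range(n)]
mu, var = 0.5, 1.0 / 12.0

for eps in (0.3, 0.4, 0.45):
    tail = sum(1 for x in samples if abs(x - mu) >= eps) / n
    bound = var / eps ** 2
    print(f"eps={eps}: P(|X-E(X)|>=eps) ~ {tail:.4f} <= bound {bound:.4f}")
    assert tail <= bound
```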
Corollary 8.2. $P(|X - E(X)| < \varepsilon) \ge 1 - \dfrac{Var(X)}{\varepsilon^2}$. □

Since the events $|X - E(X)| \ge \varepsilon$ and $|X - E(X)| < \varepsilon$ are complementary, we have $P(|X - E(X)| \ge \varepsilon) + P(|X - E(X)| < \varepsilon) = 1$. Whence $P(|X - E(X)| < \varepsilon) = 1 - P(|X - E(X)| \ge \varepsilon) \ge 1 - \dfrac{Var(X)}{\varepsilon^2}$.

Now we consider some special cases of Chebyshev's inequality. Let p be the probability of some event A in n repeated independent trials, m the frequency of the event A, and m/n the relative frequency.
1. For a random variable X = m having a binomial law of distribution we have
$$E(X) = E(m) = np, \qquad Var(X) = Var(m) = npq,$$
$$P(|m - np| < \varepsilon) \ge 1 - \frac{npq}{\varepsilon^2}. \qquad (8.3)$$
2. For the random variable X = m/n (the relative frequency) we get $E(X) = E\left(\dfrac{m}{n}\right) = p$, $Var(X) = Var\left(\dfrac{m}{n}\right) = \dfrac{pq}{n}$:
$$P\left(\left|\frac{m}{n} - p\right| < \varepsilon\right) \ge 1 - \frac{pq}{n\varepsilon^2}. \qquad (8.4)$$
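A small worked instance of bound (8.4) may make it concrete. The parameter values (a fair coin, n = 1000 tosses, ε = 0.05) are illustrative assumptions:

```python
# Worked instance of bound (8.4) for the relative frequency m/n:
# P(|m/n - p| < eps) >= 1 - p*q / (n * eps^2).
# Illustrative values: fair coin p = q = 0.5, n = 1000 tosses, eps = 0.05.
p, q, n, eps = 0.5, 0.5, 1000, 0.05
bound = 1 - p * q / (n * eps ** 2)
print(bound)  # -> 0.9
```

So with 1000 tosses of a fair coin, the relative frequency of heads lies within 0.05 of 1/2 with probability at least 0.9. The true probability is much higher; Chebyshev-type bounds are deliberately conservative.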
The Law of Large Numbers
Theorem 8.3. (Chebyshev's theorem.) If $X_1, X_2, \ldots, X_n, \ldots$ is a sequence of pairwise independent random variables with means $a_1, a_2, \ldots, a_n, \ldots$ whose variances are bounded by a constant, $Var(X_k) \le C$, $k = 1, 2, \ldots$, then for any constant $\varepsilon > 0$
$$\lim_{n \to \infty} P\left(\left|\frac{X_1 + X_2 + \ldots + X_n}{n} - \frac{a_1 + a_2 + \ldots + a_n}{n}\right| \le \varepsilon\right) = 1. \qquad (8.5)$$
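The convergence in (8.5) can be observed in simulation. Taking every $X_k$ uniform on $[0, 1]$, so that each $a_k = 1/2$, is an illustrative assumption:

```python
import random

# Illustration of Chebyshev's theorem (8.5): the average of n independent
# random variables concentrates around the average of their means as n grows.
# X_k ~ Uniform[0, 1] is an illustrative choice, so every a_k = 1/2.
random.seed(0)

def sample_mean(n):
    """Average of n independent Uniform[0, 1] draws."""
    return sum(random.random() for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(f"n={n}: |mean - 1/2| = {abs(sample_mean(n) - 0.5):.5f}")
```

The printed deviations shrink as n grows, in line with the law of large numbers.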