PROOF. Consider the random variable $\overline{X}_n = \dfrac{X_1 + X_2 + \ldots + X_n}{n}$. Since $X_k$ ($k = 1, 2, \ldots, n$) are independent random variables, by the corresponding properties of the mean and the variance we have
\[
E(\overline{X}_n) = E\left(\frac{X_1 + X_2 + \ldots + X_n}{n}\right) = \frac{a_1 + a_2 + \ldots + a_n}{n};
\]
\[
\mathrm{Var}(\overline{X}_n) = \frac{1}{n^2}\bigl(\mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \ldots + \mathrm{Var}(X_n)\bigr).
\]
Apply Chebyshev's inequality to the random variable $\overline{X}_n$ to obtain
\[
P\left(\left|\frac{X_1 + X_2 + \ldots + X_n}{n} - \frac{a_1 + a_2 + \ldots + a_n}{n}\right| \le \varepsilon\right) \ge 1 - \frac{\mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \ldots + \mathrm{Var}(X_n)}{n^2\varepsilon^2}.
\]
Since $\mathrm{Var}(X_k) \le C$ ($k = 1, 2, \ldots, n$), the sum of the variances does not exceed $nC$, and therefore
\[
P\left(\left|\frac{X_1 + X_2 + \ldots + X_n}{n} - \frac{a_1 + a_2 + \ldots + a_n}{n}\right| \le \varepsilon\right) \ge 1 - \frac{C}{n\varepsilon^2},
\]
and
\[
\lim_{n \to \infty}\left(1 - \frac{C}{n\varepsilon^2}\right) = 1.
\]
The theorem has been proved. □
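The bound just derived is easy to check numerically. The following sketch is not part of the text; it assumes Python with NumPy and takes the $X_k$ uniform on $(0, 1)$, so that $a_k = 1/2$ and $\mathrm{Var}(X_k) = 1/12$, and we may put $C = 1/12$. It compares the empirical value of the left-hand side with the Chebyshev bound $1 - C/(n\varepsilon^2)$.

```python
import numpy as np

# Illustrative sketch (assumption: X_k ~ Uniform(0, 1), hence a_k = 0.5 and
# Var(X_k) = 1/12, so we may take C = 1/12).
rng = np.random.default_rng(0)
C, eps = 1 / 12, 0.05
trials = 10_000                                   # repetitions used to estimate the probability

for n in (100, 1_000, 10_000):
    samples = rng.uniform(0.0, 1.0, size=(trials, n))
    deviations = np.abs(samples.mean(axis=1) - 0.5)   # deviation of the sample mean from a_k = 0.5
    empirical = np.mean(deviations <= eps)            # estimate of the left-hand side probability
    bound = 1 - C / (n * eps ** 2)                    # right-hand side of the inequality
    print(f"n = {n:6d}   empirical = {empirical:.4f}   bound = {bound:.4f}")
```

For every $n$ the empirical frequency stays above the bound, and both tend to 1 as $n$ grows.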
Theorem 8.4. (Bernoulli's theorem) Let $p$ be the probability of an event $A$ in each of $n$ repeated independent trials, and let $m$ be the number of occurrences (the frequency) of the event $A$. Then for any constant $\varepsilon > 0$,
\[
\lim_{n \to \infty} P\left(\left|\frac{m}{n} - p\right| < \varepsilon\right) = 1. \tag{8.6}
\]
PROOF. Passing to the limit as $n \to \infty$ in the inequality (8.4), we arrive at formula (8.6). □
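As a quick illustration of (8.6), the sketch below (not from the text; it assumes NumPy and an arbitrarily chosen $p = 0.3$, $\varepsilon = 0.02$) estimates $P(|m/n - p| < \varepsilon)$ by simulation and shows it approaching 1.

```python
import numpy as np

# Sketch of Bernoulli's theorem (assumption: p = 0.3 and eps = 0.02 are chosen
# only for illustration).
rng = np.random.default_rng(1)
p, eps, trials = 0.3, 0.02, 10_000

for n in (100, 1_000, 10_000, 100_000):
    m = rng.binomial(n, p, size=trials)            # number of occurrences of A in n trials
    prob = np.mean(np.abs(m / n - p) < eps)        # estimate of P(|m/n - p| < eps)
    print(f"n = {n:6d}   P(|m/n - p| < eps) ≈ {prob:.4f}")
```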
Central limit theorem
Theorem 8.5. (Lyapunov's central limit theorem) Let $X_1, X_2, \ldots, X_n$ be independent random variables with means $a_k$ ($k = 1, 2, \ldots, n$) such that $|X_k - a_k| \le \delta$ ($k = 1, 2, \ldots, n$), and let the variances be bounded by one and the same number, that is, $\mathrm{Var}(X_k) \le C$ ($k = 1, 2, \ldots, n$). Then, as $n \to \infty$, the distribution of the sum $\sum_{k=1}^{n} X_k$ approaches the normal distribution with mean $\sum_{k=1}^{n} a_k$ and variance $\sum_{k=1}^{n} \sigma_k^2$, where $\sigma_k^2 = \mathrm{Var}(X_k)$.
We accept this theorem without proof.
Corollary 8.3. If the random variables $X_k$ ($k = 1, 2, \ldots, n$) are identically distributed, then the distribution law of their sum approaches the normal law as $n \to \infty$. □
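The theorem (and Corollary 8.3) can also be observed in a simulation. The sketch below is not from the text; it assumes NumPy, takes the $X_k$ exponentially distributed with $a_k = 1$ and $\sigma_k^2 = 1$, and compares the standardized sum with a standard normal variable.

```python
import math
import numpy as np

# Sketch of the central limit theorem (assumption: X_k ~ Exponential(1),
# identically distributed, with a_k = 1 and sigma_k^2 = 1 for every k).
rng = np.random.default_rng(2)
n, trials = 50, 100_000

sums = rng.exponential(1.0, size=(trials, n)).sum(axis=1)
z = (sums - n) / math.sqrt(n)                      # (sum - n*a_k) / sqrt(n*sigma_k^2)

for t in (0.5, 1.0, 2.0):
    empirical = np.mean(np.abs(z) <= t)            # Monte Carlo estimate of P(|Z| <= t)
    normal = math.erf(t / math.sqrt(2))            # exact value of P(|Z| <= t) for Z ~ N(0, 1)
    print(f"t = {t}:   empirical = {empirical:.4f}   normal = {normal:.4f}")
```

Already for $n = 50$ the empirical probabilities agree with the normal ones to two or three decimal places.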
Corollary 8.4. If $X_1, X_2, \ldots, X_n$ satisfy the conditions of the central limit theorem, then applying formula (6.23) to their sum $\sum_{k=1}^{n} X_k$ we obtain the approximate formula
\[
P\left(\alpha \le \sum_{k=1}^{n} X_k \le \beta\right) \approx \Phi\left(\frac{\beta - a}{\sigma}\right) - \Phi\left(\frac{\alpha - a}{\sigma}\right), \tag{8.7}
\]
where $a = \sum_{k=1}^{n} a_k$ and $\sigma^2 = \sum_{k=1}^{n} \sigma_k^2$. □
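The following is a sketch of how formula (8.7) can be used (not from the text; it assumes NumPy, takes $n = 30$ variables uniform on $(0, 1)$, so $a = n/2$ and $\sigma^2 = n/12$, and an interval $[\alpha, \beta] = [13, 17]$ chosen only for illustration). The value given by (8.7) is compared with a Monte Carlo estimate; taking the standard normal CDF for $\Phi$ gives the same difference as the Laplace function, since the constants cancel.

```python
import math
import numpy as np
from statistics import NormalDist

# Sketch of formula (8.7) (assumptions: X_k ~ Uniform(0, 1), n = 30, so
# a = sum a_k = n/2 and sigma^2 = sum sigma_k^2 = n/12; alpha, beta are arbitrary).
rng = np.random.default_rng(3)
n, trials = 30, 200_000
a, sigma = n / 2, math.sqrt(n / 12)
alpha, beta = 13.0, 17.0

Phi = NormalDist().cdf                              # standard normal CDF used as Phi
approx = Phi((beta - a) / sigma) - Phi((alpha - a) / sigma)

sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)
mc = np.mean((alpha <= sums) & (sums <= beta))      # Monte Carlo estimate of the same probability
print(f"formula (8.7): {approx:.4f}   Monte Carlo: {mc:.4f}")
```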