

where P stands for the population P(x|a) and b is the bias of the estimator. Denoting the quantity on the RHS of (2.4) by V_min, the efficiency e of an estimator is defined as

    e = V_min / Var(â).

An estimator for which e = 1 is called a minimum-variance or efficient estimator. Otherwise, if e < 1, â is called an inefficient estimator.
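As an illustration of this definition (a Monte Carlo sketch, not taken from the text; the parameter values µ, σ, N and the number of trials are arbitrary choices), one can compare two unbiased estimators of the mean of a Gaussian: the sample mean, which is efficient (e = 1), and the sample median, whose efficiency for Gaussian data is only about 2/π ≈ 0.64.

```python
import random
import statistics

# Compare the efficiency e = V_min / Var(estimator) of two unbiased
# estimators of the mean mu of a Gaussian N(mu, sigma^2):
#   - the sample mean   (efficient, e = 1)
#   - the sample median (inefficient, e ~ 2/pi for Gaussian data)
# All numbers below are illustrative choices.
random.seed(0)
mu, sigma, N, trials = 0.0, 1.0, 25, 20000

means, medians = [], []
for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(N)]
    means.append(sum(x) / N)
    medians.append(statistics.median(x))

v_min = sigma**2 / N  # the minimum variance allowed by (2.4) for b = 0
e_mean = v_min / statistics.pvariance(means)
e_median = v_min / statistics.pvariance(medians)
print(f"e(mean)   = {e_mean:.2f}")    # close to 1
print(f"e(median) = {e_median:.2f}")  # well below 1, near 2/pi
```

Both estimators are unbiased here, so the BUE among the two is the sample mean, the one with the smaller variance.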
It should be noted that, in general, there is no unique 'optimal' estimator â for a particular property a. To some extent, there is always a trade-off between bias and efficiency. One must often weigh the relative merits of an unbiased, inefficient estimator against another that is more efficient but slightly biased. Nevertheless, a common choice is the best unbiased estimator (BUE), which is simply the unbiased estimator â having the smallest variance Var(â).
                   Finally, we note that some qualities of estimators are related. For example, suppose ˆa is an
               unbiased estimator, so that E(ˆa) = a and Var(ˆa) → 0 as N → ∞. Using the Bienaymé-Chebyshev
               inequality, it follows immediately that ˆa is also a consistent estimator. Nevertheless, it does not
               follow that a consistent estimator is unbiased.
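The Bienaymé-Chebyshev argument can be checked numerically (a small sketch with assumed parameters, not part of the text): for the sample mean, Chebyshev gives P(|x̄ − µ| ≥ ε) ≤ Var(x̄)/ε² = σ²/(Nε²), so the probability of any fixed-size deviation vanishes as N → ∞, which is precisely consistency.

```python
import random

# Empirically estimate P(|xbar - mu| >= eps) for growing N and compare it
# with the Chebyshev bound sigma^2 / (N * eps^2). The parameters are
# illustrative choices, not from the text.
random.seed(1)
mu, sigma, eps, trials = 0.0, 1.0, 0.2, 2000

probs, bounds = [], []
for N in (10, 100, 1000):
    hits = 0
    for _ in range(trials):
        xbar = sum(random.gauss(mu, sigma) for _ in range(N)) / N
        if abs(xbar - mu) >= eps:
            hits += 1
    probs.append(hits / trials)
    bounds.append(sigma**2 / (N * eps**2))
    print(f"N={N:5d}  P(|xbar-mu|>=eps) ~ {probs[-1]:.3f}  "
          f"Chebyshev bound = {bounds[-1]:.3f}")
```

The estimated probability shrinks toward zero as N grows, always staying below the (often very loose) Chebyshev bound.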

Example 2.2. The sample values x_1, x_2, . . . , x_N are drawn independently from a Gaussian distribution with mean µ and variance σ². Show that the sample mean x̄ is a consistent, unbiased, minimum-variance estimator of µ.

Solution. We found earlier that the sampling distribution of x̄ is given by

    P(\bar{x} \,|\, \mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^2/N}} \exp\left[ -\frac{(\bar{x} - \mu)^2}{2\sigma^2/N} \right],

from which we see immediately that E(x̄) = µ and Var(x̄) = σ²/N. Thus x̄ is an unbiased estimator of µ. Moreover, since it is also true that Var(x̄) → 0 as N → ∞, x̄ is a consistent estimator of µ.
In order to determine whether x̄ is a minimum-variance estimator of µ, we must use Fisher's inequality (2.4). Since the sample values x_i are independent and drawn from a Gaussian of mean µ and standard deviation σ, we have

    \ln P(x \,|\, \mu, \sigma) = -\frac{1}{2} \sum_{i=1}^{N} \left[ \ln(2\pi\sigma^2) + \frac{(x_i - \mu)^2}{\sigma^2} \right],
               and, on differentiating twice with respect to µ, we find

    \frac{\partial^2 \ln P}{\partial \mu^2} = -\frac{N}{\sigma^2}.
This is independent of the x_i and so its expectation value is also equal to −N/σ². With b set equal to zero in (2.4), Fisher's inequality thus states that, for any unbiased estimator µ̂ of the population mean,

    \mathrm{Var}(\hat{\mu}) \ge \frac{\sigma^2}{N}.

Since Var(x̄) = σ²/N, the sample mean x̄ is a minimum-variance estimator of µ.
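The conclusion of the example can also be verified by simulation (a sketch with arbitrarily chosen µ, σ, N and trial count, not from the text): the empirical variance of x̄ over many repeated samples should coincide with the Fisher bound σ²/N.

```python
import random
import statistics

# Check numerically that Var(xbar) attains the Fisher bound sigma^2 / N
# for Gaussian samples. Parameter values are illustrative assumptions.
random.seed(2)
mu, sigma, N, trials = 3.0, 2.0, 50, 40000

xbars = [sum(random.gauss(mu, sigma) for _ in range(N)) / N
         for _ in range(trials)]
empirical = statistics.pvariance(xbars)
bound = sigma**2 / N  # the minimum variance allowed by (2.4)
print(f"Var(xbar) ~ {empirical:.4f}, Fisher bound sigma^2/N = {bound:.4f}")
```

The two numbers agree to within sampling error, confirming that x̄ saturates the bound rather than merely respecting it.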


                     Some basic estimators


In many cases, one does not know the functional form of the population from which a sample is drawn. Nevertheless, in a case where the sample values x_1, x_2, . . . , x_N are each drawn

