
Properties of joint distributions


Expectation values for joint distributions are defined in a manner analogous to those for single-variable distributions. Thus, the expectation value of any function g(X, Y) of the random variables X and Y is given by

$$
E[g(X, Y)] =
\begin{cases}
\displaystyle\sum_i \sum_j g(x_i, y_j)\, f(x_i, y_j) & \text{for the discrete case,} \\[8pt]
\displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy & \text{for the continuous case.}
\end{cases}
$$
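As a concrete numerical illustration of the discrete form, the following Python sketch evaluates $E[g(X, Y)]$ for a small joint probability mass function; the values of x, y and f below are purely illustrative, not taken from the text:

```python
import numpy as np

# Hypothetical discrete joint pmf: f[i, j] = P(X = x[i], Y = y[j]).
# The values below are illustrative only; the entries of f sum to 1.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0])
f = np.array([[0.10, 0.20],
              [0.30, 0.10],
              [0.20, 0.10]])

def expectation(g, x, y, f):
    """E[g(X, Y)] = sum_i sum_j g(x_i, y_j) f(x_i, y_j)."""
    xx, yy = np.meshgrid(x, y, indexing="ij")
    return np.sum(g(xx, yy) * f)

print(expectation(lambda u, v: u * v, x, y, f))  # E[XY] for this pmf
```

Because g is passed in as a function, the same routine evaluates $E[XY]$, $E[X + Y]$, or any other function of the two variables.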
               Means. The means of X and Y are defined respectively as the expectation values of the variables
               X and Y. Thus, the mean of X is given by

$$
E(X) = \mu_X =
\begin{cases}
\displaystyle\sum_i \sum_j x_i\, f(x_i, y_j) & \text{for the discrete case,} \\[8pt]
\displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x\, f(x, y)\, dx\, dy & \text{for the continuous case.}
\end{cases}
\tag{7.7}
$$
               E(Y ) is obtained in a similar manner.
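In the discrete case, summing the joint pmf over j first produces the marginal $f_X(x_i) = \sum_j f(x_i, y_j)$, so $\mu_X$ reduces to a single sum. A minimal Python sketch, reusing the same illustrative pmf as above:

```python
import numpy as np

# Same illustrative joint pmf as in the earlier sketch (hypothetical values).
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0])
f = np.array([[0.10, 0.20],
              [0.30, 0.10],
              [0.20, 0.10]])

# mu_X = sum_i sum_j x_i f(x_i, y_j) = sum_i x_i f_X(x_i),
# where f_X(x_i) = sum_j f(x_i, y_j) is the marginal pmf of X.
mu_X = np.sum(x * f.sum(axis=1))
mu_Y = np.sum(y * f.sum(axis=0))
print(mu_X, mu_Y)  # 1.0, 1.8 for these illustrative values
```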
               Example 7.2. Show that if X and Y are independent random variables then

$$
E(XY) = E(X)E(Y).
$$



Solution. Let us consider the case where X and Y are continuous random variables. Since X and Y are independent, $f(x, y) = f_X(x)\, f_Y(y)$, so that

$$
E(XY) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x y\, f_X(x)\, f_Y(y)\, dx\, dy
= \int_{-\infty}^{\infty} x f_X(x)\, dx \int_{-\infty}^{\infty} y f_Y(y)\, dy = E(X)E(Y).
$$
               An analogous proof exists for the discrete case.
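In the discrete case the factorization works the same way: the double sum $\sum_i \sum_j x_i y_j\, f_X(x_i) f_Y(y_j)$ splits into a product of single sums. A short Python check, using hypothetical marginals, confirms this numerically:

```python
import numpy as np

# Independence: f(x_i, y_j) = f_X(x_i) f_Y(y_j). Build the joint pmf as an
# outer product of two illustrative marginals and check E(XY) = E(X) E(Y).
x, fX = np.array([0.0, 1.0, 2.0]), np.array([0.2, 0.5, 0.3])
y, fY = np.array([1.0, 3.0]),      np.array([0.6, 0.4])
f = np.outer(fX, fY)                 # joint pmf of independent X and Y

E_XY = np.sum(np.outer(x, y) * f)    # sum_i sum_j x_i y_j f(x_i, y_j)
E_X, E_Y = np.sum(x * fX), np.sum(y * fY)
print(E_XY, E_X * E_Y)               # both 1.98
```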


               Variances. The definitions of the variances of X and Y are analogous to those for the single-
               variable case, i.e. the variance of X is given by
$$
\operatorname{Var}(X) = \sigma_X^2 =
\begin{cases}
\displaystyle\sum_i \sum_j (x_i - \mu_X)^2 f(x_i, y_j) & \text{for the discrete case,} \\[8pt]
\displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x - \mu_X)^2 f(x, y)\, dx\, dy & \text{for the continuous case.}
\end{cases}
\tag{7.8}
$$
               Equivalent definitions exist for the variance of Y.
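As with the mean, the double sum in (7.8) collapses onto the marginal of X, since $(x_i - \mu_X)^2$ does not depend on j. A brief sketch with the same illustrative pmf:

```python
import numpy as np

# Var(X) = sum_i sum_j (x_i - mu_X)^2 f(x_i, y_j), computed via the
# marginal of X; joint pmf values are the same illustrative ones as before.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0])
f = np.array([[0.10, 0.20],
              [0.30, 0.10],
              [0.20, 0.10]])

fX = f.sum(axis=1)                   # marginal pmf of X
mu_X = np.sum(x * fX)
var_X = np.sum((x - mu_X) ** 2 * fX)
print(mu_X, var_X)                   # 1.0, 0.6 for these values
```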
               Covariance and correlation. Means and variances of joint distributions provide useful
               information about their marginal distributions, but we have not yet given any indication of how
               to measure the relationship between the two random variables. Of course, it may be that the two
               random variables are independent, but often this is not so. For example, if we measure the
               heights and weights of a sample of people we would not be surprised to find a tendency for tall
               people to be heavier than short people and vice versa. We will show in this section that two
               functions, the covariance and the correlation, can be defined for a bivariate distribution and
               that these are useful in characterizing the relationship between the two random variables.
                   The covariance of two random variables X and Y is defined by

$$
\operatorname{Cov}[X, Y] = E[(X - \mu_X)(Y - \mu_Y)],
\tag{7.9}
$$

where $\mu_X$ and $\mu_Y$ are the expectation values of X and Y respectively. Closely related to the covariance is the correlation of the two random variables, defined by

$$
\operatorname{Corr}[X, Y] = \frac{\operatorname{Cov}[X, Y]}{\sigma_X \sigma_Y},
\tag{7.10}
$$

where $\sigma_X$ and $\sigma_Y$ are the standard deviations of X and Y respectively.
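Putting definitions (7.8)–(7.10) together, the following sketch computes the covariance and correlation of the illustrative discrete pmf used above; the negative correlation it reports reflects only the particular hypothetical values chosen:

```python
import numpy as np

# Cov[X, Y] = E[(X - mu_X)(Y - mu_Y)], Corr[X, Y] = Cov / (sigma_X sigma_Y),
# computed directly from the same illustrative joint pmf as before.
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0])
f = np.array([[0.10, 0.20],
              [0.30, 0.10],
              [0.20, 0.10]])

mu_X = np.sum(x * f.sum(axis=1))
mu_Y = np.sum(y * f.sum(axis=0))
xx, yy = np.meshgrid(x, y, indexing="ij")
cov = np.sum((xx - mu_X) * (yy - mu_Y) * f)
sigma_X = np.sqrt(np.sum((x - mu_X) ** 2 * f.sum(axis=1)))
sigma_Y = np.sqrt(np.sum((y - mu_Y) ** 2 * f.sum(axis=0)))
print(cov, cov / (sigma_X * sigma_Y))  # -0.20 and about -0.26
```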
