
STATISTICS: MODULE 12122

Chapter 3 - Bivariate or joint probability distributions

In this chapter we consider the joint distribution of two random variables, first where both random variables are discrete and then, perhaps more importantly, where both are continuous. Bivariate or joint distributions model the way two random variables vary together.

A. DISCRETE VARIABLES

Example 3.1

Here we have a probability model of the demand and supply of a perishable

commodity. The probability model/distribution is defined as follows:

Supply of commodity (SP)
1 / 2 / 3
0 / 0.015 / 0.025 / 0.010
Demand for / 1 / 0.045 / 0.075 / 0.030
commodity / 2 / 0.195 / 0.325 / 0.130
(D) / 3 / 0.030 / 0.050 / 0.020
4 / 0.015 / 0.025 / 0.010

This is known as a discrete bivariate or joint probability distribution since there are two random variables, namely "demand for commodity (D)" and "supply of commodity (SP)".

The sample space S consists of 15 outcomes (d, s) where d and s are the values of D and SP.

The probabilities in the table are joint probabilities, namely P(D = d and SP = s), or P(D = d ∩ SP = s) using set notation.

Examples

Note: The sum of the 15 probabilities is 1.
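For readers who like to check such tables by machine, here is a minimal Python sketch (the dictionary keyed by pairs (d, s) is our own layout, not part of the notes):

    # Joint probabilities P(D = d and SP = s) from the table above.
    p = {(0, 1): 0.015, (0, 2): 0.025, (0, 3): 0.010,
         (1, 1): 0.045, (1, 2): 0.075, (1, 3): 0.030,
         (2, 1): 0.195, (2, 2): 0.325, (2, 3): 0.130,
         (3, 1): 0.030, (3, 2): 0.050, (3, 3): 0.020,
         (4, 1): 0.015, (4, 2): 0.025, (4, 3): 0.010}

    print(sum(p.values()))   # 1.0 (up to floating-point rounding)
    print(p[(2, 2)])         # P(D = 2 and SP = 2) = 0.325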

3.2 Joint probability function

Suppose the random variables are X and Y; then the joint probability function is denoted by p(x, y) and is defined as follows:

p(x, y) = P(X = x and Y = y) or P(X = x ∩ Y = y) using set notation.

Also Σ_x Σ_y p(x, y) = 1.

3.3 Marginal probability distributions

The marginal distributions are the distributions of X and Y considered separately

and model how X and Y vary separately from each other. Suppose the probability

functions of X and Y are p_X(x) and p_Y(y) respectively, so that

p_X(x) = P(X = x) and p_Y(y) = P(Y = y).

Also p_X(x) = Σ_y p(x, y) and p_Y(y) = Σ_x p(x, y).

It is quite straightforward to obtain these from the joint probability distribution.

Example 3.2 The joint distribution of X and Y is

                        X
              -2      -1       0       1       2
   Y    10   0.09    0.15    0.27    0.25    0.04
        20   0.01    0.05    0.08    0.05    0.01

Find the marginal distributions of X and Y.
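A minimal Python sketch of this computation (the dictionary layout is ours, not part of the notes); each marginal probability is just a row or column total of the joint table:

    # Joint probabilities p(x, y) from Example 3.2, keyed by (x, y).
    p = {(-2, 10): 0.09, (-1, 10): 0.15, (0, 10): 0.27, (1, 10): 0.25, (2, 10): 0.04,
         (-2, 20): 0.01, (-1, 20): 0.05, (0, 20): 0.08, (1, 20): 0.05, (2, 20): 0.01}

    pX, pY = {}, {}
    for (x, y), prob in p.items():
        pX[x] = pX.get(x, 0.0) + prob   # summing over y gives the marginal of X
        pY[y] = pY.get(y, 0.0) + prob   # summing over x gives the marginal of Y

    print(pX)   # {-2: 0.10, -1: 0.20, 0: 0.35, 1: 0.30, 2: 0.05}
    print(pY)   # {10: 0.80, 20: 0.20}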

3.4 Conditional probability distributions

In regression problems we are very interested in conditional probability distributions

such as the conditional distribution of X given Y = y and the conditional distribution

of Y given X = x.

The conditional probability function of X given Y = y is denoted by p_{X|Y}(x|y) and is defined as

p_{X|Y}(x|y) = P(X = x | Y = y) = P(X = x and Y = y)/P(Y = y) = p(x, y)/p_Y(y),

whereas the conditional probability function of Y given X = x is denoted by p_{Y|X}(y|x) and defined as

p_{Y|X}(y|x) = P(Y = y | X = x) = P(X = x and Y = y)/P(X = x) = p(x, y)/p_X(x).

Example 3.3 For the distribution given in Example 3.2, find the conditional distribution of X given Y = 20.
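A sketch of the same computation in Python (layout as in the earlier sketch); each conditional probability is the joint probability in the Y = 20 row divided by P(Y = 20):

    p = {(-2, 10): 0.09, (-1, 10): 0.15, (0, 10): 0.27, (1, 10): 0.25, (2, 10): 0.04,
         (-2, 20): 0.01, (-1, 20): 0.05, (0, 20): 0.08, (1, 20): 0.05, (2, 20): 0.01}

    pY20 = sum(prob for (x, y), prob in p.items() if y == 20)   # P(Y = 20) = 0.20
    cond = {x: p[(x, 20)] / pY20 for x in (-2, -1, 0, 1, 2)}
    print(cond)   # {-2: 0.05, -1: 0.25, 0: 0.4, 1: 0.25, 2: 0.05}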

3.5 Joint probability distribution function

The joint (cumulative) probability distribution function (c.d.f.) is denoted by F(x, y) and is defined as

F(x, y) = P(X ≤ x and Y ≤ y), and 0 ≤ F(x, y) ≤ 1.

The marginal c.d.f.'s are denoted by F_X(x) and F_Y(y) and are defined as follows:

F_X(x) = P(X ≤ x) and F_Y(y) = P(Y ≤ y) (see Chapter 1).

3.6 Are X and Y independent?

If either (a) F(x, y) = F_X(x) F_Y(y) for all x, y

or (b) p(x, y) = p_X(x) p_Y(y) for all x, y

then X and Y are independent random variables.

Example 3.4 For the distribution given in Example 3.2 determine if X and Y

are independent.
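A machine check using criterion (b) of section 3.6, again as a Python sketch: compare every joint probability with the product of its marginals.

    p = {(-2, 10): 0.09, (-1, 10): 0.15, (0, 10): 0.27, (1, 10): 0.25, (2, 10): 0.04,
         (-2, 20): 0.01, (-1, 20): 0.05, (0, 20): 0.08, (1, 20): 0.05, (2, 20): 0.01}

    pX, pY = {}, {}
    for (x, y), prob in p.items():
        pX[x] = pX.get(x, 0.0) + prob
        pY[y] = pY.get(y, 0.0) + prob

    independent = all(abs(prob - pX[x] * pY[y]) < 1e-9
                      for (x, y), prob in p.items())
    print(independent)   # False: e.g. p(-2, 10) = 0.09 but pX(-2)*pY(10) = 0.08

A single failing cell is enough to rule out independence.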

B. CONTINUOUS VARIABLES

3.7 Joint probability density function

The joint p.d.f. is denoted by f(x, y) (where f(x, y) ≥ 0 for all x and y) and defines a probability surface in 3 dimensions. Probability is a volume under this surface and the total volume under the p.d.f. surface is 1, as the total probability is 1, i.e.

∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1 and P(a ≤ X ≤ b and c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy.

As before with discrete variables, the marginal distributions are the distributions of

X and Y considered separately and model how X and Y vary separately from each

other.

Whereas with discrete random variables we speak of marginal probability functions,

with continuous random variables we speak of marginal probability density

functions.

3.8 Marginal probability density function

The marginal p.d.f. of X is denoted by f_X(x) and is the equation of a curve called the p.d.f. curve of X. P(a ≤ X ≤ b) is an area under the p.d.f. curve and so

P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx (as in Chapter 1).

It can be obtained from the joint p.d.f. by a single integration, as follows:

f_X(x) = ∫_{-∞}^{∞} f(x, y) dy.

The marginal p.d.f. of Y is denoted by f_Y(y) and is the equation of a curve called the p.d.f. curve of Y. P(c ≤ Y ≤ d) is an area under the p.d.f. curve and so

P(c ≤ Y ≤ d) = ∫_c^d f_Y(y) dy.

It can be obtained from the joint p.d.f. by a single integration, as follows:

f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx.

3.9 Joint probability distribution function

As in 3.5, the joint (cumulative) probability distribution function (c.d.f.) is denoted by F(x, y) and is defined as F(x, y) = P(X ≤ x and Y ≤ y).

The joint c.d.f. F(x, y) in the continuous case is the volume under the p.d.f. surface from -∞ to X = x and from -∞ to Y = y, so that

F(x, y) = ∫_{-∞}^y ∫_{-∞}^x f(u, v) du dv, so f(x, y) = ∂²F(x, y)/∂x∂y.

The marginal c.d.f.'s are defined as in 3.5 and can be obtained from the joint distribution function F(x, y) as follows:

F_X(x) = F(x, ∞) = F(x, y_max), where y_max is the largest value of y,

F_Y(y) = F(∞, y) = F(x_max, y), where x_max is the largest value of x.

Example 3.5 The joint distribution function of X and Y is given by

F(x, y) = … over its range of values, F(x, y) = 0 otherwise.

Find the joint density function and the marginal distribution functions of X and Y.

3.10 Important connections between the p.d.f.'s and the joint c.d.f.

(i) The joint p.d.f. f(x, y) = ∂²F(x, y)/∂x∂y, where F(x, y) is the joint c.d.f.

(ii) Not only can the marginal p.d.f.'s be obtained from the joint p.d.f. (see section 3.8) but also from the marginal c.d.f.'s, as follows:

the marginal p.d.f. of X is f_X(x) = dF_X(x)/dx or F_X'(x),

the marginal p.d.f. of Y is f_Y(y) = dF_Y(y)/dy or F_Y'(y).

Example 3.6 Find the marginal p.d.f.'s of X and Y for the distribution given in Example 3.5.

Given: joint density fn. f(x, y)       --- integrate twice --->                     joint distribution fn. F(x, y)

Given: joint distribution fn. F(x, y)  --- differentiate (partially) twice --->     joint density fn. f(x, y)
                                           (cross-partial derivative)
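This round trip can be verified symbolically. A sketch using sympy, with a toy joint density f(u, v) = 4uv on the unit square chosen purely for illustration (it is not from these notes):

    import sympy as sp

    x, y, u, v = sp.symbols('x y u v', positive=True)
    f = 4 * u * v   # toy joint density on 0 < u < 1, 0 < v < 1

    # Integrate twice: F(x, y) is the integral of f over (0, x) x (0, y).
    F = sp.integrate(sp.integrate(f, (u, 0, x)), (v, 0, y))
    print(F)                  # x**2*y**2

    # Differentiate partially twice (the cross-partial) to recover f.
    print(sp.diff(F, x, y))   # 4*x*y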

3.11 Conditional probability density functions

The conditional p.d.f. of X given Y = y is denoted by f_{X|Y}(x|y) and defined as

f_{X|Y}(x|y) = f(x, y)/f_Y(y),

whereas the conditional p.d.f. of Y given X = x is denoted by f_{Y|X}(y|x) and defined as

f_{Y|X}(y|x) = f(x, y)/f_X(x).

3.12 Are X and Y independent?

X and Y are independent random variables if either

(a) F(x, y) = F_X(x) F_Y(y) for all x, y; or

(b) f(x, y) = f_X(x) f_Y(y) for all x, y; or

(c) f_{X|Y}(x|y) = a function of x only, or equivalently f_{Y|X}(y|x) = a function of y only.

Example 3.7 X and Y have the joint probability density function

f(x, y) = 3x²/7 for 1 ≤ x ≤ 2 and 1 ≤ y ≤ 2, f(x, y) = 0 otherwise.

(a) Derive the marginal distribution function of X.

(b) Derive the conditional density function of X given Y = y

(see worked example 3.7(b)).

(c) Are X and Y independent?

(see worked example 3.7(c))

Worked example 3.7 (b) and (c)

Solution (b) From section 3.11 the conditional p.d.f. of X given Y = y is defined as

f_{X|Y}(x|y) = f(x, y)/f_Y(y), where f(x, y) is the joint p.d.f. of X and Y and f_Y(y) is the marginal p.d.f. of Y.

We are given f(x, y) = 3x²/7, so we need to find f_Y(y).

From section 3.8, f_Y(y) = ∫ f(x, y) dx,

so f_Y(y) = ∫_1^2 (3x²/7) dx = (3/7)[x³/3]_1^2 = (3/7)(8/3 - 1/3) = 1 for 1 ≤ y ≤ 2.

Now, therefore, the conditional density function of X given Y = y is given by

f_{X|Y}(x|y) = f(x, y)/f_Y(y) = (3x²/7)/1 = 3x²/7.

So f_{X|Y}(x|y) = 3x²/7 for 1 ≤ x ≤ 2 and 1 ≤ y ≤ 2,
f_{X|Y}(x|y) = 0 otherwise.

(c) Now f_{X|Y}(x|y) = 3x²/7 is a function of x only, so using result 3.12(c), X and Y are independent. Notice also that f_{X|Y}(x|y) = f_X(x) (integrating f(x, y) over 1 ≤ y ≤ 2 gives f_X(x) = 3x²/7), which you would expect if X and Y were independent.
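The algebra above can be confirmed symbolically; a short sympy sketch:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(3, 7) * x**2       # joint density from Example 3.7 on [1, 2] x [1, 2]

    fY = sp.integrate(f, (x, 1, 2))    # marginal of Y: equals 1 on [1, 2]
    fX = sp.integrate(f, (y, 1, 2))    # marginal of X: 3*x**2/7
    print(fY, fX)
    print(sp.simplify(f / fY - fX))    # 0: the conditional p.d.f. equals the marginal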

Other examples of joint or bivariate distributions

Example 3.8

An electronics system has one of each of two different types of components in joint operation. Let X and Y denote the random lengths of life of the components of type 1 and 2, respectively. Their joint density function is given by f(x, y) = …

Example 3.9 The bivariate Normal distribution

The random variables X and Y have a bivariate normal distribution if

f(x, y) = 1/(2π σ_X σ_Y √(1 - ρ²)) exp{ -1/(2(1 - ρ²)) [ ((x - μ_X)/σ_X)² - 2ρ((x - μ_X)/σ_X)((y - μ_Y)/σ_Y) + ((y - μ_Y)/σ_Y)² ] }

for -∞ < x < ∞ and -∞ < y < ∞,

where μ_X = E(X), μ_Y = E(Y), σ_X² = Var(X), σ_Y² = Var(Y) and ρ = corr(X, Y), -1 < ρ < 1.

The p.d.f. surface is shown below and as you can see is bell-shaped.
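The surface can be evaluated numerically with scipy; a sketch, where the parameter values are arbitrary illustrations:

    from scipy.stats import multivariate_normal

    mu_x, mu_y = 0.0, 0.0               # means (illustrative values)
    sig_x, sig_y, rho = 1.0, 2.0, 0.5   # standard deviations and correlation
    cov = [[sig_x**2,            rho * sig_x * sig_y],
           [rho * sig_x * sig_y, sig_y**2           ]]

    rv = multivariate_normal(mean=[mu_x, mu_y], cov=cov)
    print(rv.pdf([mu_x, mu_y]))         # height of the bell-shaped surface at its peak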

3.13 Expectations and variances

Discrete random variables

E(X^r) = Σ_x x^r p_X(x) = Σ_x Σ_y x^r p(x, y), r = 1, 2, ...

E(Y^r) = Σ_y y^r p_Y(y) = Σ_x Σ_y y^r p(x, y), r = 1, 2, ...

Examples

Hence Var(X) = E(X²) - [E(X)]², Var(Y) = E(Y²) - [E(Y)]², etc.

Continuous random variables

E(X^r) = ∫_{-∞}^{∞} x^r f_X(x) dx, r = 1, 2, ...

E(Y^r) = ∫_{-∞}^{∞} y^r f_Y(y) dy, r = 1, 2, ...

Examples

3.14 Expectation of a function of the r.v.'s X and Y

For a function g(X, Y) of the random variables X and Y:

Continuous X and Y:  E[g(X, Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x, y) f(x, y) dx dy

Discrete X and Y:    E[g(X, Y)] = Σ_x Σ_y g(x, y) p(x, y)

In particular, E[XY] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x y f(x, y) dx dy or Σ_x Σ_y x y p(x, y).

3.15 Covariance and correlation

Covariance of X and Y is defined as follows:

Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]

An alternative (expanded) form is

Cov(X, Y) = E(XY) - μ_X μ_Y, where μ_X = E(X) and μ_Y = E(Y).

Notes

(a) If the random variables increase together or decrease together, then the covariance will be positive, whereas if one random variable increases as the other decreases, the covariance will be negative.

(b) If X and Y are independent random variables, then E(XY) = E(X)E(Y)

so cov(X, Y) = 0.

However, if Cov(X, Y) = 0 it does not follow that X and Y are independent, unless X and Y are jointly (bivariate) normal random variables.

(c) Cov(X, X) = Var(X).

The coefficient of correlation between X and Y is

ρ = Cov(X, Y)/(σ_X σ_Y) = corr(X, Y),

where σ_X = √Var(X) and σ_Y = √Var(Y).

Notes

(a) The correlation coefficient is a number between -1 and 1, i.e. -1 ≤ ρ ≤ 1.

(b) If the random variables increase together or decrease together, then ρ will be positive, whereas if one random variable increases as the other decreases, then ρ will be negative.

(c) ρ measures the degree of linear relationship between the two random variables X and Y. If X and Y are independent then ρ = 0, and ρ can also be 0 when X and Y are related in a purely non-linear way; so ρ = 0 does not by itself imply independence.

Example 3.10 In Example 3.2 are X and Y correlated?

Solution Below is the joint or bivariate probability distribution of X and Y:

                        X
              -2      -1       0       1       2
   Y    10   0.09    0.15    0.27    0.25    0.04
        20   0.01    0.05    0.08    0.05    0.01

From Example 3.2(a) the marginal distributions of X and Y are

   x                      -2      -1       0       1       2     Total
   P(X = x) or p_X(x)    0.10    0.20    0.35    0.30    0.05    1.00

   y                      10      20     Total
   P(Y = y) or p_Y(y)    0.80    0.20    1.00
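The remaining arithmetic (covariance and correlation) is routine; a Python sketch of it (the printed values follow directly from the table):

    p = {(-2, 10): 0.09, (-1, 10): 0.15, (0, 10): 0.27, (1, 10): 0.25, (2, 10): 0.04,
         (-2, 20): 0.01, (-1, 20): 0.05, (0, 20): 0.08, (1, 20): 0.05, (2, 20): 0.01}

    EX  = sum(x * prob for (x, y), prob in p.items())       # 0
    EY  = sum(y * prob for (x, y), prob in p.items())       # 12
    EXY = sum(x * y * prob for (x, y), prob in p.items())   # 0
    VarX = sum(x**2 * prob for (x, y), prob in p.items()) - EX**2
    VarY = sum(y**2 * prob for (x, y), prob in p.items()) - EY**2

    cov = EXY - EX * EY
    rho = cov / (VarX**0.5 * VarY**0.5)
    print(cov, rho)   # both 0 (up to rounding): X and Y are uncorrelated

So Cov(X, Y) = 0 and hence ρ = 0 here, even though the independence check after Example 3.4 shows that X and Y are not independent: uncorrelated does not imply independent (note (b) of section 3.15).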

Example 3.11 In Example 3.7 calculate E(X), Var(X), E(Y) and Cov(X, Y).

(see Worked Example 3.11)

3.16 Useful results on expectations and variances

(i) E(aX + bY) = a E(X) + b E(Y), where a and b are constants.

(ii) Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)

Result (i) can be extended to any n random variables:

E(a_1X_1 + a_2X_2 + ... + a_nX_n) = a_1E(X_1) + a_2E(X_2) + ... + a_nE(X_n)

When X and Y are independent, then

(iii) E(XY) = E(X)E(Y)

(iv) Cov(X, Y) = 0, so Var(aX + bY) = a² Var(X) + b² Var(Y)

Results (iii) and (iv) can be extended to any n independent random variables:

(iii) E(X_1 X_2 ... X_n) = E(X_1)E(X_2)...E(X_n)

(iv) Var(a_1X_1 + a_2X_2 + ... + a_nX_n) = a_1² Var(X_1) + a_2² Var(X_2) + ... + a_n² Var(X_n)

(v) If U = aX + bY and V = cX + dY, where a, b, c and d are constants, then

Cov(U, V) = ac Var(X) + bd Var(Y) + (ad + bc) Cov(X, Y).
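These identities can be checked numerically on any joint table. A sketch using the Example 3.2 distribution, with a = 2 and b = 3 chosen arbitrarily:

    p = {(-2, 10): 0.09, (-1, 10): 0.15, (0, 10): 0.27, (1, 10): 0.25, (2, 10): 0.04,
         (-2, 20): 0.01, (-1, 20): 0.05, (0, 20): 0.08, (1, 20): 0.05, (2, 20): 0.01}
    a, b = 2.0, 3.0

    EX   = sum(x * q for (x, y), q in p.items())
    EY   = sum(y * q for (x, y), q in p.items())
    VarX = sum(x**2 * q for (x, y), q in p.items()) - EX**2
    VarY = sum(y**2 * q for (x, y), q in p.items()) - EY**2
    cov  = sum(x * y * q for (x, y), q in p.items()) - EX * EY

    W_mean = sum((a * x + b * y) * q for (x, y), q in p.items())
    W_var  = sum((a * x + b * y)**2 * q for (x, y), q in p.items()) - W_mean**2

    print(W_mean, a * EX + b * EY)                             # equal: result (i)
    print(W_var, a**2 * VarX + b**2 * VarY + 2 * a * b * cov)  # equal: result (ii)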

3.17 Conditional means or expectations

Conditional means are very important in regression analysis, and hence in Econometrics. They are the means of the conditional distributions of X and Y.

The mean of the conditional distribution of Y given X = x is denoted by E(Y | X = x).

The mean of the conditional distribution of X given Y = y is denoted by E(X | Y = y).

From section 3.11, the conditional p.d.f. of X given Y = y is denoted by f_{X|Y}(x|y) and defined as

f_{X|Y}(x|y) = f(x, y)/f_Y(y),

whereas the conditional p.d.f. of Y given X = x is denoted by f_{Y|X}(y|x) and defined as

f_{Y|X}(y|x) = f(x, y)/f_X(x).

The conditional means are defined as follows:

E(X | Y = y) = ∫ x f_{X|Y}(x|y) dx

E(Y | X = x) = ∫ y f_{Y|X}(y|x) dy

An example from regression analysis

The regression curve of Y on X is given by y = E(Y | X = x) and the regression curve of X on Y is given by x = E(X | Y = y).

Example 3.12 Find the regression curve of Y on X given the joint p.d.f. f(x, y), where

f(x, y) = x e^(-xy) for 0 < x < 1 and y > 0, f(x, y) = 0 otherwise.

Solution The regression curve of Y on X is given by y = E(Y | X = x) = ∫ y f_{Y|X}(y|x) dy,

where f_{Y|X}(y|x), the conditional density function of Y given X = x, is given by

f_{Y|X}(y|x) = f(x, y)/f_X(x).

Here f_X(x) is the marginal density function of X and we need to evaluate this. Integrating out y to get the marginal density function of X gives

f_X(x) = ∫_0^∞ x e^(-xy) dy.

In the integration, which is with respect to y, x is treated as a constant. Call it c, say.

So f_X(x) = ∫_0^∞ c e^(-cy) dy = [-e^(-cy)]_0^∞ = 1.

So f_X(x) = 1 for 0 < x < 1.

Hence the conditional density of Y given X = x is given by

f_{Y|X}(y|x) = f(x, y)/f_X(x) = x e^(-xy) for y > 0,
f_{Y|X}(y|x) = 0 otherwise.

Hence E(Y | X = x) = ∫_0^∞ y x e^(-xy) dy.

Again, in the integration with respect to y, x is treated as a constant. Call it c again:

E(Y | X = x) = ∫_0^∞ c y e^(-cy) dy.

To evaluate this we must integrate by parts. From section 1.14, Chapter 1, QM1, we have

∫ u (dv/dx) dx = u v - ∫ v (du/dx) dx.

Here we are integrating with respect to y, so this becomes

∫ u (dv/dy) dy = u v - ∫ v (du/dy) dy.

The idea in integration by parts is to choose u and v carefully so that the integral on the right-hand side of the equation is simpler than the integral on the left-hand side.

Choose u = y so that du/dy = 1, and choose dv/dy = c e^(-cy) so that integrating gives v = -e^(-cy).

So E(Y | X = x) = [-y e^(-cy)]_0^∞ + ∫_0^∞ e^(-cy) dy = 0 + [-e^(-cy)/c]_0^∞ = 1/c = 1/x.

Hence the regression curve of Y on X has equation y = 1/x, which is the equation of a rectangular hyperbola (see Fig 1 below).
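A quick Monte Carlo check of this curve, as a Python sketch (it assumes the density given above, under which X is uniform on (0, 1) and, given X = x, Y is exponential with rate x):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, size=200_000)   # X ~ uniform(0, 1): the marginal found above
    y = rng.exponential(scale=1.0 / x)        # given X = x, Y has density x e^(-xy), mean 1/x

    near_half = (x > 0.45) & (x < 0.55)       # condition on X close to 0.5
    print(y[near_half].mean())                # close to 1/0.5 = 2, as y = 1/x predicts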

The regression curve of X on Y is given by x = E(X | Y = y) = ∫ x f_{X|Y}(x|y) dx,

where f_{X|Y}(x|y), the conditional density function of X given Y = y, is given by

f_{X|Y}(x|y) = f(x, y)/f_Y(y),

where f_Y(y) is the marginal density function of Y.

It can be shown that E(X | Y = y) = [2 - (y² + 2y + 2)e^(-y)] / [y(1 - (1 + y)e^(-y))].

The regression curve of X on Y is shown below in Figure 2.

Important result for conditional means

E[E(Y | X)] = E(Y) and E[E(X | Y)] = E(X) (the law of iterated expectations).

Reading for Chapter 3

Relevant parts of sections 2.6 and 2.7 of Essentials of Econometrics by D. Gujarati (see module outline for publisher's details etc.).

Worked Example 3.11

In Example 3.7, f(x, y) = 3x²/7 for 1 ≤ x ≤ 2 and 1 ≤ y ≤ 2,
f(x, y) = 0 otherwise.

From Example 3.7(a), f_X(x) = 3x²/7 for 1 ≤ x ≤ 2
and f_Y(y) = 1 for 1 ≤ y ≤ 2.

E(X) = ∫_1^2 x (3x²/7) dx = (3/7)[x⁴/4]_1^2 = 45/28 or 1.6071

Var(X) = E(X²) - [E(X)]², where

E(X²) = ∫_1^2 x² (3x²/7) dx = (3/7)[x⁵/5]_1^2 = 93/35 or 2.657

Hence Var(X) = 2.657 - (1.6071)² = 0.0742

E(Y) = ∫_1^2 y (1) dy = [y²/2]_1^2 = 3/2 = 1.5

Cov(X, Y) = E(XY) - E(X)E(Y), where

E(XY) = ∫_1^2 ∫_1^2 x y (3x²/7) dx dy.

However you do not have to work this out, because from Example 3.7(c), we have that

X and Y are independent, hence E(XY) = E(X)E(Y) and so Cov(X, Y) = 0.

If you feel like getting practice in double integration, you can verify this.
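Alternatively, let sympy do the double integration; a sketch using the density from Example 3.7:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(3, 7) * x**2    # joint density on 1 <= x <= 2, 1 <= y <= 2

    EX  = sp.integrate(x * f, (x, 1, 2), (y, 1, 2))      # 45/28
    EY  = sp.integrate(y * f, (x, 1, 2), (y, 1, 2))      # 3/2
    EXY = sp.integrate(x * y * f, (x, 1, 2), (y, 1, 2))  # 135/56
    print(EX, EY, EXY, sp.simplify(EXY - EX * EY))       # the covariance is 0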

Hence E(X) = 1.6071, Var(X) = 0.0742, E(Y) = 1.5 and Cov(X, Y) = 0.

C. Osborne February 2002