Math 309 Test 3 Carter Name______

Show all work in order to receive credit.

1. Let f(x) = kx for 0 ≤ x ≤ 2, and f(x) = 0 otherwise. (20 pts)

a)  Find k so that f(x) is a valid probability density function for a random variable X.

b)  Find P( X < 0.3)

c)  Find P( X < 0.3 | X < 1)

d)  Find the mean and variance of X.

2. Emily’s commute to school varies randomly between 22 and 29 minutes. If she leaves at 7:35 a.m. for an 8 a.m. class, what is the probability that she is on time? (7 pts)

3. Use the probability density function to find the cumulative distribution function (cdf) for an exponential random variable with mean θ. (7 pts)

4. The number of accidents in a factory can be modeled by a Poisson process averaging 2 accidents per week.

a) Find the probability that the time between successive accidents is more than 1 week.

b) Let W denote the waiting time until the third accident occurs. Find the mean, variance, and the probability density function of W.

c) Find the probability that it takes less than one week for three accidents to occur. (Setting up the integral is sufficient.) (18 pts)

5. The annual rainfall in a certain region is normally distributed with mean 29.5 inches and standard deviation 2.5 inches.

a) Find the probability that the annual rainfall is between 28 and 29 inches.

b) What amount of rainfall is exceeded only 1% of the time? (14 pts)

6. The proportion of pure iron in certain ore samples has a beta distribution with parameters α and β.

a) Find the probability that one of these samples will have more than 40% pure iron.

b) Find the probability that exactly three out of four samples will have more than 40% pure iron. (14 pts)

7. Select one of the following to prove: (10 pts)

a)  If X has an exponential distribution with parameter λ, then P(X > a+b | X > a) = P( X > b).

b)  Let Γ(a) = ∫ x^(a-1) e^(-x) dx, integrating from 0 to ∞. If a > 0, then Γ(a+1) = aΓ(a).

c)  Derive the mean of the Beta distribution

d)  Show that ∫ f(x) dx = 1 (integrating from 0 to ∞), where f(x) is the pdf of the gamma distribution.

Answers:
1. k = 0.5; .0225; .09; mean 4/3, variance 2/9
2. 3/7
3. F(x) = 1 - e^(-x/θ) for x ≥ 0; F(x) = 0 for x < 0
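The answers to problems 1 and 2 can be spot-checked numerically. This sketch assumes, consistent with k = 0.5 and the listed answers, that problem 1's density is f(x) = x/2 on [0, 2]:

```python
from math import isclose

# Problem 1: assumed density f(x) = x/2 on [0, 2] (matches k = 0.5).
def f(x):
    return 0.5 * x

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0, 2)                                   # 1 (valid pdf)
p_b   = integrate(f, 0, 0.3)                                 # P(X < 0.3) = .0225
p_c   = p_b / integrate(f, 0, 1)                             # P(X < .3 | X < 1) = .09
mean  = integrate(lambda x: x * f(x), 0, 2)                  # 4/3
var   = integrate(lambda x: x * x * f(x), 0, 2) - mean ** 2  # 2/9

# Problem 2: commute uniform on [22, 29]; on time iff commute < 25 minutes.
p_on_time = (25 - 22) / (29 - 22)                            # 3/7

print(total, p_b, p_c, mean, var, p_on_time)
```

Every value agrees with the answer key above.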
4a. Let T = time from the occurrence of one accident until the next accident. T is exponential w/ lambda = 2, so P(T > 1) = e^(-2) ≈ .1353.
4b. W is gamma w/ s = 3 and lambda = 2; mean = s/lambda = 3/2, variance = s/lambda^2 = 3/4, f(w) = 4w^2 e^(-2w) for w > 0.
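A numeric check of 4a-4c. The 4c integral also has a closed form via the Poisson connection: P(W < 1) = P(at least 3 accidents in the first week) = 1 - e^(-2)(1 + 2 + 2):

```python
from math import exp, factorial

lam, s = 2, 3   # rate: 2 accidents per week; waiting for the 3rd accident

# 4a: time between accidents is exponential(lam); P(T > 1) = e^(-lam)
p_4a = exp(-lam)

# 4b: W ~ gamma(s, lam): mean = s/lam, variance = s/lam^2,
#     pdf f(w) = lam^s * w^(s-1) * e^(-lam*w) / (s-1)!
mean_w = s / lam        # 3/2
var_w = s / lam ** 2    # 3/4

def pdf_w(w):
    return lam ** s * w ** (s - 1) * exp(-lam * w) / factorial(s - 1)

# 4c: P(W < 1) = integral of the pdf from 0 to 1 (midpoint rule)
n = 100_000
h = 1 / n
p_4c = sum(pdf_w((i + 0.5) * h) for i in range(n)) * h

print(p_4a, mean_w, var_w, p_4c)
```

The integral matches the closed form 1 - 5e^(-2) ≈ .3233.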
5a) P(28 < X < 29) = P(-.6 < Z < -.2) = P(.2 < Z < .6) = .2257 - .0793 = .1464; 5b) 2.33 = (x - 29.5)/2.5 => x = 35.325
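The normal probabilities can be reproduced without a table, since the standard normal cdf is expressible through the error function:

```python
from math import erf, sqrt

mu, sigma = 29.5, 2.5

def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# 5a: P(28 < X < 29) = Phi(-0.2) - Phi(-0.6), about .1464 (4-digit tables)
p_5a = phi((29 - mu) / sigma) - phi((28 - mu) / sigma)

# 5b: value exceeded 1% of the time: x = mu + z_.01 * sigma with z_.01 = 2.33
x_5b = mu + 2.33 * sigma

print(p_5a, x_5b)
```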
6a) ; b)

Math 309 Test 4 Carter Name______

Show all work in order to receive credit. 11/29/01

1. Consider a die with three equally likely outcomes. A pair of such dice is rolled. Let X denote the sum on the pair of dice and Y denote the “larger” number. The joint probability distribution is in the chart.

a) P(X ≤ 4, Y ≥ 2)

b) P( X = 4)

c)  P(X ≥ 4)

d)  P(Y = 2 | X = 4)

2. Consider the joint density function:

a) Find P(X < ¾, Y < ¼).

b) Find P(X < ¾, Y < ½).

c) Find P(X < ¾ | Y < ½).

d)  Find the conditional density function for X given Y = 1/2.

e)  Find P(X < 3/4 | Y = 1/2 ).

3. Consider the joint density function:

a)  Find the marginal density functions fx(x) and fy(y).

b)  Are X and Y independent? Justify.

c)  Find P( X < Y) . (Set-up is sufficient.)

d)  Set-up an integral that gives E[X-Y].

4. Two friends are to meet at a library. Each arrives randomly at an independently selected time within a fixed one-hour period and agrees to wait no longer than 15 minutes for the other. Find the probability that they will meet. (Set up is sufficient.)
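This is the classic geometric-probability setup: with arrival times uniform on a 60-minute window, the friends meet exactly when |X - Y| ≤ 15, which is the square minus two corner triangles, giving 1 - (45/60)^2 = 7/16. A quick Monte Carlo check:

```python
import random

# Monte Carlo check of the meeting problem: X, Y uniform on [0, 60] minutes,
# independent; the friends meet iff |X - Y| <= 15.
random.seed(1)
trials = 200_000
meet = sum(abs(random.uniform(0, 60) - random.uniform(0, 60)) <= 15
           for _ in range(trials))
print(meet / trials)   # close to 7/16 = 0.4375
```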

5. Explain the statement, “While covariance measures the direction of the association between two random variables, the correlation coefficient measures the strength of the association." Examples are appropriate in your comments.

6. Select one of the following.

a) For either the discrete or continuous case, show that if X and Y are independent, E[XY] = E[X]E[Y].

b) Show that if X and Y are independent, Cov(X, Y) = 0. (Assume (a) & use the definition, Cov(X, Y) = E[(X - μX)(Y - μY)].)

c) Let Y1, Y2, …, Yn be independent random variables with E(Yi) = μ and V(Yi) = σ². Show that

E[X] = nμ and V(X) = nσ², where X = Y1 + Y2 + … + Yn.


Fall 2011

The tests above do not have anything on moment generating functions or on creating new random variables by adding known random variables (e.g. Jack & Jill's bowling scores). Study the assigned hw problems!

Chapter 7 – Expectations - Notes.

E[g(X, Y)] = Σx Σy g(x, y) p(x, y) for a joint discrete distribution.

E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy for a joint continuous distribution.

E[aX + bY] = aE[X] + bE[Y] where E[X] and E[Y] are both finite. (This generalizes to the sum of n random variables.)

Cov(X, Y) = E[XY] - E[X]E[Y]. Note that if X & Y are independent, Cov(X, Y) = 0 (the converse is false).

Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y)
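The identity Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y) can be verified exactly on a small joint pmf. The pmf below is made up for illustration:

```python
# Check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
# on a small (made-up) joint pmf over {0,1} x {0,1}.
pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def expect(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in pmf.items())

a, b = 2, 3
ex, ey = expect(lambda x, y: x), expect(lambda x, y: y)
vx = expect(lambda x, y: (x - ex) ** 2)
vy = expect(lambda x, y: (y - ey) ** 2)
cov = expect(lambda x, y: (x - ex) * (y - ey))

lhs = expect(lambda x, y: (a * x + b * y - (a * ex + b * ey)) ** 2)
rhs = a ** 2 * vx + b ** 2 * vy + 2 * a * b * cov
print(lhs, rhs)   # the two sides agree
```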

Note that if X & Y are independent the variance of their sum is the sum of their variances.

Moment Generating Functions (mgf), M(t) –

Know the definition - M(t) = E[e^(tX)];

Be able to derive mgf for binomial, Poisson, geometric, uniform continuous, exponential r.v.s;

Know how to find the moments of the r.v. from the mgf;

Know how to find the mean & variance of a r.v. from M(t).

The mgf uniquely defines the distribution – so if a mgf has the form of one of our known mgf, then you can find the associated probabilities, etc.
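Since moments are derivatives of M(t) at t = 0, they can be recovered numerically from any mgf. A sketch using the exponential mgf M(t) = lam/(lam - t) for t < lam:

```python
# Recovering moments from a mgf by numerical differentiation at t = 0,
# using the exponential mgf M(t) = lam / (lam - t).
lam = 2.0

def M(t):
    return lam / (lam - t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # M'(0)  = E[X]   = 1/lam
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # M''(0) = E[X^2] = 2/lam^2
var_x = m2 - m1 ** 2                      # variance = 1/lam^2
print(m1, var_x)
```

With lam = 2 this recovers mean 1/2 and variance 1/4, as expected for an exponential r.v.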

M(X+Y)(t) = MX(t)MY(t) for independent X and Y

We used this to show that: the sum of independent r.v.'s with normal distributions is normal,

the sum of independent r.v.'s with Poisson distributions is Poisson,

the sum of independent r.v.'s with gamma distributions (sharing the same lambda) is gamma.

However, this is not true for the sums of all distributions. We used the relationship between mgf’s to show that the sum of r.v.’s with a uniform distribution is not uniform.
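A quick simulation makes the uniform case concrete: the sum of two independent Uniform(0, 1) r.v.'s has a triangular density peaking at 1, so sums pile up in the middle rather than spreading uniformly over [0, 2]:

```python
import random

# Sum of two independent Uniform(0,1) r.v.'s: the density is triangular,
# so P(0.75 <= S <= 1.25) = 7/16 = 0.4375, not the 0.25 a Uniform(0,2)
# distribution would give.
random.seed(1)
n = 100_000
sums = [random.random() + random.random() for _ in range(n)]
middle = sum(0.75 <= s <= 1.25 for s in sums) / n
print(middle)   # near 0.4375, well above 0.25
```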