
STAT 211

Handout 3 (Chapter 3): Discrete Random Variables

Random variable (r.v.): A random variable is a function from the sample space S to the real line. That is, a random variable assigns a real number to each element of S.

Discrete random variable: The possible values are isolated points along the number line. A random variable has its own sample space, the set of possible values it can take. If this set is finite or countable, the r.v. is said to be discrete.

Example 1: Define the values each random variable can take and their probabilities. These two examples will be used to demonstrate most of the properties and tools in this handout.

(i)  Consider an experiment in which each of three cars either comes to a complete stop (C) at the intersection or does not (N). Let the random variable X be the number of cars that come to a complete stop.

(ii)  Consider an experiment in which four home mortgages are classified as fixed rate (F) or variable rate (V). Let the random variable Y be the number of homes with a fixed mortgage rate.

Discrete probability distribution: A probability distribution describes the possible values and their probabilities of occurring. A discrete probability distribution is called a probability mass function (pmf), p(.), and needs to satisfy the following conditions.

·  0 ≤ p(x) = P(X=x) ≤ 1 for all x, where X is a discrete r.v.

·  Σ p(x) = 1, where the sum is taken over all possible x values.

Examples: Discrete Uniform, Bernoulli, Binomial, Hypergeometric, Negative Binomial, Geometric Distributions.

Example 1(i) (continued): Is p(x) a legitimate probability mass function (pmf) of x?

Example 1(ii) (continued):

y      p(y)
0      1/16
1      4/16
2      6/16
3      4/16
4      1/16

·  All probabilities p(y) are between 0 and 1.

·  When you sum probabilities for all possible y values, they add up to 1.

Therefore, p(y) is a legitimate probability mass function (pmf) of y.
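Both conditions can be checked mechanically. A short Python sketch, using exact fractions and the probabilities from the table above:

```python
# Verify the two pmf conditions for Example 1(ii): Y = number of
# fixed-rate mortgages, with the probabilities from the table above.
from fractions import Fraction

pmf = {0: Fraction(1, 16), 1: Fraction(4, 16), 2: Fraction(6, 16),
       3: Fraction(4, 16), 4: Fraction(1, 16)}

# Condition 1: 0 <= p(y) <= 1 for every y.
assert all(0 <= p <= 1 for p in pmf.values())

# Condition 2: the probabilities sum to exactly 1.
assert sum(pmf.values()) == 1
```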

Example 2: A pizza shop sells pizzas in four different sizes. The 1000 most recent orders for a single pizza gave the following proportions for the various sizes.

Size        12"    14"    16"    18"
Proportion  0.20   0.25   0.50   0.05

With X denoting the size of a pizza in a single-pizza order, is the table above a valid pmf of X?

Example 3: Could p(x) = x²/50 for x = 1, 2, 3, 4, 5 be the pmf of X? If not, is it possible to modify p(x) so that it becomes a valid pmf?
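One way to check Example 3 numerically, using exact fractions so no rounding hides the answer:

```python
# Example 3 check: sum p(x) = x^2/50 over x = 1, ..., 5.
from fractions import Fraction

total = sum(Fraction(x * x, 50) for x in range(1, 6))
assert total == Fraction(55, 50)   # probabilities sum to 55/50 > 1, not a valid pmf

# Dividing by the correct total gives a valid pmf: p(x) = x^2/55.
pmf = {x: Fraction(x * x, 55) for x in range(1, 6)}
assert sum(pmf.values()) == 1
```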

Cumulative Distribution Function (CDF): F(x) = P(X ≤ x) = Σ p(y), summed over all possible values y ≤ x.

F(-∞) = 0, F(∞) = 1, and P(a ≤ X ≤ b) = F(b) - F(a-1) where a and b are integers.

Probabilistic properties of a discrete random variable, X:

P(X ≤ a) + P(X > a) = 1 where a is a constant integer. Then P(X > a) = 1 - P(X ≤ a)

P(X ≥ a) = 1 - P(X < a) = 1 - P(X ≤ a-1) = 1 - F(a-1)

P(a < X < b) = P(X < b) - P(X ≤ a) = P(X ≤ b-1) - P(X ≤ a) = F(b-1) - F(a) where b is also a constant integer.

P(a ≤ X ≤ b) = P(X ≤ b) - P(X < a) = P(X ≤ b) - P(X ≤ a-1) = F(b) - F(a-1)

Example 1(i) (continued): For the cars coming to a complete stop example, let's write the cumulative distribution function of X.

Example 1(ii) (continued): For the home mortgages example, the cumulative distribution function of Y is

F(0-)=P(Y<0)=0

F(0)=P(Y≤0)=p(0)=1/16

F(1)=P(Y≤1)=p(0)+p(1)=F(0)+p(1)=5/16

F(2)=P(Y≤2)=p(0)+p(1)+p(2)=F(1)+p(2)=11/16

F(3)=P(Y≤3)=p(0)+p(1)+p(2)+p(3)=F(2)+p(3)=15/16

F(4)=P(Y≤4)=p(0)+p(1)+p(2)+p(3)+p(4)=F(3)+p(4)=16/16=1

P(Y>2)=p(3)+p(4)=5/16, or 1-P(Y≤2)=1-F(2)=1-p(0)-p(1)-p(2)=5/16

P(Y≥2)=p(2)+p(3)+p(4)=11/16, or 1-P(Y<2)=1-F(1)=1-p(0)-p(1)=11/16

P(1<Y<3)=p(2)=6/16, or P(Y<3)-P(Y≤1)=F(2)-F(1)=(p(0)+p(1)+p(2))-(p(0)+p(1))=6/16

P(1≤Y≤3)=p(1)+p(2)+p(3)=14/16, or P(Y≤3)-P(Y<1)=F(3)-F(0)=(p(0)+p(1)+p(2)+p(3))-p(0)=14/16
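The CDF arithmetic above can be reproduced with a small helper function; the pmf values are taken from the Example 1(ii) table:

```python
# Build F(y) = P(Y <= y) from the pmf of Example 1(ii) and reproduce
# the interval probabilities computed above.
from fractions import Fraction

pmf = {0: Fraction(1, 16), 1: Fraction(4, 16), 2: Fraction(6, 16),
       3: Fraction(4, 16), 4: Fraction(1, 16)}

def F(y):
    # CDF: sum p(v) over all values v <= y.
    return sum(p for v, p in pmf.items() if v <= y)

assert F(4) == 1
assert 1 - F(2) == Fraction(5, 16)       # P(Y > 2)
assert F(2) - F(1) == Fraction(6, 16)    # P(1 < Y < 3)
assert F(3) - F(0) == Fraction(14, 16)   # P(1 <= Y <= 3)
```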

Example 2 (continued): For the pizza example, the cumulative distribution function of X is

F(12-)=P(X<12)=0

F(12)=P(X≤12)=0.20

F(14)=P(X≤14)=0.45

F(16)=P(X≤16)=0.95

F(18)=P(X≤18)=1

P(12 < X ≤ 16) = F(16) - F(12) = 0.95 - 0.20 = 0.75, or P(X=14) + P(X=16) = 0.25 + 0.50 = 0.75

The expected value of a random variable X: μ = E(X), the point at which the population distribution of X is centered.

Expected value of a discrete random variable X: the weighted average of the possible values. The expected value of the random variable X is E(X) = Σ x·p(x), where the sum is over all possible values x.

Rules of expected value:

(i)  For any constant a and the random variable X, E(a×X) = a×E(X)

(ii)  For any constant b, E(b) = b

(iii)  For any constants a and b and the random variable X, E(a×X ± b) = a×E(X) ± b

Example 1(i) (continued): Using the cars coming to a complete stop example, determine the expected number of cars that come to a complete stop.

Example 1(ii) (continued): Using the home mortgages example, determine the expected number of homes with a fixed mortgage rate.

E(Y) = 0(1/16) + 1(4/16) + 2(6/16) + 3(4/16) + 4(1/16) = 32/16 = 2

On average, 2 homes are expected to have a fixed mortgage rate.

Example 2 (continued): Using the pizza example, show that the expected size of a pizza is approximately 14.8".

E(X) = 12(0.20) + 14(0.25) + 16(0.50) + 18(0.05) = 14.8

On average, a 14.8" pizza is expected to be ordered.

If the company wants to sell pizzas twice the size, we can write the random variable Y as 2X; the pmf of Y is then

y      24     28     32     36
p(y)   0.20   0.25   0.50   0.05

E(Y) = 2·E(X) = 2(14.8) = 29.6

What is the approximate probability that X is within 2" of its mean value?

P(12.8 ≤ X ≤ 16.8)=P(X=14)+P(X=16)=0.25+0.50=0.75
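The pizza mean and the rule E(2X) = 2·E(X) can be verified in a few lines of Python, using the pmf from Example 2:

```python
# Expected pizza size: E(X) = sum of x * p(x) over the pmf from Example 2.
pmf = {12: 0.20, 14: 0.25, 16: 0.50, 18: 0.05}

mean = sum(x * p for x, p in pmf.items())
assert round(mean, 6) == 14.8

# Rule (i) of expected value: E(2X) = 2 * E(X) = 29.6 (the doubled-size pizza Y).
mean_y = sum(2 * x * p for x, p in pmf.items())
assert round(mean_y, 6) == 29.6
```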

The variance of a random variable X: a measure of dispersion. Variance of the random variable X: σ² = Var(X) = E[(X-μ)²], the variability in the population distribution of X.

The standard deviation of a random variable X: σ = √Var(X)

Variance of a discrete random variable X: Var(X) = Σ (x-μ)²·p(x), or the suggested shortcut Var(X) = E(X²) - μ², where E(X²) = Σ x²·p(x).

If h(X) is a function of the random variable X,

Var(h(X)) = Σ [h(x) - E(h(X))]²·p(x), where E(h(X)) = Σ h(x)·p(x).

If h(X) is a linear function of X, the rules of the mean and the variance can directly be used instead of going through the mathematics.

Rules of variance:

(i)  For any constant a and the random variable X, Var(a×X) = a²×Var(X)

(ii)  For any constant b, Var(b) = 0

(iii)  For any constants a and b and the random variable X, Var(a×X ± b) = a²×Var(X)

Example 1(i) (continued): Using the cars coming to a complete stop example, determine the variance in the number of cars that come to a complete stop.

Example 1(ii) (continued): Using the home mortgages example, what is the variance of Y?

Var(Y) = E(Y²) - [E(Y)]² = [0²(1/16) + 1²(4/16) + 2²(6/16) + 3²(4/16) + 4²(1/16)] - 2² = 80/16 - 4 = 5 - 4 = 1

Example 2 (continued): Using the pizza example, what is the variance of X?

Var(X) = E(X²) - μ² = [12²(0.20) + 14²(0.25) + 16²(0.50) + 18²(0.05)] - 14.8² = 222 - 219.04 = 2.96

and the standard deviation of X is σ = √2.96 = 1.72

If we have a new variable Y=2X, the pmf of Y is

y      24     28     32     36
p(y)   0.20   0.25   0.50   0.05

Var(Y) = Var(2X) = 2²·Var(X) = 4(2.96) = 11.84
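The shortcut Var(X) = E(X²) - μ² and the rule Var(2X) = 2²·Var(X) can both be checked numerically for the pizza example:

```python
# Variance of the pizza size X via the shortcut Var(X) = E(X^2) - mu^2,
# plus the variance rule Var(2X) = 2^2 * Var(X).
import math

pmf = {12: 0.20, 14: 0.25, 16: 0.50, 18: 0.05}

mu = sum(x * p for x, p in pmf.items())                   # E(X) = 14.8
var = sum(x * x * p for x, p in pmf.items()) - mu ** 2    # 222 - 219.04 = 2.96
assert round(var, 6) == 2.96
assert round(math.sqrt(var), 2) == 1.72                   # standard deviation
assert round(4 * var, 6) == 11.84                         # Var(2X) = 4 * Var(X)
```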

Parameter: If P(X=x) depends on a quantity that can be assigned any one of a number of possible values, with each different value determining a different probability distribution, that quantity is called a parameter of the distribution.

Bernoulli Distribution: It is based on a Bernoulli trial (an experiment with two, and only two, possible outcomes). A r.v. X has a Bernoulli(p) distribution, where p is the parameter, if

X = 1 when the trial results in a success and X = 0 when it results in a failure, where 0 ≤ p ≤ 1.

P(X=x) = p^x·(1-p)^(1-x) for x = 0, 1

Example 4: (i) Flip a coin one time. Let X be the number of tails observed. If P(heads) = 0.55, then P(tails) = 0.45. In this example, p = P(tails) = 0.45.

(ii) A single battery is tested for the viability of its charge. Let X be 1 if the battery is OK and 0 otherwise. If P(battery is OK) = 0.90, then P(battery is not OK) = 0.10. In this example, p = P(battery is OK) = 0.90.

Binomial Distribution: The exact model for a fixed number of independent success/failure trials, and an approximate probability model for sampling without replacement from a finite dichotomous population. X~Binomial(n,p).

·  n fixed trials

·  each trial is identical and results in success or failure

·  independent trials

·  the probability of success (p) is constant from trial to trial

·  X is the number of successes among n trials

P(X=x) = b(x; n, p) = C(n,x)·p^x·(1-p)^(n-x) for x = 0, 1, …, n

E(X) = n×p and Var(X) = n×p×(1-p)

Binomial Theorem: For any real numbers x and y and integer n ≥ 0, (x+y)^n = Σ C(n,k)·x^k·y^(n-k), summed over k = 0, 1, …, n.

Cumulative distribution function: B(x; n, p) = P(X ≤ x) = Σ b(y; n, p), summed over y = 0, 1, …, x.

Table A.1 demonstrates cumulative distribution function values for n=5,10,15,20,25 with different p values.

Example 5: A lopsided coin has a 70% chance of "head". It is tossed 20 times. Suppose

X: number of heads observed in 20 tosses ~ Binomial (n=20, p=0.70)

Y: number of tails observed in 20 tosses ~ Binomial (n=20, p=0.30)

Determine the following probabilities for the possible results:

a.  at least 10 heads

P(X≥10)=1-P(X<10)=1-P(X≤9)=1-0.017=0.983

or, equivalently, P(Y≤10)=0.983

b.  at most 13 heads

P(X≤13)=0.392

or, equivalently, P(Y≥7)=1-P(Y<7)=1-P(Y≤6)=1-0.608=0.392

c.  exactly 12 heads

P(X=12)=P(X≤12)-P(X≤11)=0.228-0.113=0.115

P(Y=8)=P(Y≤8)-P(Y≤7)=0.887-0.772=0.115

d.  between 8 and 14 heads (inclusive)

P(8≤X≤14)=P(X≤14)-P(X≤7)=0.584-0.001=0.583

P(6≤Y≤12)=P(Y≤12)-P(Y≤5)=0.999-0.416=0.583

e.  fewer than 9 heads

P(X<9)=P(X≤8)=0.005

P(Y>11)=1-P(Y≤11)=1-0.995=0.005
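The Table A.1 lookups in Example 5 can be reproduced (to table rounding) with a short sketch built only on the binomial pmf:

```python
# Binomial pmf b(x; n, p) = C(n, x) * p^x * (1-p)^(n-x) and its CDF,
# used to reproduce the Table A.1 lookups of Example 5 (n = 20, p = 0.70).
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def binom_cdf(x, n, p):
    return sum(binom_pmf(y, n, p) for y in range(x + 1))

n, p = 20, 0.70
assert abs((1 - binom_cdf(9, n, p)) - 0.983) < 1e-3                     # (a) at least 10 heads
assert abs(binom_cdf(13, n, p) - 0.392) < 1e-3                          # (b) at most 13 heads
assert abs((binom_cdf(12, n, p) - binom_cdf(11, n, p)) - 0.115) < 1e-3  # (c) exactly 12 heads
assert abs((binom_cdf(14, n, p) - binom_cdf(7, n, p)) - 0.583) < 1e-3   # (d) 8 to 14 heads
```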

Hypergeometric Distribution: The exact probability model for the number of successes in a sample drawn without replacement.

X~Hyper(M,N,n)

Let X be the number of successes in a random sample of size n drawn without replacement from a population of size N consisting of M successes and (N-M) failures. Then

P(X=x) = C(M,x)·C(N-M,n-x) / C(N,n)

E(X) = n·p, where p = M/N is the proportion of successes in the population.

Var(X) = n·p·(1-p)·(N-n)/(N-1), where (N-n)/(N-1) is the finite population correction factor.

Example 6: An urn is filled with 9 balls that are identical in every way except that 3 are red and 6 are green. We reach in and select 2 balls at random (the 2 balls are taken all at once, a case of sampling without replacement). What is the probability that exactly x of the balls are red?

C(9,2): Total number of samples of size 2 that can be drawn from the 9 balls.

C(3,x): Number of ways that x of the drawn balls will be red, out of 3 red balls.

C(6,2-x): Number of ways that the remaining 2-x balls will be green.

X, the number of red balls drawn in a sample of n=2 balls, has a hypergeometric distribution, and the answer is P(X=x) = C(3,x)·C(6,2-x)/C(9,2) where x = 0, 1, 2

What is the expected number of red balls?

What is the variance in the number of red balls?

Example 7: A quality-control inspector accepts shipments whenever a sample of size 5 contains no defectives, and she rejects otherwise.

a.  Determine the probability that she will accept a poor shipment of 50 items in which 20% are defective.

Let X be the number of defective items in a sample of n=5 from the poor shipment of N=50 items. Then M = 50(0.20) = 10 defective items are in the poor shipment.

P(accept shipment) = P(X=0) = C(10,0)·C(40,5)/C(50,5) = 0.3106

b.  Determine the probability that she will reject the good shipment of 100 items in which 2% are defective.

Let X be the number of defective items in a sample of n=5 from the good shipment of N=100 items. Then M = 100(0.02) = 2 defective items are in the good shipment.

P(reject shipment) = P(X≥1) = P(X=1) + P(X=2) = C(2,1)·C(98,4)/C(100,5) + C(2,2)·C(98,3)/C(100,5) = 0.098
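Both parts of Example 7 can be checked with the hypergeometric pmf and math.comb:

```python
# Hypergeometric probabilities for Example 7:
# P(X = x) = C(M, x) * C(N - M, n - x) / C(N, n).
from math import comb

def hyper_pmf(x, M, N, n):
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# (a) Accept a poor shipment: N = 50, M = 10 defectives, sample n = 5, X = 0.
p_accept = hyper_pmf(0, 10, 50, 5)
assert round(p_accept, 4) == 0.3106

# (b) Reject a good shipment: N = 100, M = 2 defectives, sample n = 5, X >= 1.
p_reject = hyper_pmf(1, 2, 100, 5) + hyper_pmf(2, 2, 100, 5)
assert round(p_reject, 3) == 0.098
```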

Negative Binomial Distribution: Binomial Distribution counts the number of successes in a fixed number of Bernoulli trials. Negative Binomial Distribution counts the number of Bernoulli trials required to get a fixed number of successes.

X ~ NegativeBinomial(r,p)

X: number of failures before the rth success

p: probability of success

r: number of successes

P(X=x) = C(x+r-1, r-1)·p^r·(1-p)^x for x = 0, 1, 2, …

E(X) = r(1-p)/p and Var(X) = r(1-p)/p²

Example 8 (Exercise 3-71, 6th edition which is Exercise 3-69, 5th edition ):

P(male birth)=0.5

A couple wishes to have exactly 2 female children in their family. They will have children until this condition is fulfilled.

(a) What is the probability that the family has x male children?

X: number of male children until they have 2 girls

p = P(female) = 0.5 and r = number of girls = 2

X~Negative binomial(r=2,p=0.5)

P(X=x) = C(x+1, 1)·(0.5)²·(0.5)^x = (x+1)(0.5)^(x+2), x = 0, 1, 2, 3, …

(b) What is the probability that the family has four children? (Answer=0.1875)

(c) What is the probability that the family has at most 4 children? (Answer=0.6875)

(d) How many male children would you expect this family to have? (Answer=2)

How many children would you expect this family to have? (Answer=4)
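The answers to (b), (c), and (d) can be reproduced from the negative binomial pmf directly:

```python
# Negative binomial probabilities for Example 8: X = number of boys before
# the 2nd girl, X ~ NegBin(r = 2, p = 0.5), with
# P(X = x) = C(x + r - 1, r - 1) * p^r * (1 - p)^x.
from math import comb

def negbin_pmf(x, r, p):
    return comb(x + r - 1, r - 1) * p ** r * (1 - p) ** x

r, p = 2, 0.5
# (b) Four children total means x = 2 boys before the 2nd girl.
assert negbin_pmf(2, r, p) == 0.1875
# (c) At most 4 children means x = 0, 1, or 2 boys.
assert sum(negbin_pmf(x, r, p) for x in range(3)) == 0.6875
# (d) E(X) = r(1-p)/p = 2 boys; expected total children = r + E(X) = 4.
assert r * (1 - p) / p == 2
```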

The Geometric Distribution is the simplest of the waiting time distributions and is a special case of the negative binomial distribution (r=1).

p: probability of success

X: the trial at which the first success occurs (waiting time for a success)

The pmf is p(x) = p(1-p)^(x-1) for x = 1, 2, 3, …, with E(X) = 1/p, Var(X) = (1-p)/p², and P(X > x) = (1-p)^x.

Need to remember that Σ r^k = 1/(1-r) (summed over k = 0, 1, 2, …) and Σ k·r^(k-1) = 1/(1-r)² (summed over k = 1, 2, 3, …) for |r| < 1.

Example 9: A series of experiments was conducted in order to reduce the proportion of cells being scrapped by a battery plant because of internal shorts. The experiments were successful in reducing the percentage of manufactured cells with internal shorts to around 1%. Suppose we are interested in the number of the test at which the first short is discovered. Find the probability that at least 50 cells are tested without finding a short.

X : the number of tests until the first short ~ Geometric(p)

p : probability of internal shorts=0.01

P(X>50) = (1-p)^50 = (1-0.01)^50 = 0.605
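As a one-line check of Example 9:

```python
# Geometric waiting time, Example 9: P(X > 50) is the chance that the
# first 50 cells all test good, i.e. (1 - p)^50 with p = 0.01.
p = 0.01
p_no_short_in_50 = (1 - p) ** 50
assert round(p_no_short_in_50, 3) == 0.605
```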

Poisson Distribution: models the number of occurrences in a fixed interval of time or space (arrivals at a bus stop, customers arriving at a bank, etc.). The probability of an arrival in a short interval is proportional to the length of the interval.

λ: rate per unit time or per unit area

X : number of occurrences in a given time period or place (example: # of parts produced/hour, or # of fractures /blade, and so on.)

Note that p(x) = P(X=x) = e^(-λ)·λ^x/x! for x = 0, 1, 2, …, that these probabilities sum to 1, and that E(X) = Var(X) = λ.

Cumulative distribution function: F(x; λ) = P(X ≤ x) = Σ e^(-λ)·λ^y/y!, summed over y = 0, 1, …, x.

Table A.2 demonstrates cumulative distribution function values for different λ values.

Example 10: The only bank teller on duty at a local bank needs to run out for 10 minutes but he does not want to miss any customers. There are usually 2 customers per hour coming to the bank.

(a)  What is the probability that no one will arrive in the next 10 minutes? (Answer=0.7165)

(b)  What is the probability that two or more people will arrive in the next 10 minutes? (Answer=0.0446)
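Both answers in Example 10 follow from the Poisson pmf with the rate rescaled to a 10-minute window:

```python
# Poisson probabilities for Example 10: 2 customers/hour gives
# lambda = 2 * (10/60) = 1/3 for a 10-minute window.
from math import exp, factorial

def pois_pmf(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

lam = 2 * (10 / 60)
assert round(pois_pmf(0, lam), 4) == 0.7165             # (a) no arrivals
p_two_or_more = 1 - pois_pmf(0, lam) - pois_pmf(1, lam)
assert round(p_two_or_more, 4) == 0.0446                # (b) two or more arrivals
```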

Example 11: Transmission line interruptions in a telecommunications network occur at an average rate of 1 per day.

Let X be the number of line interruptions in t days

E(X) = λ = 1(t) = t interruptions in t days

Find the probability that the line experiences

a.  no interruptions in 5 days

P(X=0) = e^(-5)·5^0/0! = e^(-5) = 0.0067