JOINT PROBABILITY DISTRIBUTION
Let X and Y be two discrete random variables.
f(x, y) - joint probability distribution of X and Y
- the probability distribution of the simultaneous occurrence of the values x and y; i.e., f(x, y) = P(X = x, Y = y)
- gives the probability that the outcomes x and y occur at the same time
- For example,
Let X - age to the nearest year of a TV set that is to be repaired
    Y - number of defective tubes in the set
f(x, y) = f(5, 3) = probability that the TV set is 5 years old and needs 3 new tubes
Characteristics of a Joint Probability Distribution
- f(x, y) ≥ 0 for all (x, y)
- Σ_x Σ_y f(x, y) = 1 (add up the probabilities of all possible combinations of x and y within the range)
- f(x, y) = P(X = x, Y = y)
- For any region A in the xy plane, P[(X, Y) ∈ A] = Σ_{(x, y) ∈ A} f(x, y)
Example 1:
Two refills for a ballpoint pen are selected at random from a box containing 3 blue refills, 2 red refills and 3 green refills. If X is the number of blue refills and Y is the number of red refills selected, find
- the joint probability distribution function f(x, y)
- P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}
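The counting behind Example 1 can be checked directly: under random selection without replacement, f(x, y) = C(3, x) C(2, y) C(3, 2 − x − y) / C(8, 2). The Python snippet below is a sketch added to these notes (not part of the original example); it builds the pmf, verifies the probabilities sum to 1, and evaluates P[(X, Y) ∈ A] for A = {(x, y) | x + y ≤ 1}.

from fractions import Fraction
from math import comb

# Example 1: two refills drawn at random from 3 blue, 2 red, 3 green (8 total).
# X = number of blue refills selected, Y = number of red refills selected.
total = comb(8, 2)  # 28 equally likely pairs

f = {}
for x in range(3):
    for y in range(3):
        g = 2 - x - y                     # number of green refills selected
        if g < 0:
            continue
        f[(x, y)] = Fraction(comb(3, x) * comb(2, y) * comb(3, g), total)

# Characteristic check: summing f(x, y) over all (x, y) gives 1.
assert sum(f.values()) == 1

# P[(X, Y) in A] with A = {(x, y) : x + y <= 1}
p_A = sum(p for (x, y), p in f.items() if x + y <= 1)
print(p_A)  # 9/14

Working the table by hand gives the same values, e.g. f(0, 0) = 3/28 and f(1, 0) = 9/28, and P[(X, Y) ∈ A] = f(0, 0) + f(0, 1) + f(1, 0) = 9/14.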
JOINT DENSITY FUNCTION
Joint Density Function – joint distribution of continuous random variables
Characteristics of a Joint Density Function
- f(x, y) ≥ 0 for all (x, y)
- ∫∫ f(x, y) dx dy = 1, where the double integral is taken over the entire xy plane
- P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A in the xy plane
Note: f(x, y) describes a surface lying above the xy plane.
Probability = volume of the right cylinder bounded by the base A and the surface f(x, y)
Example 2:
A candy company distributes boxes of chocolates with a mixture of creams, toffees and nuts coated in both light and dark chocolate. For a randomly selected box, let X and Y, respectively, be the proportions of the light and dark chocolates that are creams, and suppose that the joint density function is given by:
f(x, y) = k(2x + 3y),   0 ≤ x ≤ 1, 0 ≤ y ≤ 1
        = 0,            elsewhere
Find P[(X, Y) ∈ A] where A is the region {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}
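One way to work Example 2 (a symbolic sketch, assuming Python with the sympy library; not part of the original notes): determine k from the requirement that the density integrate to 1 over the unit square, then integrate the density over the region A.

import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
f = k * (2*x + 3*y)

# k is fixed by requiring the density to integrate to 1 over 0 <= x <= 1, 0 <= y <= 1.
k_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 1)) - 1, k)[0]
print(k_val)  # 2/5

# P[(X, Y) in A] with A = {(x, y) : 0 < x < 1/2, 1/4 < y < 1/2}
p_A = sp.integrate(f.subs(k, k_val),
                   (x, 0, sp.Rational(1, 2)),
                   (y, sp.Rational(1, 4), sp.Rational(1, 2)))
print(p_A)  # 13/160

This gives k = 2/5 and P[(X, Y) ∈ A] = 13/160.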
NOTE: For the discrete case, P(X = x, Y = y) = f(x, y)
e.g., P(X = 2, Y = 1) = f(2, 1)
For the continuous case, P(X = x, Y = y) ≠ f(x, y); in fact, P(X = x, Y = y) = 0 for any single point (x, y)
MARGINAL DISTRIBUTIONS
Given the joint probability distribution f(x, y) of the discrete random variables X and Y, the probability distribution g(x) of X alone is obtained by summing f(x, y) over the values of y. Similarly, the probability distribution h(y) of Y alone is obtained by summing f(x, y) over the values of x. g(x) and h(y) are defined to be the marginal distributions of X and Y, respectively.
g(x) = Σ_y f(x, y)        h(y) = Σ_x f(x, y)        for the discrete case
g(x) = ∫ f(x, y) dy       h(y) = ∫ f(x, y) dx       for the continuous case (integrating over all values of the other variable)
Example 3:
Derive g(x) and h(y) for Example 1.
Example 4:
Derive g(x) and h(y) for the joint density function in Example 2.
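Examples 3 and 4 can be worked mechanically, as the sketch below illustrates (an addition to these notes; it reuses the Example 1 counting formula and the value k = 2/5 obtained in the sketch after Example 2): sum over the other variable in the discrete case, integrate it out in the continuous case.

from fractions import Fraction
from math import comb
import sympy as sp

# Example 3 (discrete): marginals of the Example 1 joint pmf.
f = {(x, y): Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}
g = {x: sum(p for (xx, _), p in f.items() if xx == x) for x in range(3)}  # g(x) = sum over y of f(x, y)
h = {y: sum(p for (_, yy), p in f.items() if yy == y) for y in range(3)}  # h(y) = sum over x of f(x, y)
print(g)  # g(0) = 5/14, g(1) = 15/28, g(2) = 3/28
print(h)  # h(0) = 15/28, h(1) = 3/7, h(2) = 1/28

# Example 4 (continuous): marginals of f(x, y) = (2/5)(2x + 3y), 0 <= x, y <= 1.
x, y = sp.symbols('x y')
fxy = sp.Rational(2, 5) * (2*x + 3*y)
g_x = sp.integrate(fxy, (y, 0, 1))  # integrate y out: g(x) = (4x + 3)/5
h_y = sp.integrate(fxy, (x, 0, 1))  # integrate x out: h(y) = (6y + 2)/5
print(sp.expand(g_x), sp.expand(h_y))

For Example 4 this yields g(x) = (4x + 3)/5 for 0 ≤ x ≤ 1 and h(y) = (6y + 2)/5 for 0 ≤ y ≤ 1.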
CONDITIONAL DISTRIBUTIONS
Recall: Conditional Probability Formula
P(B | A) = P(A ∩ B) / P(A)
Consider 2 random variables X and Y:
If we let A be the event defined by X = x and B be the event that Y = y, we have,
P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)
                 = f(x, y) / g(x),    g(x) > 0
where X and Y are discrete random variables
P(Y = y | X = x) may actually be expressed as a probability distribution denoted by f(y | x). Therefore, f(y | x) is called the conditional distribution of the random variable Y given that X = x.
Generalization
Let X and Y be two random variables, discrete or continuous. The conditional probability distribution of the random variable Y given that X = x, is given by
f(y | x) = f(x, y) / g(x),    g(x) > 0
(a pure function of y)
Similarly, the conditional probability distribution of the random variable X given that Y = y, is given by
f(x | y) = f(x, y) / h(y),    h(y) > 0
(a pure function of x)
Note: f(x | y) only gives P(X = x | Y = y). If one wishes to find the probability that the discrete random variable X falls between a and b when it is known that Y = y, then we evaluate
P(a < X < b | Y = y) = Σ_{a < x < b} f(x | y)
Similarly,
P(a < Y < b | X = x) = Σ_{a < y < b} f(y | x)
For the continuous case:
P(a < X < b | Y = y) = ∫_a^b f(x | y) dx
P(a < Y < b | X = x) = ∫_a^b f(y | x) dy
Example 5:
Find the conditional probability distribution of X, given that Y = 1, for Example 1 and use it to evaluate P(X = 0 | Y = 1).
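A numerical sketch of Example 5 (added here, not part of the original notes): rebuild the Example 1 pmf, compute the marginal value h(1), and divide to obtain the conditional distribution f(x | 1).

from fractions import Fraction
from math import comb

# Example 1 joint pmf, rebuilt so this sketch stands alone.
f = {(x, y): Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
     for x in range(3) for y in range(3) if x + y <= 2}

# Marginal of Y at y = 1: h(1) = sum over x of f(x, 1)
h1 = sum(p for (x, y), p in f.items() if y == 1)  # 3/7

# Conditional distribution of X given Y = 1: f(x | 1) = f(x, 1) / h(1)
f_given_1 = {x: f.get((x, 1), Fraction(0)) / h1 for x in range(3)}
print(f_given_1)     # f(0|1) = 1/2, f(1|1) = 1/2, f(2|1) = 0
print(f_given_1[0])  # P(X = 0 | Y = 1) = 1/2

Hence f(0 | 1) = f(1 | 1) = 1/2, f(2 | 1) = 0, and P(X = 0 | Y = 1) = 1/2.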
STATISTICAL INDEPENDENCE
Recall: P(B | A) = P(A ∩ B) / P(A)
P(A ∩ B) = P(A) * P(B | A)
P(A ∩ B) = P(A) * P(B)    if A and B are statistically independent
Similarly,
f(y | x) = f(x, y) / g(x)
f(x, y) = g(x) * f(y | x)
f(x, y) = g(x) * h(y)    if X and Y are statistically independent
OR: f(y | x) = f(x, y) / g(x)
f(x, y) = g(x) * f(y | x)
h(y) = ∫ f(x, y) dx = ∫ g(x) * f(y | x) dx
If X and Y are independent, f(y | x) is a pure function of y (it does not depend on x), so it can be taken outside the integral:
h(y) = f(y | x) ∫ g(x) dx = f(y | x),    since ∫ g(x) dx = 1
h(y) = f(x, y) / g(x)
f(x, y) = g(x) * h(y)
Let X and Y be two random variables, discrete or continuous, with joint probability distribution f(x, y) and marginal distributions g(x) and h(y), respectively. The random variables X and Y are said to be statistically independent if and only if
f(x, y) = g(x) * h(y)    for all (x, y) within their range
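As an illustration of the definition (a sketch added to these notes, using the Example 2 density with k = 2/5), independence can be tested symbolically: if X and Y were independent, f(x, y) − g(x) * h(y) would simplify to zero everywhere on the unit square.

import sympy as sp

x, y = sp.symbols('x y')
fxy = sp.Rational(2, 5) * (2*x + 3*y)   # Example 2 joint density on 0 <= x, y <= 1
g_x = sp.integrate(fxy, (y, 0, 1))      # marginal of X: (4x + 3)/5
h_y = sp.integrate(fxy, (x, 0, 1))      # marginal of Y: (6y + 2)/5

# X and Y are independent iff f(x, y) = g(x) * h(y) for all (x, y) in the range.
diff = sp.simplify(fxy - g_x * h_y)
print(diff)            # not identically zero, so X and Y are NOT independent
print(diff.equals(0))  # False

The difference is not identically zero (it equals −6/25 at (0, 0)), so X and Y in Example 2 are not statistically independent. The same check on Example 1 also fails, since f(0, 0) = 3/28 while g(0) * h(0) = (5/14)(15/28) = 75/392.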