251prob 10/27/04 (Open this document in 'Outline' view!)

H. Introduction to Probability

1. Experiments and Probability

Define a random experiment, a sample space, an outcome, and a basic outcome.

a. Definition and rules for Statistical Probability.

(i) If A is impossible, P(A) = 0.

(ii) If A is certain, P(A) = 1.

(iii) For any outcome A, 0 ≤ P(A) ≤ 1.

(iv) If E1, E2, ..., En represent all possible basic outcomes and are mutually exclusive, then P(E1) + P(E2) + ... + P(En) = 1.

b. An Event.

c. Symmetrical, Statistical and Subjective Probability.

2. The Venn Diagram.

A diagram representing events as sets of points or ‘puddles.’

a. The Addition Rule. P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

(i) Meaning of Union (or).

P(A ∪ B) means the probability of A occurring or of B occurring or both. It always includes P(A ∩ B) if it exists.

(ii) Meaning of Intersection (and).

P(A ∩ B) means the probability of both A and B occurring. Note that P(A ∩ B) = 0 if A and B are mutually exclusive.

Diagram for dice problems.
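
As a quick illustration, the Python sketch below checks the Addition Rule on the 36-outcome sample space for two dice (the events A and B are chosen arbitrarily for the example, not taken from a course problem):

    from itertools import product

    # Sample space: all 36 equally likely (die 1, die 2) outcomes.
    sample_space = list(product(range(1, 7), repeat=2))

    def prob(event):
        """Statistical probability of an event, a set of basic outcomes."""
        return len(event) / len(sample_space)

    A = {o for o in sample_space if sum(o) == 7}   # A: the total is 7
    B = {o for o in sample_space if o[0] == 6}     # B: the first die shows 6

    # Addition Rule: P(A or B) = P(A) + P(B) - P(A and B)
    print(prob(A | B))                             # 11/36
    print(prob(A) + prob(B) - prob(A & B))         # also 11/36
    print(prob(set(sample_space)))                 # the certain event: 1.0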

b. Meaning of Complement. P(Ā) = 1 − P(A).

This event can be called Ā or 'not A.' Note that P(A ∪ B) = 1 if A and B are collectively exhaustive. If A and B are both collectively exhaustive and mutually exclusive, then B is the complement of A, so P(B) = P(Ā) = 1 − P(A).

c. Extended Addition Rule. P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

3. Conditional and Joint Probability.

a. The Multiplication Rule. P(A ∩ B) = P(A|B)P(B) or P(A|B) = P(A ∩ B)/P(B).

The conditional probability of A given B, P(A|B), is the probability of event A occurring assuming that event B has occurred.

b. A Joint Probability Table.

What is the difference between joint, marginal and conditional probabilities? Remember that we cannot read a conditional probability directly from a joint probability table but must compute it using the second version of the Multiplication Rule, P(A|B) = P(A ∩ B)/P(B).
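
The sketch below builds a small joint probability table in Python (the four joint probabilities are invented for illustration): marginal probabilities come from adding across a row or down a column, while a conditional probability has to be computed with the Multiplication Rule.

    # Joint probabilities for events A/not-A and B/not-B (made-up numbers).
    P = {("A", "B"): 0.20, ("A", "notB"): 0.30,
         ("notA", "B"): 0.10, ("notA", "notB"): 0.40}

    # Marginal probabilities: add across a row or down a column.
    P_A = P[("A", "B")] + P[("A", "notB")]     # 0.50
    P_B = P[("A", "B")] + P[("notA", "B")]     # 0.30

    # A conditional probability is not in the table; use P(A|B) = P(A and B)/P(B).
    P_A_given_B = P[("A", "B")] / P_B          # 0.20/0.30 ≈ 0.667

    print(P_A, P_B, P_A_given_B)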

c. Extended Multiplication Rule. P(A ∩ B ∩ C) = P(A)P(B|A)P(C|A ∩ B).

d. Bayes' Rule. P(A|B) = P(B|A)P(A)/P(B).
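
A minimal sketch of Bayes' Rule using an invented screening-test example (the 1%, 95% and 5% figures are assumptions made only for the illustration):

    P_D = 0.01                   # P(D): prior probability of the condition
    P_pos_given_D = 0.95         # P(+|D)
    P_pos_given_notD = 0.05      # P(+|not D)

    # P(+) = P(+|D)P(D) + P(+|not D)P(not D)
    P_pos = P_pos_given_D * P_D + P_pos_given_notD * (1 - P_D)

    # Bayes' Rule: P(D|+) = P(+|D)P(D)/P(+)
    P_D_given_pos = P_pos_given_D * P_D / P_pos
    print(round(P_D_given_pos, 4))   # ≈ 0.161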

4. Statistical Independence.

a. Definition: A and B are independent if P(A|B) = P(A) (equivalently, P(B|A) = P(B)).

b. Consequence: if A and B are independent, P(A ∩ B) = P(A)P(B).

c. Consequence: If A and B are independent, so are A and B̄, Ā and B, and Ā and B̄, etc.
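
A short check of consequence (b) on the two-dice sample space (again, the events are chosen only for illustration):

    from itertools import product

    sample_space = list(product(range(1, 7), repeat=2))
    prob = lambda event: len(event) / len(sample_space)

    A = {o for o in sample_space if o[0] % 2 == 0}   # first die is even
    B = {o for o in sample_space if o[1] >= 5}       # second die is 5 or 6

    # Independent events satisfy P(A and B) = P(A)P(B).
    print(prob(A & B), prob(A) * prob(B))            # both 1/6 ≈ 0.1667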

5. Review.

Rule / In General / A and B mutually exclusive / A and B independent
Multiplication / P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A) / P(A ∩ B) = 0 / P(A ∩ B) = P(A)P(B)
Addition / P(A ∪ B) = P(A) + P(B) − P(A ∩ B) / P(A ∪ B) = P(A) + P(B) / P(A ∪ B) = P(A) + P(B) − P(A)P(B)
Bayes' Rule / P(A|B) = P(B|A)P(A)/P(B) / P(A|B) = 0 / P(A|B) = P(A)
Bayes' Rule / P(B|A) = P(A|B)P(B)/P(A) / P(B|A) = 0 / P(B|A) = P(B)

I. Permutations and Combinations.

1. Counting Rule for Outcomes.

a. If an experiment has k steps and there are n1 possible outcomes on the first step, n2 possible outcomes on the second step, etc., up to nk possible outcomes on the kth step, then the total number of possible outcomes is the product n1 × n2 × ... × nk.

b. Consequence. If there are exactly n outcomes at each step, the total number of possible outcomes from k steps is n^k.
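
For example (numbers assumed), a meal built in 3 steps with 3 appetizers, 4 entrees and 2 desserts can come out 3 × 4 × 2 = 24 ways, and 10 tosses of a coin (n = 2, k = 10) can come out 2^10 = 1024 ways:

    import math

    print(math.prod([3, 4, 2]))   # 24 possible meals
    print(2 ** 10)                # 1024 possible sequences of 10 tosses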

2. Permutations.

a. The number of ways that one can arrange n objects: n! = n(n − 1)(n − 2)...(2)(1). More generally, the number of ways to arrange r objects chosen from n is P(n,r) = n!/(n − r)!.

b. Order counts!

3. Combinations.

a. Order doesn't count! The number of combinations of r objects chosen from n is C(n,r) = n!/[r!(n − r)!].

b. Probability of getting a given combination

This is the number of ways of getting the specified combination divided by the total number of possible combinations. If there are x equally likely ways to get what you want out of n equally likely possible outcomes, the probability of getting the outcome you want is x/n. Example: There is only C(4,4) = 1 way to get 4 jacks from the 4 jacks in the deck and C(48,1) = 48 ways to get the fifth card, so there are 1 × 48 = 48 hands containing 4 jacks. The number of possible poker hands of 5 cards is C(52,5) = 2,598,960, so the probability of getting a poker hand with 4 jacks is 48/2,598,960 ≈ 0.0000185.
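
The same calculation can be checked with Python's math.comb, which computes C(n, r):

    from math import comb

    ways_jacks = comb(4, 4)        # 1 way to take all four jacks
    ways_fifth = comb(48, 1)       # 48 ways to pick the remaining card
    total_hands = comb(52, 5)      # 2,598,960 possible 5-card hands

    p = ways_jacks * ways_fifth / total_hands
    print(total_hands, p)          # 2598960 and about 0.0000185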

J. Random Variables.

1. Definitions.

Discrete and Continuous Random Variables. Finite and infinite populations. Sampling with replacement.

2. Probability Distribution of a Discrete Random Variable.

By this we mean either a table with each possible value of a random variable and the probability of each value (these probabilities had better add to one!) or a formula that will give us these results. We can still speak of Relative Frequency and define the Cumulative Probability up to a point as the probability up to that point, i.e. F(x*) = P(x ≤ x*) = ΣP(x) for all x ≤ x*.
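
A small sketch of such a table (the values and probabilities are made up for illustration), with the cumulative probability found by adding up P(x) for all x at or below the point:

    x_values = [0, 1, 2, 3]
    p_values = [0.1, 0.3, 0.4, 0.2]          # these had better add to one!
    assert abs(sum(p_values) - 1) < 1e-12

    def F(x_star):
        """Cumulative probability F(x*) = P(x <= x*)."""
        return sum(p for x, p in zip(x_values, p_values) if x <= x_star)

    print(F(1), F(3))                        # 0.4 and 1.0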

3. Expected Value (Expectation) of a Discrete Random Variable. E(x) = μ = ΣxP(x).

Rules for linear functions of a random variable:

a and b are constants. x is a random variable.

a. E(b) = b

b. E(ax) = aE(x)

c. E(x + b) = E(x) + b

d. E(ax + b) = aE(x) + b

4. Variance of a Discrete Random Variable. Var(x) = σ² = E[(x − μ)²] = Σ(x − μ)²P(x) = E(x²) − μ².

Rules for linear functions of a random variable:

a. Var(b) = 0

b. Var(ax) = a²Var(x)

c. Var(x + b) = Var(x)

d. Var(ax + b) = a²Var(x)

Example -- see 251probex1.
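
A sketch in the same spirit as 251probex1 (but with its own made-up distribution), computing the mean and variance from the table and checking the linear-function rules:

    x_values = [0, 1, 2, 3]
    p_values = [0.1, 0.3, 0.4, 0.2]

    # E(x) = sum of x P(x);  Var(x) = sum of (x - mu)^2 P(x)
    mu = sum(x * p for x, p in zip(x_values, p_values))
    var = sum((x - mu) ** 2 * p for x, p in zip(x_values, p_values))
    print(mu, var)                       # 1.7 and 0.81

    # y = ax + b with a = 2, b = 5: E(y) = aE(x) + b, Var(y) = a^2 Var(x)
    a, b = 2, 5
    y_values = [a * x + b for x in x_values]
    mu_y = sum(y * p for y, p in zip(y_values, p_values))
    var_y = sum((y - mu_y) ** 2 * p for y, p in zip(y_values, p_values))
    print(mu_y, a * mu + b)              # both 8.4
    print(var_y, a ** 2 * var)           # both 3.24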

5. Summary

a. Rules for Means and Variances of Functions of Random Variables. See 251probex4.

b. Standardized Random Variables, z = (x − μ)/σ. See 251probex2.
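
Continuing the made-up distribution from the sketch above, standardizing x gives a new random variable with mean 0 and standard deviation 1:

    x_values = [0, 1, 2, 3]
    p_values = [0.1, 0.3, 0.4, 0.2]
    mu, sigma = 1.7, 0.81 ** 0.5              # mean and standard deviation from above

    z_values = [(x - mu) / sigma for x in x_values]
    mu_z = sum(z * p for z, p in zip(z_values, p_values))
    var_z = sum((z - mu_z) ** 2 * p for z, p in zip(z_values, p_values))
    print(round(mu_z, 10), round(var_z, 10))  # 0.0 and 1.0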

6. Continuous Random Variables.

a. Normal Distribution (Overview).

The general formula is f(x) = [1/(σ√(2π))]e^(−(x − μ)²/(2σ²)). Don't try to memorize or even use this formula. It is much more important to remember what the normal curve looks like.

b. The Continuous Uniform Distribution.

f(x) = 1/(d − c) for c ≤ x ≤ d and f(x) = 0 otherwise.

μ = (c + d)/2 and σ² = (d − c)²/12.

To find probabilities under this distribution, go to 251probex3.
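
A quick sketch (with c = 2 and d = 10 assumed for illustration): probabilities under the uniform distribution are just areas of rectangles of height 1/(d − c).

    c, d = 2.0, 10.0
    height = 1 / (d - c)

    def P_between(a, b):
        """P(a <= x <= b) for the uniform distribution on [c, d]."""
        lo, hi = max(a, c), min(b, d)
        return max(hi - lo, 0) * height

    print(P_between(3, 7))                  # (7 - 3)/8 = 0.5
    print((c + d) / 2, (d - c) ** 2 / 12)   # mean 6.0, variance about 5.33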

c. Cumulative Distributions, Means and Variances for Continuous Distributions.

Discrete Distributions / Continuous Distributions
Cumulative Function / F(x*) = P(x ≤ x*) = ΣP(x) for all x ≤ x* / F(x*) = P(x ≤ x*) = ∫f(x)dx from −∞ to x*
Mean / μ = E(x) = ΣxP(x) / μ = E(x) = ∫xf(x)dx over all x
Variance / σ² = E[(x − μ)²] = Σ(x − μ)²P(x) = E(x²) − μ² / σ² = E[(x − μ)²] = ∫(x − μ)²f(x)dx over all x = E(x²) − μ²

Example: For the Continuous Uniform Distribution, (i) F(x*) = (x* − c)/(d − c) for c ≤ x* ≤ d (F(x*) = 0 below c and F(x*) = 1 above d), (ii) μ = (c + d)/2 and (iii) σ² = (d − c)²/12.

The proofs below are intended only for those who have had calculus!

Proof: (i) F(x*) = ∫[1/(d − c)]dx from c to x* = (x* − c)/(d − c) for c ≤ x* ≤ d.

(ii) μ = ∫[x/(d − c)]dx from c to d = (d² − c²)/[2(d − c)] = (c + d)/2.

(iii) E(x²) = ∫[x²/(d − c)]dx from c to d = (d³ − c³)/[3(d − c)] = (c² + cd + d²)/3, so σ² = E(x²) − μ² = (c² + cd + d²)/3 − (c + d)²/4 = (d − c)²/12.

d. Chebyshev's Inequality Again.

P(|x − μ| ≥ kσ) ≤ 1/k², equivalently P(|x − μ| < kσ) ≥ 1 − 1/k², and don't forget the Empirical Rule.

Proof and extensions
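
A numerical sketch comparing the Chebyshev bound with the actual probability for the uniform distribution on [2, 10] used above (μ = 6, σ² = 16/3); the actual probability within k standard deviations is always at least as large as the bound:

    mu, sigma = 6.0, (16 / 3) ** 0.5
    c, d, height = 2.0, 10.0, 1 / 8

    for k in (1.5, 2, 3):
        # Actual P(|x - mu| < k sigma): length of the overlap of
        # [mu - k sigma, mu + k sigma] with [c, d], times the density.
        lo, hi = max(mu - k * sigma, c), min(mu + k * sigma, d)
        actual = (hi - lo) * height
        bound = 1 - 1 / k ** 2          # Chebyshev guarantees at least this
        print(k, round(actual, 4), round(bound, 4))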

7. Skewness and Kurtosis (Short Summary).

Skewness: μ₃ = E[(x − μ)³]. For a discrete distribution, this means μ₃ = Σ(x − μ)³P(x), and, for a continuous distribution, μ₃ = ∫(x − μ)³f(x)dx.

Relative Skewness: γ₁ = μ₃/σ³.

Kurtosis: μ₄ = E[(x − μ)⁴]. For the Normal distribution, μ₄ = 3σ⁴.

Coefficient of Excess: γ₂ = μ₄/σ⁴ − 3.
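
A short sketch computing these measures for the small made-up discrete distribution used in the earlier sketches:

    x_values = [0, 1, 2, 3]
    p_values = [0.1, 0.3, 0.4, 0.2]

    mu = sum(x * p for x, p in zip(x_values, p_values))
    moment = lambda k: sum((x - mu) ** k * p for x, p in zip(x_values, p_values))

    sigma2 = moment(2)              # variance
    mu3 = moment(3)                 # skewness (third central moment)
    mu4 = moment(4)                 # kurtosis (fourth central moment)

    g1 = mu3 / sigma2 ** 1.5        # relative skewness
    g2 = mu4 / sigma2 ** 2 - 3      # coefficient of excess (0 for the Normal)
    print(round(mu3, 4), round(g1, 4), round(mu4, 4), round(g2, 4))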
