2011-09-27 Multivariate Gaussian Regression

General formulas for marginal means and variances:

$$E(Y_2) = E\{E(Y_2 \mid Y_1)\}, \qquad \operatorname{Var}(Y_2) = E\{\operatorname{Var}(Y_2 \mid Y_1)\} + \operatorname{Var}\{E(Y_2 \mid Y_1)\}$$

Formulas for conditional means and variances, multivariate Gaussian:

If

$$\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix} \sim N\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix} \right)$$

then

$$Y_2 \mid Y_1 \sim N(\mu_{2\cdot 1}, \Sigma_{2\cdot 1})$$

where

$$\mu_{2\cdot 1} = \mu_2 + \Sigma_{21}\Sigma_{11}^{-1}(Y_1 - \mu_1) \qquad \text{IMPORTANT FORMULA}$$

and

$$\Sigma_{2\cdot 1} = \Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12} \qquad \text{IMPORTANT FORMULA}$$

See George Seber, Linear Regression Analysis, Exercise 2c #2, and C. R. Rao, Linear Models, problems 2.7 and 2.9.
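A quick numerical sanity check can make the two IMPORTANT FORMULAS concrete. Here is a minimal sketch in Python/NumPy (the dimensions, the covariance matrix, and the observed value y1 below are made-up illustration values, not from the notes): condition a 3-dimensional Gaussian on its first two coordinates and compare against a brute-force Monte Carlo estimate.

```python
# Minimal check of the conditional-Gaussian formulas.
# All numerical values below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0, 0.5])            # stacked (mu1, mu2); Y1 = first 2 coords
S = np.array([[2.0, 0.6, 0.5],
              [0.6, 1.0, 0.3],
              [0.5, 0.3, 1.5]])            # joint covariance, positive definite

S11, S12 = S[:2, :2], S[:2, 2:]
S21, S22 = S[2:, :2], S[2:, 2:]
mu1, mu2 = mu[:2], mu[2:]

y1 = np.array([1.5, -1.0])                 # observed value of Y1

# IMPORTANT FORMULAS: conditional mean and variance of Y2 | Y1 = y1
cond_mean = mu2 + S21 @ np.linalg.solve(S11, y1 - mu1)
cond_var = S22 - S21 @ np.linalg.solve(S11, S12)

# Monte Carlo check: sample the joint, keep draws with Y1 near y1
draws = rng.multivariate_normal(mu, S, size=2_000_000)
near = np.all(np.abs(draws[:, :2] - y1) < 0.05, axis=1)
print(cond_mean, draws[near, 2].mean())    # should roughly agree
print(cond_var, draws[near, 2].var())      # should roughly agree
```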

Motivation:

Suppose $Y_1 \sim N(\mu_1, \Sigma_{11})$.

Let $Y_2 = \nu + B Y_1 + \epsilon$, where $\epsilon \sim N(0, \Omega)$ is independent of $Y_1$.

Then marginally $E(Y_2) = \nu + B\mu_1$ and $\operatorname{Var}(Y_2) = B\Sigma_{11}B' + \Omega$,

and conditionally $Y_2 \mid Y_1 \sim N(\nu + BY_1, \Omega)$.

Write $\mu_2 = \nu + B\mu_1$.

So marginally $Y_2 \sim N(\mu_2, B\Sigma_{11}B' + \Omega)$.

Note that $\operatorname{Cov}(Y_2, Y_1) = B\operatorname{Var}(Y_1) = B\Sigma_{11}$.

Call this $\Sigma_{21}$, and call $B\Sigma_{11}B' + \Omega = \Sigma_{22}$.

Then $B = \Sigma_{21}\Sigma_{11}^{-1}$,

and $E(Y_2 \mid Y_1) = \nu + BY_1 = \mu_2 + \Sigma_{21}\Sigma_{11}^{-1}(Y_1 - \mu_1)$,

and $\operatorname{Var}(Y_2 \mid Y_1) = \Omega = \Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}$.

This confirms the equations on the first page!
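The key identification step, $\operatorname{Cov}(Y_2, Y_1) = B\Sigma_{11}$, can also be checked by simulation. A small sketch (all numerical values below are made up for illustration):

```python
# Sketch: empirical covariance of (Y1, Y2) under Y2 = nu + B @ Y1 + eps
# matches the identification Cov(Y2, Y1) = B @ Sigma11. Values are made up.
import numpy as np

rng = np.random.default_rng(1)
mu1 = np.array([0.5, -1.0])
Sigma11 = np.array([[1.0, 0.4], [0.4, 2.0]])
B = np.array([[2.0, -1.0]])                # 1 x 2: Y2 is scalar here
nu, Omega = 0.3, np.array([[0.25]])

n = 500_000
Y1 = rng.multivariate_normal(mu1, Sigma11, size=n)
eps = rng.multivariate_normal([0.0], Omega, size=n)
Y2 = nu + Y1 @ B.T + eps

emp = np.cov(Y2.T, Y1.T)                   # 3x3 covariance of (Y2, Y1)
print(emp[0, 1:], (B @ Sigma11).ravel())   # Cov(Y2, Y1) vs B Sigma11
print(emp[0, 0], (B @ Sigma11 @ B.T + Omega).item())  # marginal Var(Y2)
```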

Bivariate view: with $\Sigma_{11} = \sigma_1^2$, $\Sigma_{22} = \sigma_2^2$, and $\Sigma_{21} = \rho\sigma_1\sigma_2$, the formulas reduce to

$$E(Y_2 \mid Y_1) = \mu_2 + \rho\frac{\sigma_2}{\sigma_1}(Y_1 - \mu_1), \qquad \operatorname{Var}(Y_2 \mid Y_1) = \sigma_2^2(1 - \rho^2).$$


Exercises: What are the dimensions of each matrix?

Check the conformability of the matrix multiplications.

Memorize formulas for conditional Gaussian means and variances.

Memorize formulas for marginal means and variances.

Application to Bayes posterior: take a Gaussian prior $\theta \sim N(\mu, \Omega)$ and a Gaussian model $X \mid \theta \sim N(Z\theta, \Sigma)$. The pair $(\theta, X)$ is then jointly Gaussian, so the conditional formulas above give the posterior mean and variance of $\theta \mid X$ directly.

Magic! No need to complete the square: much less tedium, less algebra, less likely to make a mistake.
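As an illustration of the "magic", here is a minimal sketch (Z, Ω, Σ, and the dimensions are made-up values) that forms the joint covariance of (θ, X) and reads the posterior straight off the conditional formulas:

```python
# Sketch of the "no completing the square" route: build the joint Gaussian
# of (theta, X) and condition. Dimensions and values below are made up.
import numpy as np

rng = np.random.default_rng(3)
n, p = 10, 2
Z = rng.standard_normal((n, p))           # design matrix
mu = np.zeros(p)                          # prior mean
Omega = np.eye(p)                         # prior covariance
Sigma = 0.25 * np.eye(n)                  # model covariance

# Joint distribution of (theta, X):
#   mean (mu, Z mu); Cov(theta) = Omega; Cov(X) = Z Omega Z' + Sigma;
#   Cov(theta, X) = Omega Z'.
C_tx = Omega @ Z.T
C_xx = Z @ Omega @ Z.T + Sigma

X = Z @ rng.multivariate_normal(mu, Omega) \
    + rng.multivariate_normal(np.zeros(n), Sigma)

# Conditional (= posterior) mean and variance, straight from the formulas:
post_mean = mu + C_tx @ np.linalg.solve(C_xx, X - Z @ mu)
post_var = Omega - C_tx @ np.linalg.solve(C_xx, C_tx.T)
print(post_mean, np.diag(post_var))
```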

OBOY, a useful computing trick!!

What to do when the data vector $X$ is long, so that the $n \times n$ inverse $(Z\Omega Z' + \Sigma)^{-1}$ is expensive?

For computation, instead of

$$E(\theta \mid X) = \mu + \Omega Z'(Z\Omega Z' + \Sigma)^{-1}(X - Z\mu), \qquad \operatorname{Var}(\theta \mid X) = \Omega - \Omega Z'(Z\Omega Z' + \Sigma)^{-1}Z\Omega,$$

the following alternative is used:

$$\operatorname{Var}(\theta \mid X) = (\Omega^{-1} + Z'\Sigma^{-1}Z)^{-1}, \qquad E(\theta \mid X) = (\Omega^{-1} + Z'\Sigma^{-1}Z)^{-1}(\Omega^{-1}\mu + Z'\Sigma^{-1}X).$$

This involves only $p \times p$ inverses, except $\Sigma^{-1}$, which is generally simple (typically $\Sigma = \sigma^2 I_n$).

Notice that this has the flavor of "the posterior precision is the sum of the prior precision and the model precision": $\Omega^{-1} + Z'\Sigma^{-1}Z$.

Derivation: See document on the web site.
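Pending that derivation, one can at least verify numerically that the two forms agree. A minimal sketch (all dimensions and matrices below are made-up test values):

```python
# Sketch: the two forms of the posterior mean/variance agree numerically.
# Dimensions n=50, p=3 and all matrices below are made-up test values.
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 3
Z = rng.standard_normal((n, p))
mu = rng.standard_normal(p)
A = rng.standard_normal((p, p))
Omega = A @ A.T + p * np.eye(p)            # prior covariance
Sigma = 0.5 * np.eye(n)                    # model covariance
theta = rng.multivariate_normal(mu, Omega)
X = Z @ theta + rng.multivariate_normal(np.zeros(n), Sigma)

# Form 1: condition the joint Gaussian (one n x n solve)
K = Z @ Omega @ Z.T + Sigma
mean1 = mu + Omega @ Z.T @ np.linalg.solve(K, X - Z @ mu)
var1 = Omega - Omega @ Z.T @ np.linalg.solve(K, Z @ Omega)

# Form 2: precision form (only p x p inverses, plus the easy Sigma^{-1})
P = np.linalg.inv(Omega) + Z.T @ np.linalg.solve(Sigma, Z)
mean2 = np.linalg.solve(P, np.linalg.solve(Omega, mu)
                        + Z.T @ np.linalg.solve(Sigma, X))
var2 = np.linalg.inv(P)

print(np.allclose(mean1, mean2), np.allclose(var1, var2))  # True True
```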

Application to multivariate regression: generalization for higher dimensions:

Suppose $X = Z\theta + \epsilon$ with $\epsilon \sim N(0, \Sigma)$ independent of $\theta \sim N(\mu, \Omega)$, a regression model where $Z$ is the design matrix. (In the notation of the Motivation, $Y_1 = \theta$ and $Y_2 = X$, with $B = Z$.)

Exercise (yes please, on paper and hand in for next Tuesday):

Under "motivation", for the regression model, you can see the mean vector and variance matrix for Y1 and also for Y2.

Now write the joint distribution of the predictor Y1 and the outcome Y2.

(Only one thing you have to supply – the covariance.)

Next, starting with this joint distribution, using the formulas, derive the conditional mean and variance for Y2|Y1.

(The calibration problem is the reverse conditioning: the conditional mean and variance for Y1|Y2!)

For next Thursday 9/30/2009: Starting with the "multivariate regression" example, use the joint distribution of X and θ, together with our formulas for the conditional mean and variance, to calculate the posterior mean and variance of the regression parameter vector θ.

Contemplate what it would be like to try to do this without matrix algebra!!!!!!! Ouch.

This view of linear regression will be enormously helpful when we deal with hierarchical models and empirical Bayes. There are also strong connections with ridge regression, regularization, calibration problems, Gaussian networks, and many other important things.