Some details to tidy up Jan 23 2019

Summary of last week

Consider the linear regression model $Y = X\beta + \epsilon$, where $E(\epsilon) = 0_{n \times 1}$, $\operatorname{Var}(\epsilon) = \sigma^2 I_n$, and the matrix $X_{n \times p}$ is fixed with rank $p$.

The least squares estimates are $\hat\beta = (X^TX)^{-1}X^TY$.

Furthermore, the least squares estimates are BLUE, with $E(\hat\beta) = \beta$ and $\operatorname{Var}(\hat\beta) = \sigma^2 (X^TX)^{-1}$.

We have not used any Normality assumptions to show these properties.
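
As a quick numerical sanity check of last week's summary, here is a minimal Python sketch (the simulated design, coefficients, and error scale are made up for illustration) that computes $\hat\beta = (X^TX)^{-1}X^TY$ directly and confirms it matches NumPy's least squares solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))           # fixed design matrix, simulated for illustration
beta = np.array([1.0, -2.0, 0.5])     # made-up "true" coefficients
eps = rng.normal(scale=2.0, size=n)   # errors with E(eps) = 0 and Var(eps) = sigma^2 I
Y = X @ beta + eps

# Least squares estimate: beta_hat = (X^T X)^{-1} X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Agrees with NumPy's built-in least squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))  # True
```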

Today

Go over the estimation of $\sigma^2$

Strategy: write $\lVert e \rVert^2 = \sum_i e_i^2$ in terms of the uncorrelated errors $\epsilon_i$.

Write correlated residuals as a combination of uncorrelated errors

Claim:

$\lVert e \rVert^2 = \epsilon^T (I - H)\epsilon$

Your turn at home:

  1. Show $(I - H)\epsilon = e$. Hint: substitute $\epsilon = Y - X\beta$, expand, and use properties of $H$.

  2. Show $\lVert e \rVert^2 = e^Te = \epsilon^T(I - H)\epsilon$. Hint: substitute $e = (I - H)\epsilon$ from above and use properties of $(I - H)$.
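
The two exercises above are the real argument; as a numerical sanity check of the claim, here is a short Python sketch (simulated data, so $\epsilon$ is known) verifying $e = (I - H)\epsilon$ and $\lVert e\rVert^2 = \epsilon^T(I - H)\epsilon$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3
X = rng.normal(size=(n, p))            # fixed design, simulated for illustration
beta = np.array([1.0, -2.0, 0.5])      # made-up true coefficients
eps = rng.normal(scale=2.0, size=n)    # uncorrelated errors
Y = X @ beta + eps

H = X @ np.linalg.solve(X.T @ X, X.T)  # hat matrix H = X (X^T X)^{-1} X^T
e = Y - H @ Y                          # residuals e = (I - H) Y
I_n = np.eye(n)

print(np.allclose(e, (I_n - H) @ eps))            # e = (I - H) eps
print(np.allclose(e @ e, eps @ (I_n - H) @ eps))  # ||e||^2 = eps^T (I - H) eps
```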

Find the expected value of $\lVert e\rVert^2$ in terms of $\operatorname{trace}(I - H)$

Show $E\left(\epsilon^T(I - H)\epsilon\right) = \sigma^2 \operatorname{trace}(I - H)$

Hint: $x^T A x = \sum_{i=1}^n \sum_{j=1}^n x_i x_j A_{ij}$, where $x = (x_1, x_2, \ldots, x_n)^T$ and $A = \begin{pmatrix} A_{11} & A_{12} & \cdots \\ A_{21} & A_{22} & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}_{n \times n}$

Find the expected value of $\lVert e\rVert^2$ in terms of $\operatorname{trace}(I - H)$ (continued)

$E\left(\epsilon^T(I - H)\epsilon\right) = $
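
For reference, one way this expectation calculation can be written out, using only $E(\epsilon_i) = 0$, $E(\epsilon_i\epsilon_j) = 0$ for $i \neq j$, $E(\epsilon_i^2) = \sigma^2$, and the quadratic form hint above:

$$
\begin{aligned}
E\left(\epsilon^T (I - H)\epsilon\right)
 &= E\left(\sum_{i=1}^n \sum_{j=1}^n \epsilon_i \epsilon_j (I - H)_{ij}\right) \\
 &= \sum_{i=1}^n \sum_{j=1}^n (I - H)_{ij}\, E(\epsilon_i \epsilon_j)
   && \text{(linearity; the $(I - H)_{ij}$ are constants)} \\
 &= \sigma^2 \sum_{i=1}^n (I - H)_{ii}
   && \text{(cross terms vanish: the $\epsilon_i$ are uncorrelated with mean 0)} \\
 &= \sigma^2\, \operatorname{trace}(I - H).
\end{aligned}
$$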

Find $\operatorname{trace}(I - H)$

Show $\operatorname{trace}(I - H) = n - p$

Hint: $\operatorname{trace}(A + B) = \operatorname{trace}(A) + \operatorname{trace}(B)$ and $\operatorname{trace}(AB) = \operatorname{trace}(BA)$

$\operatorname{trace}(I - H) = $
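
For reference, a sketch of the trace calculation using the hint above together with $H = X(X^TX)^{-1}X^T$:

$$
\begin{aligned}
\operatorname{trace}(I - H)
 &= \operatorname{trace}(I_n) - \operatorname{trace}(H) \\
 &= n - \operatorname{trace}\left(X(X^TX)^{-1}X^T\right) \\
 &= n - \operatorname{trace}\left((X^TX)^{-1}X^TX\right)
   && \text{(using $\operatorname{trace}(AB) = \operatorname{trace}(BA)$)} \\
 &= n - \operatorname{trace}(I_p) \\
 &= n - p.
\end{aligned}
$$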

Put it all together

$E(\hat\sigma^2) = $
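
Combining the two results, and writing $\hat\sigma^2 = \lVert e\rVert^2 / (n - p)$ for the usual estimator (this definition is implicit above but worth stating), a sketch of the final step:

$$
E(\hat\sigma^2)
 = \frac{E\left(\lVert e\rVert^2\right)}{n - p}
 = \frac{E\left(\epsilon^T(I - H)\epsilon\right)}{n - p}
 = \frac{\sigma^2\, \operatorname{trace}(I - H)}{n - p}
 = \frac{\sigma^2 (n - p)}{n - p}
 = \sigma^2 .
$$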

Inference on the regression coefficients

Normality assumption

Assume $\epsilon \sim N(0, \sigma^2 I)$.

Important reminders:

Leads to: $Y \sim N(\quad , \quad)$

$\hat\beta \sim N(\quad , \quad)$
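
For reference, the standard results that fill in these blanks, using the fact that a fixed linear transformation of a multivariate Normal vector is again Normal, together with the means and variances from last week:

$$
Y = X\beta + \epsilon \sim N\!\left(X\beta,\ \sigma^2 I_n\right),
\qquad
\hat\beta = (X^TX)^{-1}X^TY \sim N\!\left(\beta,\ \sigma^2 (X^TX)^{-1}\right).
$$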

Inference on individual parameters

With the addition of the Normal assumption, it can be shown that

$\dfrac{\hat\beta_j - \beta_j}{SE(\hat\beta_j)} \sim t_{n-p}$

which leads to the usual construction of tests and confidence intervals for single parameters.
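
A minimal Python sketch of how this result is used in practice (simulated data; the variable names and the choice to test $H_0\colon \beta_j = 0$ at the 95% level are illustrative, not part of the notes):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p = 100, 3
X = rng.normal(size=(n, p))                  # fixed design, simulated for illustration
beta = np.array([1.0, -2.0, 0.5])            # made-up true coefficients
Y = X @ beta + rng.normal(scale=2.0, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y                 # least squares estimates
e = Y - X @ beta_hat                         # residuals
sigma2_hat = e @ e / (n - p)                 # sigma_hat^2 = ||e||^2 / (n - p)
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))  # SE(beta_hat_j)

j = 0
t_stat = beta_hat[j] / se[j]                          # test H0: beta_j = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - p)       # two-sided p-value
t_crit = stats.t.ppf(0.975, df=n - p)                 # t_{n-p} critical value
ci = (beta_hat[j] - t_crit * se[j], beta_hat[j] + t_crit * se[j])
print(t_stat, p_value, ci)                            # 95% CI for beta_j
```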

Exercises

See handout.