
Quantitative Techniques

Topic 5
Multiple Regression Analysis

Reading: GJ (Ch. 7 & 8);

Two Explanatory Variables
(Three Variable Regression Model)
Yi = β1 + β2X2i + β3X3i + ui

The Xi's affect Yi separately:

∂Yi/∂X2i = β2        ∂Yi/∂X3i = β3
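A quick numeric sketch of these separate effects (made-up coefficients, not from the slides): for a plane Yi = b1 + b2·X2i + b3·X3i, b2 is the change in Y from a one-unit change in X2 with X3 held fixed, and likewise for b3.

```python
def yhat(x2, x3, b1=1.0, b2=2.0, b3=3.0):
    """A plane Yi = b1 + b2*X2i + b3*X3i with made-up coefficients."""
    return b1 + b2 * x2 + b3 * x3

# b2 is the partial effect of X2 with X3 held fixed (and vice versa):
print(yhat(5.0, 7.0) - yhat(4.0, 7.0))  # 2.0 = b2
print(yhat(5.0, 8.0) - yhat(5.0, 7.0))  # 3.0 = b3
```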
The General Model
Yi = β1 + β2X2i + β3X3i + . . . + βKXKi + ui

The parameter β1 is the intercept (constant) term.

The variable attached to β1 is X1i = 1.

Usually, the number of explanatory variables is said to be K − 1 (ignoring X1i = 1), while the number of parameters is K (namely β1, . . . , βK).
Assumptions of Multiple
Linear Regression Model
1. E(ui | X2i, X3i) = 0
2. var(ui) = σ²
3. Cov(ui, uj) = 0 for i ≠ j
4. Cov(ui, X2i) = Cov(ui, X3i) = 0
5. X2i and X3i are not perfectly correlated
6. The model is correctly specified
7. ui ~ N(0, σ²)
Least Squares Estimation

The set of normal equations is:

ΣYi    = n·b1 + b2·ΣX2i + b3·ΣX3i
ΣYiX2i = b1·ΣX2i + b2·ΣX2i² + b3·ΣX2iX3i
ΣYiX3i = b1·ΣX3i + b2·ΣX2iX3i + b3·ΣX3i²
Least Squares Estimators

In deviation form (lowercase letters denote deviations from sample means, e.g. yi = Yi − Ȳ):

b2 = (Σyix2i · Σx3i² − Σyix3i · Σx2ix3i) / (Σx2i² · Σx3i² − (Σx2ix3i)²)

b3 = (Σyix3i · Σx2i² − Σyix2i · Σx2ix3i) / (Σx2i² · Σx3i² − (Σx2ix3i)²)

b1 = Ȳ − b2X̄2 − b3X̄3
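The deviation-form estimators can be sketched in pure Python (a minimal illustration on made-up data; the function and variable names are my own):

```python
def ols_two_regressors(Y, X2, X3):
    """OLS estimates for Yi = b1 + b2*X2i + b3*X3i + ei,
    computed with the deviation-form formulas."""
    n = len(Y)
    Ybar, X2bar, X3bar = sum(Y) / n, sum(X2) / n, sum(X3) / n
    # deviations from sample means
    y = [v - Ybar for v in Y]
    x2 = [v - X2bar for v in X2]
    x3 = [v - X3bar for v in X3]
    Syx2 = sum(a * b for a, b in zip(y, x2))
    Syx3 = sum(a * b for a, b in zip(y, x3))
    Sx2x2 = sum(a * a for a in x2)
    Sx3x3 = sum(a * a for a in x3)
    Sx2x3 = sum(a * b for a, b in zip(x2, x3))
    # denominator is nonzero unless X2 and X3 are perfectly correlated
    den = Sx2x2 * Sx3x3 - Sx2x3 ** 2
    b2 = (Syx2 * Sx3x3 - Syx3 * Sx2x3) / den
    b3 = (Syx3 * Sx2x2 - Syx2 * Sx2x3) / den
    b1 = Ybar - b2 * X2bar - b3 * X3bar
    return b1, b2, b3

# On noiseless data the true coefficients are recovered exactly:
X2 = [1, 2, 3, 4, 5, 6]
X3 = [2, 1, 4, 3, 6, 5]
Y = [10 + 2 * a + 3 * b for a, b in zip(X2, X3)]  # b1=10, b2=2, b3=3
print(ols_two_regressors(Y, X2, X3))
```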
Variance of OLS Estimators

var(b2) = σ² / [Σx2i² · (1 − r23²)]

var(b3) = σ² / [Σx3i² · (1 − r23²)]

where r23 is the sample correlation coefficient between X2 and X3.
When r23 = 0 these reduce to the simple regression formulas.
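A small sketch of the var(b2) formula (the helper name is my own; the inputs are the regressors already in deviation form, and σ² is taken as known for the illustration):

```python
def var_b2(x2_dev, x3_dev, sigma2):
    """var(b2) = sigma^2 / (sum(x2^2) * (1 - r23^2)),
    with x2_dev, x3_dev the regressors in deviation-from-mean form."""
    Sx2x2 = sum(a * a for a in x2_dev)
    Sx3x3 = sum(a * a for a in x3_dev)
    Sx2x3 = sum(a * b for a, b in zip(x2_dev, x3_dev))
    r23_sq = Sx2x3 ** 2 / (Sx2x2 * Sx3x3)
    return sigma2 / (Sx2x2 * (1 - r23_sq))

# With r23 = 0 this collapses to the simple-regression formula sigma^2 / sum(x2^2):
x2 = [-1.0, 0.0, 1.0]
x3 = [1.0, -2.0, 1.0]              # orthogonal to x2: sum(x2*x3) = 0
print(var_b2(x2, x3, sigma2=2.0))  # 2.0 / 2.0 = 1.0
```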
Properties of OLS Estimators

Gauss-Markov Theorem
Under the assumptions of the
multiple regression model, the
ordinary least squares estimators
have the smallest variance of
all linear and unbiased estimators.
This means that the least squares
estimators are the Best Linear
Unbiased Estimators (BLUE).
Normal

bk ~ N(βk, var(bk))

This implies and is implied by: ui ~ N(0, σ²)

z = (bk − βk) / √var(bk) ~ N(0, 1)  for k = 1, 2, . . . , K
Student-t

t = (bk − βk) / se(bk)

where se(bk) = √var̂(bk) is the estimated standard error of bk.

t has a Student-t distribution with df = (n − K).
Goodness-of-Fit
Coefficient of Determination

R² = ESS / TSS = Σi=1..n (Ŷi − Ȳ)² / Σi=1..n (Yi − Ȳ)²

0 ≤ R² ≤ 1
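R² can be computed directly from the fitted values (a minimal sketch; the function name is my own, and the ESS/TSS form assumes the regression includes an intercept):

```python
def r_squared(Y, Y_hat):
    """R^2 = ESS/TSS = sum((Yhat - Ybar)^2) / sum((Y - Ybar)^2).
    Valid in this form when the regression includes an intercept."""
    n = len(Y)
    Ybar = sum(Y) / n
    ess = sum((f - Ybar) ** 2 for f in Y_hat)
    tss = sum((y - Ybar) ** 2 for y in Y)
    return ess / tss

# A perfect fit gives R^2 = 1; fitted values all equal to Ybar give R^2 = 0.
Y = [1.0, 2.0, 3.0, 4.0]
print(r_squared(Y, Y))          # 1.0
print(r_squared(Y, [2.5] * 4))  # 0.0
```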
Adjusted R-Squared
Adjusted Coefficient of Determination

Original:

R² = ESS / TSS = 1 − RSS / TSS

Adjusted:

R̄² = 1 − [RSS / (n − K)] / [TSS / (n − 1)]
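The adjustment can be sketched as follows, using the algebraically equivalent form R̄² = 1 − (1 − R²)·(n − 1)/(n − K) (function name and example numbers are my own):

```python
def adjusted_r_squared(r2, n, K):
    """Adjusted R^2 = 1 - (RSS/(n-K)) / (TSS/(n-1))
                    = 1 - (1 - R^2) * (n - 1) / (n - K)."""
    return 1 - (1 - r2) * (n - 1) / (n - K)

# Adding regressors can only raise R^2, but the adjustment penalises
# the lost degrees of freedom, so the adjusted value sits below 0.90 here:
print(adjusted_r_squared(0.90, n=30, K=3))
```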
Examples of Multiple Variable Regression Model

1) Cobb-Douglas production function:

Yi = β1 · X2i^β2 · X3i^β3 · e^ui, which is linear in the logs:
ln Yi = ln β1 + β2 ln X2i + β3 ln X3i + ui

2) Polynomial regression models:

Yi = β1 + β2Xi + β3Xi² + . . . + βKXi^(K−1) + ui
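Both examples are linear in the parameters after a suitable transformation. As a sketch (made-up data and a compact deviation-form OLS helper of my own), fitting the log form of a Cobb-Douglas function recovers the elasticities:

```python
import math

def fit2(Y, X2, X3):
    """Deviation-form OLS for Yi = b1 + b2*X2i + b3*X3i + ei (my own helper)."""
    n = len(Y)
    mY, m2, m3 = sum(Y) / n, sum(X2) / n, sum(X3) / n
    y = [v - mY for v in Y]
    x2 = [v - m2 for v in X2]
    x3 = [v - m3 for v in X3]
    S22 = sum(a * a for a in x2)
    S33 = sum(a * a for a in x3)
    S23 = sum(a * b for a, b in zip(x2, x3))
    Sy2 = sum(a * b for a, b in zip(y, x2))
    Sy3 = sum(a * b for a, b in zip(y, x3))
    den = S22 * S33 - S23 ** 2
    b2 = (Sy2 * S33 - Sy3 * S23) / den
    b3 = (Sy3 * S22 - Sy2 * S23) / den
    return mY - b2 * m2 - b3 * m3, b2, b3

# Cobb-Douglas: Y = 2 * L^0.7 * K^0.3 becomes linear after taking logs.
L = [1.0, 2.0, 3.0, 4.0, 5.0]
K = [5.0, 3.0, 6.0, 2.0, 4.0]
Y = [2.0 * l ** 0.7 * k ** 0.3 for l, k in zip(L, K)]
b1, b2, b3 = fit2([math.log(v) for v in Y],
                  [math.log(v) for v in L],
                  [math.log(v) for v in K])
print(round(b2, 6), round(b3, 6))  # elasticities 0.7 and 0.3
```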
Hypothesis Testing in Multiple Variable Regression Model

1. Testing hypotheses about an individual partial regression coefficient.

2. Testing the overall significance of the estimated multiple regression model, that is, finding out if all the partial slope coefficients are simultaneously equal to zero.

3. Testing that two or more coefficients are equal to one another.

4. Testing that the partial regression coefficients satisfy certain restrictions.

5. Testing the stability of the estimated regression model over time or in different cross-sectional units.

6. Testing the functional form of regression models.
Student-t Test

Yi = β1 + β2X2i + β3X3i + β4X4i + ui

Student-t tests can be used to test any linear combination of the regression coefficients:

H0: β1 = 0        H0: β2 + β3 + β4 = 1

Every such t-test has exactly n − K degrees of freedom, where K = # coefficients estimated (including the intercept).
Interval Estimation

P( −tc ≤ (bk − βk)/se(bk) ≤ tc ) = 1 − α

tc is the critical value for (n − K) degrees of freedom such that P( t > tc ) = α/2.

Interval endpoints: bk ± tc · se(bk)
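A sketch of the interval computation (the critical value is passed in, as it would be read from a t-table; the numbers are illustrative, not from the slides):

```python
def conf_interval(bk, se_bk, t_crit):
    """Interval endpoints bk +/- tc * se(bk); tc comes from a
    Student-t table for (n - K) df and the chosen alpha."""
    return bk - t_crit * se_bk, bk + t_crit * se_bk

# Illustrative numbers: b2 = 2.5, se(b2) = 0.8, and tc = 2.042
# for 30 df at the 5% level (two-sided).
lo, hi = conf_interval(2.5, 0.8, 2.042)
print(round(lo, 4), round(hi, 4))  # 0.8664 4.1336
```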
One Tail Test

Yi = β1 + β2X2i + β3X3i + β4X4i + ui

H0: β3 ≤ 0
H1: β3 > 0

t = b3 / se(b3) ~ t(n−K),  df = n − K = n − 4

Reject H0 if t > tc.
Two Tail Test

Yi = β1 + β2X2i + β3X3i + β4X4i + ui

H0: β2 = 0
H1: β2 ≠ 0

t = b2 / se(b2) ~ t(n−K),  df = n − K = n − 4

Reject H0 if |t| > tc.
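The one- and two-tail tests use the same statistic; only the rejection rule differs. A sketch with illustrative numbers (not from the slides):

```python
def t_stat(bk, se_bk, beta_null=0.0):
    """t = (bk - beta_null) / se(bk), compared against t(n - K)."""
    return (bk - beta_null) / se_bk

# Illustrative numbers: b2 = 1.5, se(b2) = 0.6.
t = t_stat(1.5, 0.6)
print(round(t, 2))  # 2.5
# Two-tail rule at the 5% level with 26 df (tc = 2.056):
# reject H0 since |t| = 2.5 > 2.056.
```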
F-Test of Entire Equation

Yi = β1 + β2X2i + β3X3i + ui

H0: β2 = β3 = 0
H1: H0 not true

We ignore β1. Why? Because the test asks whether the regressors jointly explain Y, not whether Y has a nonzero mean.

F = [ESS/(K−1)] / [RSS/(n−K)] ~ F(K−1, n−K)
F-Test of Entire Equation

Alternative specification for the F-test:

F = [R²/(K−1)] / [(1−R²)/(n−K)]
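The R² form of the overall F-test is easy to compute directly (illustrative numbers, not from the slides):

```python
def f_stat_overall(r2, n, K):
    """Overall-significance F statistic from R^2:
    F = (R^2/(K-1)) / ((1-R^2)/(n-K)), df = (K-1, n-K)."""
    return (r2 / (K - 1)) / ((1 - r2) / (n - K))

# Illustrative numbers: R^2 = 0.8, n = 23, K = 3.
print(f_stat_overall(0.8, n=23, K=3))  # (0.8/2)/(0.2/20) = 40.0
```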
Multiple Restriction F-Test

H0: β2 = 0, β4 = 0
H1: H0 not true

F = [(RSSR − RSSU)/J] / [RSSU/(n−K)]

dfn = J (# of restrictions)
dfd = n − K (df in the unrestricted model)

First run the restricted regression by dropping X2i and X4i to get RSSR.
Next run the unrestricted regression to get RSSU.
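The restriction test can be sketched as follows (illustrative RSS values, not from the slides):

```python
def f_stat_restrictions(rss_r, rss_u, J, n, K):
    """F = ((RSSR - RSSU)/J) / (RSSU/(n-K)), df = (J, n-K)."""
    return ((rss_r - rss_u) / J) / (rss_u / (n - K))

# Illustrative numbers: dropping two regressors raises RSS from 50 to 80,
# with n = 30 and K = 5 in the unrestricted model.
print(f_stat_restrictions(80.0, 50.0, J=2, n=30, K=5))  # (30/2)/(50/25) = 7.5
```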
F-Tests

F-tests of this type are always right-tailed, even for left-sided or two-sided hypotheses, because any deviation from the null will make the F value bigger (move it rightward).

F = [(RSSR − RSSU)/J] / [RSSU/(n−K)]

Reject H0 if F > Fc.
F-Test of Entire Equation (revisited)

Yi = β1 + β2X2i + β3X3i + ui

H0: β2 = β3 = 0
H1: H0 not true

F = [(RSSR − RSSU)/J] / [RSSU/(n−K)]
dfn = J
dfd = n − K
α = 0.05

Reject H0!
Testing for equality between two regression coefficients

H0: β3 = β4        H1: β3 ≠ β4

t = (b3 − b4) / se(b3 − b4) ~ t(n−K)

where se(b3 − b4) = √[var(b3) + var(b4) − 2·cov(b3, b4)]
Nonsample Information

A certain production process is known to be Cobb-Douglas with constant returns to scale.

ln(Yi) = β1 + β2 ln(X2i) + β3 ln(X3i) + β4 ln(X4i) + ui

where β2 + β3 + β4 = 1, i.e. β4 = 1 − β2 − β3. Substituting the restriction:

ln(Yi/X4i) = β1 + β2 ln(X2i/X4i) + β3 ln(X3i/X4i) + ui

Y*i = β1 + β2X*2i + β3X*3i + ui

Run least squares on the transformed model. Interpret the coefficients the same as in the original model, and recover b4 = 1 − b2 − b3.
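The whole procedure can be sketched end-to-end (made-up constant-returns data and a compact deviation-form OLS helper of my own): transform, estimate, then recover b4 from the restriction.

```python
import math

def fit2(Y, X2, X3):
    """Deviation-form OLS for Yi = b1 + b2*X2i + b3*X3i + ei (my own helper)."""
    n = len(Y)
    mY, m2, m3 = sum(Y) / n, sum(X2) / n, sum(X3) / n
    y = [v - mY for v in Y]
    x2 = [v - m2 for v in X2]
    x3 = [v - m3 for v in X3]
    S22 = sum(a * a for a in x2)
    S33 = sum(a * a for a in x3)
    S23 = sum(a * b for a, b in zip(x2, x3))
    Sy2 = sum(a * b for a, b in zip(y, x2))
    Sy3 = sum(a * b for a, b in zip(y, x3))
    den = S22 * S33 - S23 ** 2
    b2 = (Sy2 * S33 - Sy3 * S23) / den
    b3 = (Sy3 * S22 - Sy2 * S23) / den
    return mY - b2 * m2 - b3 * m3, b2, b3

# Constant-returns Cobb-Douglas data: Y = 3 * X2^0.5 * X3^0.2 * X4^0.3
X2 = [2.0, 3.0, 5.0, 7.0, 11.0]
X3 = [4.0, 2.0, 6.0, 3.0, 5.0]
X4 = [1.0, 5.0, 2.0, 4.0, 3.0]
Y = [3.0 * a ** 0.5 * b ** 0.2 * c ** 0.3 for a, b, c in zip(X2, X3, X4)]

# Transform: divide through by X4 and take logs, then run OLS.
ystar = [math.log(y / c) for y, c in zip(Y, X4)]
x2s = [math.log(a / c) for a, c in zip(X2, X4)]
x3s = [math.log(b / c) for b, c in zip(X3, X4)]
b1, b2, b3 = fit2(ystar, x2s, x3s)
b4 = 1 - b2 - b3  # recovered from the restriction
print(round(b2, 6), round(b3, 6), round(b4, 6))  # 0.5 0.2 0.3
```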
