
Introduction to Econometrics, 3rd Edition, by James H. Stock and Mark W. Watson
Edition 3, ISBN: 978-9352863501
Exercise 16
This exercise shows that the OLS estimator of a subset of the regression coefficients is consistent under the conditional mean independence assumption stated in Appendix 7.2. Consider the multiple regression model in matrix form $Y = X\beta + W\gamma + u$, where $X$ and $W$ are, respectively, $n \times k_1$ and $n \times k_2$ matrices of regressors. Let $X_i'$ and $W_i'$ denote the $i$th rows of $X$ and $W$ [as in Equation (18.3)]. Assume that (i) $E(u_i \mid X_i, W_i) = W_i'\delta$, where $\delta$ is a $k_2 \times 1$ vector of unknown parameters; (ii) $(X_i, W_i, Y_i)$ are i.i.d.; (iii) $(X_i, W_i, u_i)$ have four finite, nonzero moments; and (iv) there is no perfect multicollinearity. These are Assumptions #1–#4 of Key Concept 18.1, with the conditional mean independence assumption (i) replacing the usual conditional mean zero assumption.

a. Use the expression for $\hat\beta$ given in Exercise 18.6 to write
$$\hat\beta - \beta = \left(n^{-1} X' M_W X\right)^{-1}\left(n^{-1} X' M_W u\right),$$
where $M_W = I - W(W'W)^{-1}W'$.

b. Show that $n^{-1} X' M_W X \xrightarrow{p} \Sigma_{XX} - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WX}$, where $\Sigma_{XX} = E(X_i X_i')$, $\Sigma_{XW} = E(X_i W_i')$, and so forth. [The matrix $A_n \xrightarrow{p} A$ if $A_{n,ij} \xrightarrow{p} A_{ij}$ for all $i, j$, where $A_{n,ij}$ and $A_{ij}$ are the $(i, j)$ elements of $A_n$ and $A$.]

c. Show that assumptions (i) and (ii) imply that $E(u \mid X, W) = W\delta$.

d. Use (c) and the law of iterated expectations to show that $n^{-1} X' M_W u \xrightarrow{p} 0$.

e. Use (a) through (d) to conclude that, under conditions (i) through (iv), $\hat\beta \xrightarrow{p} \beta$.
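A quick Monte Carlo check of the exercise's claim (a sketch, not part of the text; the sample size, coefficient values, and data-generating design below are all assumptions chosen for illustration): even though $E(u_i \mid X_i, W_i) = W_i'\delta \neq 0$, so the error is correlated with the included controls $W$, the OLS coefficient on $X$ should still be consistent for $\beta$.

```python
# Monte Carlo sketch: consistency of the OLS estimator of beta when the error
# satisfies E(u_i | X_i, W_i) = W_i' delta (conditional mean independence),
# i.e., u is correlated with W but mean-independent of X given W.
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 200_000, 1, 2
beta = np.array([2.0])           # coefficient of interest (assumed value)
gamma = np.array([1.0, -1.0])    # coefficients on the controls W (assumed)
delta = np.array([0.5, 0.3])     # E(u_i | X_i, W_i) = W_i' delta

W = rng.normal(size=(n, k2))
X = W @ np.array([[0.7], [0.4]]) + rng.normal(size=(n, k1))  # X correlated with W
u = W @ delta + rng.normal(size=n)   # violates E(u | X, W) = 0, satisfies (i)
Y = X @ beta + W @ gamma + u

# OLS of Y on [X, W]; the first k1 coefficients estimate beta.
Z = np.hstack([X, W])
coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
beta_hat = coef[:k1]
print(beta_hat)  # close to beta = 2.0 for large n
```

Note that the coefficients on $W$ estimate $\gamma + \delta$, not $\gamma$: the part of $u$ explained by $W$ is absorbed into the control coefficients, which is exactly why only $\beta$ (not $\gamma$) is identified under conditional mean independence.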
Explanation
a) The given regression equation is $Y = X\beta + W\gamma + u$. The...
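The chain of reasoning in parts (a) through (e) can be outlined as follows (a hedged sketch of the standard argument, not the book's worked solution):

```latex
\begin{align*}
% (a) Substitute Y = X\beta + W\gamma + u into
%     \hat\beta = (X'M_W X)^{-1} X'M_W Y and use M_W W = 0:
\hat\beta - \beta &= \bigl(n^{-1}X'M_W X\bigr)^{-1}\bigl(n^{-1}X'M_W u\bigr). \\
% (c) Stacking E(u_i \mid X_i, W_i) = W_i'\delta over i = 1,\dots,n gives,
%     by (ii) i.i.d. sampling,
E(u \mid X, W) &= W\delta. \\
% (d) By the law of iterated expectations and M_W W = 0,
E\bigl(X'M_W u\bigr) &= E\bigl[X'M_W\, E(u \mid X, W)\bigr]
  = E\bigl(X'M_W W\delta\bigr) = 0,
  \quad\text{so } n^{-1}X'M_W u \xrightarrow{\;p\;} 0. \\
% (e) Combining with (b), whose limit is invertible under (iv),
\hat\beta &\xrightarrow{\;p\;} \beta.
\end{align*}
```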

