Introduction to Econometrics, 3rd Edition, by James H. Stock and Mark W. Watson

Edition 3, ISBN: 978-9352863501
Exercise 16
This exercise shows that the OLS estimator of a subset of the regression coefficients is consistent under the conditional mean independence assumption stated in Appendix 7.2. Consider the multiple regression model in matrix form $Y = X\beta + W\gamma + u$, where $X$ and $W$ are, respectively, $n \times k_1$ and $n \times k_2$ matrices of regressors. Let $X_i'$ and $W_i'$ denote the $i$th rows of $X$ and $W$ [as in Equation (18.3)]. Assume that (i) $E(u_i \mid X_i, W_i) = W_i'\delta$, where $\delta$ is a $k_2 \times 1$ vector of unknown parameters; (ii) $(X_i, W_i, Y_i)$ are i.i.d.; (iii) $(X_i, W_i, u_i)$ have four finite, nonzero moments; and (iv) there is no perfect multicollinearity. These are Assumptions #1 through #4 of Key Concept 18.1, with the conditional mean independence assumption (i) replacing the usual conditional mean zero assumption.
a. Use the expression for $\hat{\beta}$ given in Exercise 18.6 to write $\hat{\beta} - \beta = \bigl(n^{-1}X'M_W X\bigr)^{-1}\bigl(n^{-1}X'M_W u\bigr)$, where $M_W = I_n - W(W'W)^{-1}W'$.
b. Show that $n^{-1}X'M_W X \xrightarrow{p} \Sigma_{XX} - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WX}$, where $\Sigma_{XX} = E(X_i X_i')$, $\Sigma_{XW} = E(X_i W_i')$, and so forth. [The matrix $A_n \xrightarrow{p} A$ if $A_{n,ij} \xrightarrow{p} A_{ij}$ for all $i, j$, where $A_{n,ij}$ and $A_{ij}$ are the $(i, j)$ elements of $A_n$ and $A$.]
c. Show that assumptions (i) and (ii) imply that $E(u \mid X, W) = W\delta$.
d. Use (c) and the law of iterated expectations to show that $n^{-1}X'M_W u \xrightarrow{p} 0_{k_1 \times 1}$.
e. Use (a) through (d) to conclude that, under conditions (i) through (iv), $\hat{\beta} \xrightarrow{p} \beta$.
Explanation

a) The given regression equation is $Y = X\beta + W\gamma + u$, and the goal is to show that $\hat{\beta}$, the OLS estimator of $\beta$, is consistent. Sketches of the standard argument for each part are given below.
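For part (a), a minimal sketch, assuming the Exercise 18.6 expression takes the partitioned-regression (Frisch-Waugh) form $\hat{\beta} = (X'M_W X)^{-1}X'M_W Y$ with the annihilator matrix $M_W = I_n - W(W'W)^{-1}W'$, so that $M_W W = 0$:

\[
% (a): substitute Y = X\beta + W\gamma + u into the Exercise 18.6
% expression; the annihilator matrix M_W wipes out the W\gamma term.
\begin{aligned}
\hat{\beta} &= (X'M_W X)^{-1}X'M_W(X\beta + W\gamma + u) \\
            &= \beta + (X'M_W X)^{-1}X'M_W u
               \qquad \text{(since } M_W W = 0\text{)} \\
\hat{\beta} - \beta &= \bigl(n^{-1}X'M_W X\bigr)^{-1}\bigl(n^{-1}X'M_W u\bigr),
\end{aligned}
\]

where the last line simply multiplies and divides by $n$.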

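For part (b), the expansion below assumes $n^{-1}W'W$ is invertible, which holds with probability approaching one given (iv); each block of sample second moments converges by the weak law of large numbers (using (ii) and (iii)), and Slutsky's theorem handles the inverse and the product:

\[
% (b): expand into blocks of sample second moments, then take
% probability limits block by block.
\begin{aligned}
n^{-1}X'M_W X
  &= n^{-1}X'X - \bigl(n^{-1}X'W\bigr)\bigl(n^{-1}W'W\bigr)^{-1}\bigl(n^{-1}W'X\bigr), \\
n^{-1}X'X = n^{-1}\sum_{i=1}^{n} X_i X_i' &\xrightarrow{p} \Sigma_{XX},
  \qquad n^{-1}X'W \xrightarrow{p} \Sigma_{XW},
  \qquad n^{-1}W'W \xrightarrow{p} \Sigma_{WW}, \\
n^{-1}X'M_W X &\xrightarrow{p} \Sigma_{XX} - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WX},
\end{aligned}
\]

with convergence holding element by element, exactly the sense of $A_n \xrightarrow{p} A$ defined in the bracketed note.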
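For part (c), the i.i.d. assumption (ii) means observation $i$'s error is independent of every other observation's regressors, so conditioning on the full matrices $(X, W)$ reduces to conditioning on the $i$th row; a sketch:

\[
% (c): shrink the conditioning set using (ii), then apply (i);
% stacking the n rows gives the matrix statement.
E(u_i \mid X, W) = E(u_i \mid X_i, W_i) = W_i'\delta
\quad\Longrightarrow\quad
E(u \mid X, W) = W\delta.
\]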
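For part (d), the law of iterated expectations together with (c) pins down the cross moments: $E(X_i u_i) = E[X_i E(u_i \mid X_i, W_i)] = E(X_i W_i')\delta = \Sigma_{XW}\delta$, and likewise $E(W_i u_i) = \Sigma_{WW}\delta$. A sketch of how the two limit terms cancel:

\[
% (d): both plims contain the common factor \Sigma_{XW}\delta,
% so the difference vanishes.
\begin{aligned}
n^{-1}X'M_W u
  &= n^{-1}X'u - \bigl(n^{-1}X'W\bigr)\bigl(n^{-1}W'W\bigr)^{-1}\bigl(n^{-1}W'u\bigr) \\
  &\xrightarrow{p} \Sigma_{XW}\delta - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WW}\delta
   = 0_{k_1 \times 1}.
\end{aligned}
\]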
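For part (e), combining (a), (b), and (d): assumption (iv) is what rules out a singular limit matrix in (b), so the continuous mapping theorem applies to the inverse:

\[
% (e): consistency follows because the second factor converges to zero.
\hat{\beta} - \beta
  = \bigl(n^{-1}X'M_W X\bigr)^{-1}\bigl(n^{-1}X'M_W u\bigr)
  \xrightarrow{p}
  \bigl(\Sigma_{XX} - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WX}\bigr)^{-1}\,0 = 0,
\]

so $\hat{\beta} \xrightarrow{p} \beta$: the coefficients on $X$ are estimated consistently even though $E(u_i \mid X_i, W_i) \neq 0$, because the conditional mean of $u_i$ depends only on $W_i$.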