
Introduction to Econometrics, 3rd Edition, by James H. Stock and Mark W. Watson
ISBN: 978-9352863501, Chapter 18, Exercise 16
This exercise shows that the OLS estimator of a subset of the regression coefficients is consistent under the conditional mean independence assumption stated in Appendix 7.2. Consider the multiple regression model in matrix form $Y = X\beta + W\gamma + u$, where $X$ and $W$ are, respectively, $n \times k_1$ and $n \times k_2$ matrices of regressors. Let $X_i'$ and $W_i'$ denote the $i$th rows of $X$ and $W$ [as in Equation (18.3)]. Assume that (i) $E(u_i \mid X_i, W_i) = W_i'\delta$, where $\delta$ is a $k_2 \times 1$ vector of unknown parameters; (ii) $(X_i, W_i, Y_i)$ are i.i.d.; (iii) $(X_i, W_i, u_i)$ have four finite, nonzero moments; and (iv) there is no perfect multicollinearity. These are Assumptions #1–#4 of Key Concept 18.1, with the conditional mean independence assumption (i) replacing the usual conditional mean zero assumption.
a. Use the expression for $\hat{\beta}$ given in Exercise 18.6 to write $\hat{\beta} - \beta = (n^{-1}X'M_W X)^{-1}(n^{-1}X'M_W u)$, where $M_W = I_n - W(W'W)^{-1}W'$.
b. Show that $n^{-1}X'X \xrightarrow{p} \Sigma_{XX}$, $n^{-1}X'W \xrightarrow{p} \Sigma_{XW}$, and so forth, where $\Sigma_{XW} = E(X_i W_i')$, and so forth. [The matrix $A_n \xrightarrow{p} A$ if $A_{n,ij} \xrightarrow{p} A_{ij}$ for all $i, j$, where $A_{n,ij}$ and $A_{ij}$ are the $(i,j)$ elements of $A_n$ and $A$.]
c. Show that assumptions (i) and (ii) imply that $E(u \mid X, W) = W\delta$.
d. Use (c) and the law of iterated expectations to show that $n^{-1}X'M_W u \xrightarrow{p} 0$.
e. Use (a) through (d) to conclude that, under conditions (i) through (iv), $\hat{\beta} \xrightarrow{p} \beta$; that is, the OLS estimator of $\beta$ is consistent.
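The expression in part (a) is the partitioned-regression (Frisch–Waugh) form of the OLS estimator from Exercise 18.6. As a numerical sanity check, here is a minimal NumPy sketch verifying that $(X'M_W X)^{-1}X'M_W Y$ reproduces the $\beta$ block of a full OLS fit of $Y$ on $[X \; W]$; the dimensions $k_1 = 2$, $k_2 = 1$, the sample size, and the coefficient values are illustrative assumptions, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 500, 2, 1                            # illustrative dimensions

X = rng.normal(size=(n, k1))
W = 0.5 * X[:, [0]] + rng.normal(size=(n, k2))   # W correlated with X
beta, gamma = np.array([1.0, -2.0]), np.array([0.7])
Y = X @ beta + W @ gamma + rng.normal(size=n)

# Full OLS of Y on [X W]: the estimate of beta is the first k1 coefficients.
Z = np.hstack([X, W])
coef_full = np.linalg.solve(Z.T @ Z, Z.T @ Y)

# Partitioned form from Exercise 18.6: beta_hat = (X'M_W X)^{-1} X'M_W Y,
# where M_W = I - W(W'W)^{-1}W' annihilates the columns of W.
M_W = np.eye(n) - W @ np.linalg.solve(W.T @ W, W.T)
beta_hat = np.linalg.solve(X.T @ M_W @ X, X.T @ M_W @ Y)

print(coef_full[:k1])   # beta block of the full regression
print(beta_hat)         # identical up to floating-point error
```

Both print statements produce the same vector, which is the algebraic identity that part (a) starts from.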
Explanation
a) The given regression model is $Y = X\beta + W\gamma + u$. From Exercise 18.6, the OLS estimator of $\beta$ is $\hat{\beta} = (X'M_W X)^{-1}X'M_W Y$, where $M_W = I_n - W(W'W)^{-1}W'$. Substituting $Y = X\beta + W\gamma + u$ and using $M_W W = 0$ gives $\hat{\beta} = \beta + (X'M_W X)^{-1}X'M_W u$, and dividing each factor by $n$ yields $\hat{\beta} - \beta = (n^{-1}X'M_W X)^{-1}(n^{-1}X'M_W u)$, as required.
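For the remaining parts, the argument runs as follows (a sketch, following the steps the exercise lays out): (b) each element of $n^{-1}X'X$, $n^{-1}X'W$, and so forth is a sample average of i.i.d. terms with finite variance by assumptions (ii) and (iii), so the law of large numbers applies element by element; (c) because observations are i.i.d., $E(u_i \mid X, W) = E(u_i \mid X_i, W_i) = W_i'\delta$, so $E(u \mid X, W) = W\delta$; (d) by the law of iterated expectations, $E(X_i u_i) = E(X_i W_i')\delta = \Sigma_{XW}\delta$ and $E(W_i u_i) = \Sigma_{WW}\delta$, so $n^{-1}X'M_W u \xrightarrow{p} \Sigma_{XW}\delta - \Sigma_{XW}\Sigma_{WW}^{-1}\Sigma_{WW}\delta = 0$; and (e) combining (a) through (d), $\hat{\beta} \xrightarrow{p} \beta$. The simulation below illustrates the conclusion under one concrete data-generating process (the parameter values and sample sizes are illustrative assumptions): with $E(u_i \mid X_i, W_i) = W_i'\delta \neq 0$, $\hat{\beta}$ settles down at $\beta$ as $n$ grows, while the coefficient on $W$ converges to $\gamma + \delta$ rather than $\gamma$.

```python
import numpy as np

rng = np.random.default_rng(1)
beta = np.array([1.0, -2.0])   # illustrative true coefficients
gamma = np.array([0.7])
delta = np.array([0.5])        # E(u|X, W) = W'delta, so u is not mean zero given W

def ols_on_x_and_w(n):
    """Draw a sample of size n satisfying (i)-(iv) and fit OLS of Y on [X, W]."""
    X = rng.normal(size=(n, 2))
    W = 0.5 * X[:, [0]] + rng.normal(size=(n, 1))   # W correlated with X
    u = W @ delta + rng.normal(size=n)              # conditional mean independence holds
    Y = X @ beta + W @ gamma + u
    Z = np.hstack([X, W])
    return np.linalg.solve(Z.T @ Z, Z.T @ Y)

for n in (100, 10_000, 1_000_000):
    coef = ols_on_x_and_w(n)
    # beta_hat -> (1.0, -2.0); the W coefficient -> gamma + delta = 1.2
    print(n, coef[:2].round(3), coef[2:].round(3))
```

Writing $u = W\delta + v$ with $E(v \mid X, W) = 0$, the fitted model is really $Y = X\beta + W(\gamma + \delta) + v$: the coefficient on $X$ is estimated consistently, which is the point of the exercise, while $\gamma$ and $\delta$ cannot be separated.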