
Introduction to Econometrics, 3rd Edition, by James H. Stock and Mark W. Watson
Edition 3, ISBN: 978-9352863501
Exercise 14
Consider the regression model without an intercept term,

$$Y_i = \beta_1 X_i + u_i, \qquad i = 1, \dots, n$$

(so the true value of the intercept, $\beta_0$, is zero).

a. Derive the least squares estimator of $\beta_1$ for the restricted regression model $Y_i = \beta_1 X_i + u_i$. This is called the restricted least squares estimator ($\hat\beta_1^{RLS}$) of $\beta_1$ because it is estimated under a restriction, which in this case is $\beta_0 = 0$.
b. Derive the asymptotic distribution of $\hat\beta_1^{RLS}$ under Assumptions #1 through #3 of Key Concept 17.1.
c. Show that $\hat\beta_1^{RLS}$ is linear [Equation (5.24)] and, under Assumptions #1 and #2 of Key Concept 17.1, conditionally unbiased [Equation (5.25)].
d. Derive the conditional variance of $\hat\beta_1^{RLS}$ under the Gauss-Markov conditions (Assumptions #1 through #4 of Key Concept 17.1).
e. Compare the conditional variance of $\hat\beta_1^{RLS}$ in (d) to the conditional variance of the OLS estimator $\hat\beta_1$ (from the regression including an intercept) under the Gauss-Markov conditions. Which estimator is more efficient? Use the formulas for the variances to explain why.
f. Derive the exact sampling distribution of $\hat\beta_1^{RLS}$ under Assumptions #1 through #5 of Key Concept 17.1.
g. Now consider the estimator $\tilde\beta_1 = \sum_{i=1}^n Y_i \big/ \sum_{i=1}^n X_i$. Derive an expression for $\mathrm{var}(\tilde\beta_1) - \mathrm{var}(\hat\beta_1^{RLS})$ under the Gauss-Markov conditions, and use this expression to show that $\mathrm{var}(\tilde\beta_1) \geq \mathrm{var}(\hat\beta_1^{RLS})$.
Explanation
a) The restricted regression model is given by $Y_i = \beta_1 X_i + u_i$, i.e., the model with the intercept restricted to zero.
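As a brief sketch of part (a) (not the full step-by-step solution), minimizing the restricted sum of squared residuals $\sum_{i=1}^n (Y_i - b_1 X_i)^2$ over $b_1$ and setting the first-order condition to zero gives

$$-2\sum_{i=1}^n X_i\big(Y_i - \hat\beta_1^{RLS} X_i\big) = 0 \quad\Longrightarrow\quad \hat\beta_1^{RLS} = \frac{\sum_{i=1}^n X_i Y_i}{\sum_{i=1}^n X_i^2},$$

which is the restricted least squares estimator.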
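For parts (d) and (e), a sketch of the variance comparison under the Gauss-Markov conditions, writing $\sigma_u^2$ for the (homoskedastic) conditional variance of $u_i$:

$$\mathrm{var}\big(\hat\beta_1^{RLS} \mid X_1,\dots,X_n\big) = \frac{\sigma_u^2}{\sum_{i=1}^n X_i^2}, \qquad \mathrm{var}\big(\hat\beta_1 \mid X_1,\dots,X_n\big) = \frac{\sigma_u^2}{\sum_{i=1}^n (X_i - \bar X)^2}.$$

Because $\sum_i X_i^2 = \sum_i (X_i - \bar X)^2 + n\bar X^2 \geq \sum_i (X_i - \bar X)^2$, the restricted estimator has the (weakly) smaller conditional variance, so it is the more efficient estimator when the restriction $\beta_0 = 0$ is true.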
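For part (g), taking the estimator to be the ratio $\tilde\beta_1 = \sum_i Y_i / \sum_i X_i$ as given above, substituting $Y_i = \beta_1 X_i + u_i$ yields $\tilde\beta_1 = \beta_1 + \sum_i u_i / \sum_i X_i$, so under the Gauss-Markov conditions a sketch of the comparison is

$$\mathrm{var}\big(\tilde\beta_1 \mid X\big) - \mathrm{var}\big(\hat\beta_1^{RLS} \mid X\big) = \sigma_u^2\left[\frac{n}{\big(\sum_i X_i\big)^2} - \frac{1}{\sum_i X_i^2}\right] = \sigma_u^2\,\frac{n\sum_i X_i^2 - \big(\sum_i X_i\big)^2}{\big(\sum_i X_i\big)^2 \sum_i X_i^2} \geq 0,$$

where the numerator is nonnegative by the Cauchy-Schwarz inequality; hence $\mathrm{var}(\tilde\beta_1) \geq \mathrm{var}(\hat\beta_1^{RLS})$.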