
(Requires Appendix Material) In Deriving the OLS Estimator, You Minimize with Respect to $\hat{\beta}_0$ and $\hat{\beta}_1$

Question 25

Essay

(Requires Appendix material) In deriving the OLS estimator, you minimize the sum of squared residuals with respect to the two parameters $\hat{\beta}_0$ and $\hat{\beta}_1$. The resulting two equations imply two restrictions that OLS places on the data, namely that $\sum_{i=1}^{n} \hat{u}_i = 0$ and $\sum_{i=1}^{n} \hat{u}_i X_i = 0$. Show that you get the same formula for the regression slope and the intercept if you impose these two conditions on the sample regression function.
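
A sketch of one way to carry out the derivation, assuming the standard sample regression function $\hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i$ with residuals $\hat{u}_i = Y_i - \hat{Y}_i$:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Assumed setup: sample regression function \hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i,
% residuals \hat{u}_i = Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i.

\begin{align*}
\text{Condition 1: } & \sum_{i=1}^{n} \hat{u}_i
  = \sum_{i=1}^{n}\bigl(Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i\bigr) = 0
  \;\Longrightarrow\; \bar{Y} = \hat{\beta}_0 + \hat{\beta}_1 \bar{X}
  \;\Longrightarrow\; \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}. \\[4pt]
\text{Condition 2: } & \sum_{i=1}^{n} \hat{u}_i X_i
  = \sum_{i=1}^{n} X_i\bigl(Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_i\bigr) = 0. \\
\intertext{Substituting $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}$ into Condition 2 gives}
& \sum_{i=1}^{n} X_i\bigl(Y_i - \bar{Y}\bigr)
  = \hat{\beta}_1 \sum_{i=1}^{n} X_i\bigl(X_i - \bar{X}\bigr), \\
\intertext{and since $\sum_i X_i(Y_i-\bar{Y}) = \sum_i (X_i-\bar{X})(Y_i-\bar{Y})$
  and $\sum_i X_i(X_i-\bar{X}) = \sum_i (X_i-\bar{X})^2$,}
& \hat{\beta}_1 = \frac{\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y})}{\sum_{i=1}^{n}(X_i-\bar{X})^2},
  \qquad \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X},
\end{align*}

which are the same formulas obtained by minimizing the sum of squared residuals.

\end{document}
```

The key step is that Condition 1 pins down the intercept in terms of $\hat{\beta}_1$, so substituting it into Condition 2 leaves a single equation whose solution is the familiar OLS slope.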
