Introduction to Econometrics 3rd Edition by James Stock, Mark Watson

Edition 3, ISBN: 978-9352863501
Exercise 11
This exercise takes up the problem of missing data discussed in Section 9.2. Consider the regression model

\[ Y_i = \beta X_i + u_i, \qquad i = 1, \ldots, n, \]

where all variables are scalars and the constant term/intercept is omitted for convenience.

a. Suppose that the least squares assumptions in Key Concept 4.3 are satisfied. Show that the least squares estimator of β is unbiased and consistent.

b. Now suppose that some of the observations are missing. Let I_i denote a binary random variable that indicates the non-missing observations; that is, I_i = 1 if observation i is not missing and I_i = 0 if observation i is missing. Assume that (X_i, u_i, I_i), i = 1, ..., n, are i.i.d.

i. Show that the OLS estimator can be written as

\[ \hat{\beta} = \frac{\sum_{i=1}^{n} I_i X_i Y_i}{\sum_{i=1}^{n} I_i X_i^{2}} = \beta + \frac{\sum_{i=1}^{n} I_i X_i u_i}{\sum_{i=1}^{n} I_i X_i^{2}} . \]

ii. Suppose that data are "missing completely at random" in the sense that Pr(I_i = 1 | X_i, u_i) = p, where p is a constant. Show that β̂ is unbiased and consistent.

iii. Suppose that the probability that the i-th observation is missing depends on X_i but not on u_i; that is, Pr(I_i = 1 | X_i, u_i) = p(X_i). Show that β̂ is unbiased and consistent.

iv. Suppose that the probability that the i-th observation is missing depends on both X_i and u_i; that is, Pr(I_i = 1 | X_i, u_i) = p(X_i, u_i). Is β̂ unbiased? Is β̂ consistent? Explain.

c. Suppose that β = 1 and that X_i and u_i are mutually independent standard normal random variables [so that both X_i and u_i are distributed N(0, 1)]. Suppose that I_i = 1 when Y_i ≥ 0 but I_i = 0 when Y_i < 0. Is β̂ unbiased? Is β̂ consistent? Explain.
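The following Monte Carlo sketch (a hypothetical illustration, not part of the exercise or the solution below) simulates the estimator from part b(i) under several missing-data mechanisms. The data-generating process follows part c (β = 1, X_i and u_i i.i.d. N(0, 1)); the logistic selection probability used for the b(iii) scenario and the "keep if Y_i ≥ 1" rule used to illustrate b(iv) are choices made here for the simulation, not specifications from the exercise.

# Monte Carlo illustration of Exercise 11, parts b and c (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
beta, n, reps = 1.0, 2000, 500

def beta_hat(x, y, keep):
    # No-intercept OLS computed on the non-missing observations:
    # beta_hat = sum(I*X*Y) / sum(I*X^2)
    return np.sum(keep * x * y) / np.sum(keep * x * x)

results = {
    "b(ii)  MCAR with p = 0.7": [],
    "b(iii) Pr(I=1|X) = logistic(X)": [],
    "b(iv)  keep if Y >= 1 (depends on X and u)": [],
    "c      keep if Y >= 0": [],
}

for _ in range(reps):
    x = rng.standard_normal(n)
    u = rng.standard_normal(n)
    y = beta * x + u
    results["b(ii)  MCAR with p = 0.7"].append(
        beta_hat(x, y, rng.random(n) < 0.7))
    results["b(iii) Pr(I=1|X) = logistic(X)"].append(
        beta_hat(x, y, rng.random(n) < 1.0 / (1.0 + np.exp(-x))))
    results["b(iv)  keep if Y >= 1 (depends on X and u)"].append(
        beta_hat(x, y, y >= 1.0))
    results["c      keep if Y >= 0"].append(
        beta_hat(x, y, y >= 0.0))

for name, vals in results.items():
    print(f"{name}: average beta_hat over {reps} replications = {np.mean(vals):.3f}")

Comparing the average estimates with β = 1 across the four scenarios gives a numerical check on the answers derived analytically in parts b(ii)–b(iv) and c.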
Explanation

a) If the least squares assumptions are ...
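For reference, the standard argument for part a (a sketch of the textbook-style reasoning, not the verified solution excerpted above) runs as follows. With the intercept omitted, the OLS estimator is

\[ \hat{\beta} = \frac{\sum_{i=1}^{n} X_i Y_i}{\sum_{i=1}^{n} X_i^{2}} = \beta + \frac{\sum_{i=1}^{n} X_i u_i}{\sum_{i=1}^{n} X_i^{2}} . \]

Under the assumptions in Key Concept 4.3, i.i.d. sampling and E(u_i | X_i) = 0 imply E(u_i | X_1, ..., X_n) = 0, so

\[ E\bigl(\hat{\beta} - \beta \mid X_1, \ldots, X_n\bigr) = \frac{\sum_{i=1}^{n} X_i \, E(u_i \mid X_1, \ldots, X_n)}{\sum_{i=1}^{n} X_i^{2}} = 0 , \]

and the law of iterated expectations gives E(β̂) = β, so β̂ is unbiased. For consistency, the law of large numbers gives (1/n) Σ X_i u_i →p E(X_i u_i) = 0 and (1/n) Σ X_i² →p E(X_i²) > 0, so β̂ →p β.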
