Deck 11: Further Issues in Using OLS With Time Series Data

Question
Which of the following is a strong assumption for static and finite distributed lag models?

A)Sequential exogeneity
B)Strict exogeneity
C)Dynamic completeness
D)Homoskedasticity
Question
If a process is said to be integrated of order one, or I(1), _____.

A)it is stationary at level
B)averages of such processes already satisfy the standard limit theorems
C)the first difference of the process is weakly dependent
D)it does not have a unit root
Question
A stochastic process {xt: t = 1,2,….} with a finite second moment [E(xt²) < ∞] is covariance stationary if:

A)E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 'h' and not on 't'.
B)E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 't' and not on 'h'.
C)E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 'h' and not on 't'.
D)E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 't' and not on 'h'.
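The definition in option C can be made concrete with a small numerical sketch. For a hypothetical stationary AR(1), xt = ρ·xt−1 + et with |ρ| < 1 (ρ = 0.5 and σ²e = 1 are illustrative choices, not from the card), the autocovariance Cov(xt, xt+h) = ρ^h·σ²e/(1 − ρ²) is the same for every t and depends only on the gap h:

```python
# Autocovariance of a stationary AR(1): x_t = rho * x_{t-1} + e_t, |rho| < 1.
# Cov(x_t, x_{t+h}) = rho**h * sigma2_e / (1 - rho**2) depends only on the
# lag h, never on t -- exactly the covariance stationarity in option C.

def ar1_autocov(h, rho=0.5, sigma2_e=1.0):
    """Cov(x_t, x_{t+h}) for a stationary AR(1); independent of t."""
    return rho ** h * sigma2_e / (1.0 - rho ** 2)

# Lag 0 gives the (constant) variance of the process.
var_x = ar1_autocov(0)    # 1 / (1 - 0.25) = 4/3
gamma_1 = ar1_autocov(1)  # 0.5 * 4/3 = 2/3
```

Note that t never appears as an argument: that absence is the whole point of the definition.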
Question
A process is stationary if:

A)any collection of random variables in a sequence is taken and shifted ahead by h time periods; the joint probability distribution changes.
B)any collection of random variables in a sequence is taken and shifted ahead by h time periods, the joint probability distribution remains unchanged.
C)there is serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
D)there is no serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
Question
Consider the model: yt = β0 + β1zt1 + β2zt2 + ut. Under weak dependence, the condition sufficient for consistency of OLS is:

A)E(zt1|zt2) = 0.
B)E(yt|zt1, zt2) = 0.
C)E(ut|zt1, zt2) = 0.
D)E(ut|zt1, zt2) = _____.
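Contemporaneous exogeneity, E(ut|zt1, zt2) = 0 as in option C, is what delivers consistency here. A hypothetical Monte Carlo sketch (the coefficients, sample size, and i.i.d. normal draws are all illustrative assumptions) shows the OLS estimates landing on the truth when that condition holds:

```python
import numpy as np

# Hypothetical DGP: y_t = 1 + 2*z_t1 - 1*z_t2 + u_t, with u_t drawn
# independently of (z_t1, z_t2), so E(u_t | z_t1, z_t2) = 0 holds.
rng = np.random.default_rng(0)
T = 200_000
z1 = rng.normal(size=T)
z2 = rng.normal(size=T)
u = rng.normal(size=T)
y = 1.0 + 2.0 * z1 - 1.0 * z2 + u

# OLS: regress y on a constant, z1, and z2.
X = np.column_stack([np.ones(T), z1, z2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
# With T this large, beta_hat should sit very close to (1, 2, -1).
```

Rerunning with larger T shrinks the gap further, which is what consistency (as opposed to unbiasedness) is about.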
Question
Which of the following statements is true of dynamically complete models?

A)There is scope for adding more lags to the model to better forecast the dependent variable.
B)The problem of serial correlation does not exist in dynamically complete models.
C)All econometric models are dynamically complete.
D)Sequential endogeneity is implied by dynamic completeness.
Question
Covariance stationarity focuses only on the first two moments of a stochastic process.
Question
In the model yt = β0 + β1xt1 + β2xt2 + ….. + βkxtk + ut, the explanatory variables, xt = (xt1, xt2, …., xtk), are sequentially exogenous if:

A)E(ut|xt, xt-1, ……) = E(ut) = 0, t = 1,2, ….
B)E(ut|xt, xt-1, ……) ≠ E(ut) = 0, t = 1,2, ….
C)E(ut|xt, xt-1, ……) = E(ut) > 0, t = 1,2, ….
D)E(ut|xt, xt-1, ……) = E(ut) = 1, t = 1,2, ….
Question
The model yt = yt-1 + et, t = 1, 2, … represents a(n):

A)AR(2) process.
B)MA(1) process.
C)random walk process.
D)random walk with a drift process.
Question
Which of the following statements is true?

A)A model with a lagged dependent variable cannot satisfy the strict exogeneity assumption.
B)Stationarity is critical for OLS to have its standard asymptotic properties.
C)Efficient static models can be estimated for nonstationary time series.
D)In an autoregressive model, the dependent variable in the current time period varies with the error term of previous time periods.
Question
Suppose ut is the error term for time period 't' in a time series regression model where the explanatory variables are xt = (xt1, xt2, …., xtk). The assumption that the errors are contemporaneously homoskedastic implies that:

A)Var(ut|xt) = _____.
B)Var(ut|xt) = _____.
C)Var(ut|xt) = σ².
D)Var(ut|xt) = _____.
Question
Weakly dependent processes are said to be integrated of order zero.
Question
Which of the following statements is true?

A)A random walk process is stationary.
B)The variance of a random walk process increases as a linear function of time.
C)Adding a drift term to a random walk process makes it stationary.
D)The variance of a random walk process with a drift decreases as an exponential function of time.
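Option B, that Var(yt) grows linearly in t, can be checked with a short simulation sketch (the shock variance, number of paths, and horizons below are illustrative choices): a driftless random walk yt = yt−1 + et with y0 = 0 is just the cumulative sum of i.i.d. shocks, so Var(yt) = t·σ²e.

```python
import numpy as np

# A driftless random walk is the cumulative sum of i.i.d. shocks,
# so Var(y_t) = t * sigma2_e: the variance grows linearly with time.
rng = np.random.default_rng(1)
n_paths, T = 100_000, 50
e = rng.normal(size=(n_paths, T))  # shocks with sigma2_e = 1
y = np.cumsum(e, axis=1)           # y[:, t-1] holds y_t across all paths

var_at_10 = y[:, 9].var()          # should be near 10
var_at_50 = y[:, 49].var()         # should be near 50
```

That unbounded variance is precisely why a random walk cannot be covariance stationary, drift or no drift.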
Question
The model xt = ρ1xt-1 + et, t = 1,2,…., where et is an i.i.d. sequence with zero mean and variance σ²e, represents a(n):

A)moving average process of order one.
B)moving average process of order two.
C)autoregressive process of order one.
D)autoregressive process of order two.
Question
Unit root processes, such as a random walk (with or without drift), are said to be:

A)integrated of order one.
B)integrated of order two.
C)sequentially exogenous.
D)asymptotically uncorrelated.
Question
Covariance stationary sequences where Corr(xt, xt+h) → 0 as h → ∞ are said to be:

A)unit root processes.
B)trend-stationary processes.
C)serially uncorrelated.
D)asymptotically uncorrelated.
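The standard example of option D is a stable AR(1), for which Corr(xt, xt+h) = ρ1^h dies out geometrically (ρ1 = 0.9 below is assumed purely for illustration):

```python
# For a stationary AR(1) x_t = rho1 * x_{t-1} + e_t with |rho1| < 1,
# Corr(x_t, x_{t+h}) = rho1 ** h, which goes to 0 as h -> infinity:
# the process is asymptotically uncorrelated.

def ar1_autocorr(h, rho1=0.9):
    return rho1 ** h

corrs = [ar1_autocorr(h) for h in (1, 10, 100)]
# Even with rho1 as high as 0.9, the correlation at h = 100 is negligible.
```

Contrast this with a unit root (ρ1 = 1), where the correlation between xt and xt+h does not vanish as h grows, so the process is not asymptotically uncorrelated.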
Question
If ut refers to the error term at time 't' and yt-1 refers to the dependent variable at time 't - 1', for an AR(1) process to be homoskedastic, it is required that:

A)Var(ut|yt-1) > Var(yt|yt-1) = σ².
B)Var(ut|yt-1) = Var(yt|yt-1) > σ².
C)Var(ut|yt-1) < Var(yt|yt-1) = σ².
D)Var(ut|yt-1) = Var(yt|yt-1) = σ².
Question
Which of the following is assumed in time series regression?

A)There is no perfect collinearity between the explanatory variables.
B)The explanatory variables are contemporaneously endogenous.
C)The error terms are contemporaneously heteroskedastic.
D)The explanatory variables cannot have temporal ordering.
Question
The model yt = et + α1et-1 + α2et-2, t = 1, 2, ….., where et is an i.i.d. sequence with zero mean and variance σ²e, represents a(n):

A)static model.
B)moving average process of order one.
C)moving average process of order two.
D)autoregressive process of order two.
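An MA(2) like the one in this card is weakly dependent by construction: yt and yt+h share no shocks once h > 2, so the autocovariance is exactly zero beyond lag 2. A small sketch of the textbook formulas (the values of α1, α2, and σ²e are illustrative assumptions):

```python
# Autocovariances of an MA(2): y_t = e_t + a1*e_{t-1} + a2*e_{t-2},
# with e_t i.i.d., mean 0, variance sigma2_e.  Beyond lag 2 the two
# observations share no shocks, so the autocovariance is exactly 0.

def ma2_autocov(h, a1=0.5, a2=0.25, sigma2_e=1.0):
    if h == 0:
        return (1 + a1 ** 2 + a2 ** 2) * sigma2_e
    if h == 1:
        return (a1 + a1 * a2) * sigma2_e
    if h == 2:
        return a2 * sigma2_e
    return 0.0  # no overlapping shocks: weak dependence by construction
```

This hard cutoff at the MA order is what makes finite-order moving averages the canonical examples of weakly dependent series.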
Question
A covariance stationary time series is weakly dependent if:

A)the correlation between the independent variable at time 't' and the dependent variable at time 't + h' goes to ∞ as h → 0.
B)the correlation between the independent variable at time 't' and the dependent variable at time 't + h' goes to 0 as h → ∞.
C)the correlation between the independent variable at time 't' and the independent variable at time 't + h' goes to 0 as h → ∞.
D)the correlation between the independent variable at time 't' and the independent variable at time 't + h' goes to ∞ as h → ∞.
Question
The homoskedasticity assumption in time series regression suggests that the variance of the error term cannot be a function of time.
Question
If a process is a covariance stationary process, then it will have a finite second moment.
Question
Which of the following is true if yt = β0 + β1xt + β2yt-1 + β3xt-1 + ut is a dynamically complete model?

A)E(yt|xt, yt-1, xt-1) = E(yt|xt, yt-1, xt-1,……)
B)E(yt|xt, yt-1, xt-1) = 0
C)E(yt|xt, yt-1, xt-1) = E(yt-1) + E(xt) + E(xt-1)
D)E(yt|xt, yt-1, xt-1) = E(yt)
Question
Under adaptive expectations, the expected current value of a variable does not depend on a recently observed value of the variable.
سؤال
The first difference of an I(1) time series is weakly dependent.
سؤال
The variance of a random walk process decreases as a linear function of time.​
سؤال
If adding one more lag of the dependent variable would explain the dependent variable better, then the model is not dynamically complete.
سؤال
Sequential exogeneity is implied by dynamic completeness.
فتح الحزمة
قم بالتسجيل لفتح البطاقات في هذه المجموعة!
Unlock Deck
Unlock Deck
1/28
auto play flashcards
العب
simple tutorial
ملء الشاشة (f)
exit full mode
Deck 11: Further Issues in Using OLS With Time Series Data
1
Which of the following is a strong assumption for static and finite distributed lag models?

A)Sequential exogeneity
B)Strict exogeneity
C)Dynamic completeness
D)Homoskedasticity
C
2
If a process is said to be integrated of order one, or I(1), _____.

A)it is stationary at level
B)averages of such processes already satisfy the standard limit theorems
C)the first difference of the process is weakly dependent
D)it does not have a unit root
C
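Note: a quick simulation makes answer C concrete. This is an illustrative NumPy sketch (seed and sample size are arbitrary choices): a random walk is I(1), and its first difference is just the i.i.d. innovation sequence, which is weakly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
e = rng.standard_normal(10_000)   # i.i.d. innovations e_t
x = np.cumsum(e)                  # random walk: x_t = x_{t-1} + e_t, an I(1) process

dx = np.diff(x)                   # first difference recovers the innovations

# The level series is highly persistent; its first difference is not.
corr_level = np.corrcoef(x[:-1], x[1:])[0, 1]
corr_diff = np.corrcoef(dx[:-1], dx[1:])[0, 1]
print(corr_level)  # close to 1
print(corr_diff)   # close to 0
```

The lag-1 sample correlation of the level series is near one (strong persistence), while that of the differenced series is near zero.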
3
A stochastic process {xt: t = 1,2,….} with a finite second moment [E(xt2) < ∞] is covariance stationary if:

A)E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 'h' and not on 't'.
B)E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 't' and not on 'h'.
C)E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 'h' and not on 't'.
D)E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on 't' and not on 'h'.
C
4
A process is stationary if:

A)any collection of random variables in a sequence is taken and shifted ahead by h time periods, the joint probability distribution changes.
B)any collection of random variables in a sequence is taken and shifted ahead by h time periods, the joint probability distribution remains unchanged.
C)there is serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
D)there is no serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
5
Consider the model: yt = β0 + β1zt1 + β2zt2 + ut. Under weak dependence, the condition sufficient for consistency of OLS is:

A)E(zt1|zt2) = 0.
B)E(yt |zt1, zt2) = 0.
C)E(ut |zt1, zt2) = 0.
D)E(ut |zt1, zt2) = σ2.
6
Which of the following statements is true of dynamically complete models?

A)There is scope for adding more lags to the model to better forecast the dependent variable.
B)The problem of serial correlation does not exist in dynamically complete models.
C)All econometric models are dynamically complete.
D)Sequential endogeneity is implied by dynamic completeness.
7
Covariance stationarity focuses only on the first two moments of a stochastic process.
8
In the model yt = β0 + β1xt1 + β2xt2 + ….. + βkxtk + ut, the explanatory variables, xt = (xt1, xt2 …., xtk), are sequentially exogenous if:

A)E(ut|xt , xt-1, ……) = E(ut) = 0, t = 1,2, ….
B)E(ut|xt , xt-1, ……) ≠ E(ut) = 0, t = 1,2, ….
C)E(ut|xt , xt-1, ……) = E(ut) > 0, t = 1,2, ….
D)E(ut|xt , xt-1, ……) = E(ut) = 1, t = 1,2, ….
9
The model yt = yt - 1 + et, t = 1, 2, … represents a(n):

A)AR(2) process.
B)MA(1) process.
C)random walk process.
D)random walk with a drift process.
10
Which of the following statements is true?

A)A model with a lagged dependent variable cannot satisfy the strict exogeneity assumption.
B)Stationarity is critical for OLS to have its standard asymptotic properties.
C)Efficient static models can be estimated for nonstationary time series.
D)In an autoregressive model, the dependent variable in the current time period varies with the error term of previous time periods.
11
Suppose ut is the error term for time period 't' in a time series regression model and the explanatory variables are xt = (xt1, xt2 …., xtk). The assumption that the errors are contemporaneously homoskedastic implies that:

A)Var(ut|xt) = ….
B)Var(ut|xt) = ….
C)Var(ut|xt) = σ2.
D)Var(ut|xt) = ….
12
Weakly dependent processes are said to be integrated of order zero.
13
Which of the following statements is true?

A)A random walk process is stationary.
B)The variance of a random walk process increases as a linear function of time.
C)Adding a drift term to a random walk process makes it stationary.
D)The variance of a random walk process with a drift decreases as an exponential function of time.
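Option B is the true statement: for a driftless random walk with i.i.d. innovations of variance σ2, Var(xt) = tσ2, so the variance grows linearly in t. A minimal Monte Carlo sketch (assuming NumPy; the seed, path count, and horizon are arbitrary) checks this:

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, T = 20_000, 100
# Each row is one random walk path x_1..x_T with innovation variance sigma^2 = 1.
paths = np.cumsum(rng.standard_normal((n_paths, T)), axis=1)

var_t = paths.var(axis=0)   # cross-path sample variance at each date t
print(var_t[24])            # approximately 25  (t = 25, sigma^2 = 1)
print(var_t[99])            # approximately 100 (t = 100)
```

The sample variance at date t tracks t itself, confirming the linear growth (and hence nonstationarity) of the random walk.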
14
The model xt = ρ1xt - 1 + et, t = 1,2,…. , where et is an i.i.d. sequence with zero mean and variance σ2e represents a(n):

A)moving average process of order one.
B)moving average process of order two.
C)autoregressive process of order one.
D)autoregressive process of order two.
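The answer is C, an AR(1). For a stable AR(1) with |ρ| < 1, Corr(xt, xt+h) = ρ^h, which dies out geometrically, so the process is weakly dependent. A short simulation sketch (assuming NumPy; ρ = 0.6 and the seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.6, 200_000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):          # AR(1): x_t = rho * x_{t-1} + e_t
    x[t] = rho * x[t - 1] + e[t]

def acorr(x, h):
    """Sample autocorrelation of x at lag h."""
    return np.corrcoef(x[:-h], x[h:])[0, 1]

for h in (1, 2, 5):
    print(h, acorr(x, h), rho**h)   # sample value tracks rho**h
```

Each sample autocorrelation sits close to its theoretical value ρ^h, shrinking geometrically with the lag.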
15
Unit root processes, such as a random walk (with or without drift), are said to be:

A)integrated of order one.
B)integrated of order two.
C)sequentially exogenous.
D)asymptotically uncorrelated.
16
Covariance stationary sequences where Corr(xt, xt+h) → 0 as h → ∞ are said to be:

A)unit root processes.
B)trend-stationary processes.
C)serially uncorrelated.
D)asymptotically uncorrelated.
17
If ut refers to the error term at time 't' and yt - 1 refers to the dependent variable at time 't - 1', for an AR(1) process to be homoskedastic, it is required that:

A)Var(ut|yt - 1) > Var(yt|yt - 1) = σ2.
B)Var(ut|yt - 1) = Var(yt|yt - 1) > σ2.
C)Var(ut|yt - 1) < Var(yt|yt - 1) = σ2.
D)Var(ut|yt - 1) = Var(yt|yt - 1) = σ2.
18
Which of the following is assumed in time series regression?

A)There is no perfect collinearity between the explanatory variables.
B)The explanatory variables are contemporaneously endogenous.
C)The error terms are contemporaneously heteroskedastic.
D)The explanatory variables cannot have temporal ordering.
19
The model yt = et + α1et - 1 + α2et - 2 , t = 1, 2, ….. , where et is an i.i.d. sequence with zero mean and variance σ2e represents a(n):

A)static model.
B)moving average process of order one.
C)moving average process of order two.
D)autoregressive process of order two.
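The answer is C, an MA(2). An MA(2) is weakly dependent in a sharp way: yt involves only et, et-1, and et-2, so its autocorrelations are exactly zero beyond lag 2. A sketch (assuming NumPy; the coefficients α1 = 0.5, α2 = 0.25 and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
e = rng.standard_normal(n + 2)
# MA(2): y_t = e_t + 0.5 * e_{t-1} + 0.25 * e_{t-2}
y = e[2:] + 0.5 * e[1:-1] + 0.25 * e[:-2]

def acorr(y, h):
    """Sample autocorrelation of y at lag h."""
    return np.corrcoef(y[:-h], y[h:])[0, 1]

for h in (1, 2, 3, 10):
    print(h, acorr(y, h))   # lags 3 and 10 are near zero
```

Lags 1 and 2 show nonzero correlation (about 0.48 and 0.19 here, matching the theoretical values implied by the coefficients), while everything past lag 2 is indistinguishable from zero.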
20
A covariance stationary time series is weakly dependent if:

A)the correlation between the independent variable at time 't' and the dependent variable at time 't + h' goes to ∞ as h → 0.
B)the correlation between the independent variable at time 't' and the dependent variable at time 't + h' goes to 0 as h → ∞.
C)the correlation between the independent variable at time 't' and the independent variable at time 't + h' goes to 0 as h → ∞.
D)the correlation between the independent variable at time 't' and the independent variable at time 't + h' goes to ∞ as h → ∞.
21
The homoskedasticity assumption in time series regression suggests that the variance of the error term cannot be a function of time.
22
If a process is a covariance stationary process, then it will have a finite second moment.
23
Which of the following is true if yt = β0 + β1xt + β2yt - 1 + β3xt - 1 + ut is a dynamically complete model?

A)E(yt| xt , yt -1, xt -1) = E(yt| xt , yt -1, xt -1,……)
B)E(yt| xt , yt -1, xt -1) = 0
C)E(yt| xt , yt -1, xt -1) = E(yt -1) + E(xt) + E(xt -1)
D)E(yt| xt , yt -1, xt -1) = E(yt)
24
Under adaptive expectations, the expected current value of a variable does not depend on a recently observed value of the variable.
25
The first difference of an I(1) time series is weakly dependent.
26
The variance of a random walk process decreases as a linear function of time.
27
If adding one more lag of the dependent variable would explain the dependent variable better, then the model is not dynamically complete.
28
Sequential exogeneity is implied by dynamic completeness.