
Introduction to Econometrics, 3rd Edition, by James Stock and Mark Watson
Edition 3, ISBN: 978-9352863501
Exercise 1
Consider the regression model without an intercept term,
$$Y_i = \beta_1 X_i + u_i, \qquad i = 1, \dots, n$$
(so the true value of the intercept, $\beta_0$, is zero).

a. Derive the least squares estimator of $\beta_1$ for the restricted regression model $Y_i = \beta_1 X_i + u_i$. This is called the restricted least squares estimator, $\hat{\beta}_1^{RLS}$, of $\beta_1$ because it is estimated under a restriction, which in this case is $\beta_0 = 0$.

b. Derive the asymptotic distribution of $\hat{\beta}_1^{RLS}$ under Assumptions #1 through #3 of Key Concept 17.1.

c. Show that $\hat{\beta}_1^{RLS}$ is linear [Equation (5.24)] and, under Assumptions #1 and #2 of Key Concept 17.1, conditionally unbiased [Equation (5.25)].

d. Derive the conditional variance of $\hat{\beta}_1^{RLS}$ under the Gauss-Markov conditions (Assumptions #1 through #4 of Key Concept 17.1).

e. Compare the conditional variance of $\hat{\beta}_1^{RLS}$ in (d) to the conditional variance of the OLS estimator $\hat{\beta}_1$ (from the regression including an intercept) under the Gauss-Markov conditions. Which estimator is more efficient? Use the formulas for the variances to explain why.

f. Derive the exact sampling distribution of $\hat{\beta}_1^{RLS}$ under Assumptions #1 through #5 of Key Concept 17.1.

g. Now consider the estimator $\bar{\beta}_1 = \bar{Y} / \bar{X}$. Derive an expression for $\operatorname{var}(\bar{\beta}_1 \mid X_1, \dots, X_n)$ under the Gauss-Markov conditions and use it to show that $\operatorname{var}(\bar{\beta}_1 \mid X_1, \dots, X_n) \geq \operatorname{var}(\hat{\beta}_1^{RLS} \mid X_1, \dots, X_n)$.
Explanation

a) The restricted regression model is given by $Y_i = \beta_1 X_i + u_i$, $i = 1, \dots, n$, that is, the model with the restriction $\beta_0 = 0$ imposed. The restricted least squares estimator $\hat{\beta}_1^{RLS}$ is the value of $b_1$ that minimizes the sum of squared residuals $\sum_{i=1}^{n} (Y_i - b_1 X_i)^2$; a sketch of the derivation and of the related variance results is given below.
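The following is a minimal sketch of the standard derivation, using $\hat{\beta}_1^{RLS}$ for the restricted estimator and $\sigma_u^2$ for the homoskedastic error variance; it is an outline of the key steps, not the full worked solution.

Setting the derivative of $\sum_{i=1}^{n}(Y_i - b_1 X_i)^2$ with respect to $b_1$ equal to zero gives the first-order condition $-2\sum_{i=1}^{n} X_i (Y_i - b_1 X_i) = 0$, so
$$\hat{\beta}_1^{RLS} = \frac{\sum_{i=1}^{n} X_i Y_i}{\sum_{i=1}^{n} X_i^2}.$$
Substituting $Y_i = \beta_1 X_i + u_i$ shows that the estimator is linear in $Y_1, \dots, Y_n$ with weights $a_i = X_i / \sum_{j=1}^{n} X_j^2$, and that
$$\hat{\beta}_1^{RLS} - \beta_1 = \frac{\sum_{i=1}^{n} X_i u_i}{\sum_{i=1}^{n} X_i^2},$$
so $E(\hat{\beta}_1^{RLS} \mid X_1, \dots, X_n) = \beta_1$ when $E(u_i \mid X_i) = 0$, and under the Gauss-Markov conditions
$$\operatorname{var}(\hat{\beta}_1^{RLS} \mid X_1, \dots, X_n) = \frac{\sigma_u^2}{\sum_{i=1}^{n} X_i^2}.$$
For the comparison in (e), the OLS estimator from the regression that includes an intercept has conditional variance $\sigma_u^2 / \sum_{i=1}^{n}(X_i - \bar{X})^2$. Because $\sum_{i=1}^{n} X_i^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2 + n\bar{X}^2 \geq \sum_{i=1}^{n}(X_i - \bar{X})^2$, the restricted estimator is (weakly) more efficient when the restriction $\beta_0 = 0$ is true. The same identity gives $\sum_{i=1}^{n} X_i^2 \geq n\bar{X}^2$, which delivers the inequality in (g), since $\operatorname{var}(\bar{Y}/\bar{X} \mid X_1, \dots, X_n) = \sigma_u^2/(n\bar{X}^2)$.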
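As an informal check of the variance comparisons in parts (e) and (g), the short simulation below draws repeated samples from the no-intercept model and compares the empirical sampling variances of the three estimators. It is illustrative only: the parameter values are arbitrary, and it assumes the part (g) estimator is the ratio of sample means $\bar{Y}/\bar{X}$.

```python
import numpy as np

# Illustrative simulation (not from the textbook): compare the empirical
# sampling variances of three estimators of beta_1 in the no-intercept model
#   Y_i = beta_1 * X_i + u_i   (the restriction beta_0 = 0 is true),
# with homoskedastic errors. All parameter values are arbitrary choices.

rng = np.random.default_rng(0)
beta1, sigma_u, n, reps = 2.0, 1.0, 50, 10_000

rls, ols, ratio = [], [], []
for _ in range(reps):
    X = rng.normal(loc=3.0, scale=1.0, size=n)  # keep X centered away from 0
    u = rng.normal(scale=sigma_u, size=n)
    Y = beta1 * X + u

    # (a)/(d): restricted least squares, beta1_hat_RLS = sum(X*Y) / sum(X^2)
    rls.append(np.sum(X * Y) / np.sum(X**2))

    # (e): OLS slope from the regression that also includes an intercept
    ols.append(np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1))

    # (g): ratio-of-means estimator (assumed form), beta1_bar = Ybar / Xbar
    ratio.append(np.mean(Y) / np.mean(X))

print("sampling variance, RLS estimator  :", np.var(rls))
print("sampling variance, OLS w/intercept:", np.var(ols))
print("sampling variance, Ybar/Xbar      :", np.var(ratio))
# Expected ordering: var(RLS) <= var(OLS) and var(RLS) <= var(Ybar/Xbar),
# consistent with sum(X_i^2) >= sum((X_i - Xbar)^2) and sum(X_i^2) >= n*Xbar^2.
```

With $\beta_0 = 0$ true in the data-generating process, the restricted estimator's empirical variance should come out smallest, matching the ranking implied by the variance formulas sketched above.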