Deck 3: Multiple Regression Analysis: Estimation
1
Find the degrees of freedom in a regression model that has 10 observations and 7 independent variables.
A) 17
B) 2
C) 3
D) 4
B
Explanation: The degrees of freedom in a regression model equal the number of observations minus the number of estimated parameters. Since the number of parameters is one more than the number of independent variables, the degrees of freedom here are 10 − (7 + 1) = 2.
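The computation generalizes: with n observations and k independent variables, OLS leaves n − (k + 1) degrees of freedom. A quick Python sketch (my own helper, not part of the deck):

```python
# Degrees of freedom in OLS: observations minus estimated parameters
# (k slope coefficients plus one intercept).
def ols_degrees_of_freedom(n_obs: int, n_indep_vars: int) -> int:
    return n_obs - (n_indep_vars + 1)

print(ols_degrees_of_freedom(10, 7))  # 2
```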
2
If the explained sum of squares is 35 and the total sum of squares is 49, what is the residual sum of squares?
A) 10
B) 12
C) 18
D) 14
D
Explanation: The residual sum of squares is obtained by subtracting the explained sum of squares from the total sum of squares: 49 − 35 = 14.
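The identity behind this card is SST = SSE + SSR, which also yields R² = SSE/SST. A minimal Python check (my own function name, not from the deck):

```python
# Total sum of squares decomposes as SST = SSE (explained) + SSR (residual),
# so SSR = SST - SSE and R-squared = SSE / SST.
def residual_sum_of_squares(sst: float, sse: float) -> float:
    return sst - sse

ssr = residual_sum_of_squares(49, 35)
print(ssr)                 # 14
print(round(35 / 49, 3))   # 0.714, the R-squared for this regression
```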
3
The coefficient of determination (R²) decreases when an independent variable is added to a multiple regression model.
False
Explanation: The coefficient of determination (R²) never decreases when an independent variable is added to a multiple regression model.
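This property can be verified numerically: regress on x1 alone, then add a pure-noise regressor and compare fits. A small NumPy sketch (my own simulation, not part of the deck):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                  # irrelevant: does not enter y
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def r_squared(y, *regressors):
    # OLS via least squares on [1, regressors]; R-squared = 1 - SSR/SST.
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

print(r_squared(y, x1, x2) >= r_squared(y, x1))  # True
```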
4
Which of the following is true of BLUE?
A) It is a rule that can be applied to any one value of the data to produce an estimate.
B) An estimator W is an unbiased estimator of θ if E(W) = θ for any value of θ.
C) An estimator is linear if and only if it can be expressed as a linear function of the data on the dependent variable.
D) It is the best linear uniform estimator.
5
Exclusion of a relevant variable from a multiple linear regression model leads to the problem of _____.
A) misspecification of the model
B) multicollinearity
C) perfect collinearity
D) homoskedasticity
6
The Gauss-Markov theorem will not hold if _____.
A) the error term has the same variance given any values of the explanatory variables
B) the error term has an expected value of zero given any values of the independent variables
C) the independent variables have exact linear relationships among them
D) the regression model relies on the method of random sampling for collection of data
7
Consider the following regression equation: _____. What does _____ imply?
A) _____ measures the ceteris paribus effect of _____ on _____.
B) _____ measures the ceteris paribus effect of _____ on _____.
C) _____ measures the ceteris paribus effect of _____ on _____.
D) _____ measures the ceteris paribus effect of _____ on _____.
8
The term "linear" in a multiple linear regression model means that the equation is linear in parameters.
9
Which of the following is true of R²?
A) R² is also called the standard error of regression.
B) A low R² indicates that the Ordinary Least Squares line fits the data well.
C) R² usually decreases with an increase in the number of independent variables in a regression.
D) R² shows what percentage of the total variation in the dependent variable, Y, is explained by the explanatory variables.
10
High (but not perfect) correlation between two or more independent variables is called _____.
A) heteroskedasticity
B) homoskedasticity
C) multicollinearity
D) micronumerosity
11
If an independent variable in a multiple linear regression model is an exact linear combination of other independent variables, the model suffers from the problem of _____.
A) perfect collinearity
B) homoskedasticity
C) heteroskedasticity
D) omitted variable bias
12
The assumption that there are no exact linear relationships among the independent variables in a multiple linear regression model fails if _____, where n is the sample size and k is the number of parameters.
A) n > 2
B) n = k + 1
C) n > k
D) n < k + 1
13
In econometrics, the general partialling out result is usually called the _____.
A) Gauss-Markov assumption
B) Best linear unbiased estimator
C) Frisch-Waugh theorem
D) Gauss-Markov theorem
14
Suppose the variable x2 has been omitted from the following regression equation: y = β0 + β1x1 + β2x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. The bias in β̃1 is positive if _____.
A) β2 > 0 and x1 and x2 are positively correlated
B) β2 < 0 and x1 and x2 are positively correlated
C) β2 > 0 and x1 and x2 are negatively correlated
D) β2 = 0 and x1 and x2 are negatively correlated
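The sign logic behind this card (the bias in the short-regression slope equals β2 times the slope from regressing x2 on x1) can be checked by simulation. A Monte Carlo sketch with made-up coefficients (my own illustration, not from the deck):

```python
import numpy as np

# True model: y = 1 + 1*x1 + 2*x2 + u, with beta2 = 2 > 0
# and x2 positively correlated with x1.
rng = np.random.default_rng(1)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1.0 + 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# "Short" regression of y on x1 alone, omitting x2.
X = np.column_stack([np.ones(n), x1])
b_short = np.linalg.lstsq(X, y, rcond=None)[0][1]
print(round(b_short, 1))  # about 2.0 = beta1 + beta2 * 0.5, biased upward from the true 1.0
```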
15
In the equation _____, _____ is a(n) _____.
A) independent variable
B) dependent variable
C) slope parameter
D) intercept parameter
16
Suppose the variable x2 has been omitted from the following regression equation: y = β0 + β1x1 + β2x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. The bias in β̃1 is negative if _____.
A) β2 > 0 and x1 and x2 are positively correlated
B) β2 < 0 and x1 and x2 are positively correlated
C) β2 = 0 and x1 and x2 are negatively correlated
D) β2 = 0 and x1 and x2 are negatively correlated
17
The term _____ refers to the problem of small sample size.
A) micronumerosity
B) multicollinearity
C) homoskedasticity
D) heteroskedasticity
18
The key assumption for the general multiple regression model is that all factors in the unobserved error term be correlated with the explanatory variables.
19
Suppose the variable x2 has been omitted from the following regression equation: y = β0 + β1x1 + β2x2 + u. β̃1 is the estimator obtained when x2 is omitted from the equation. If _____, β̃1 is said to _____.
A) have an upward bias
B) have a downward bias
C) be unbiased
D) be biased toward zero
20
The value of R² always _____.
A) lies below 0
B) lies above 1
C) lies between 0 and 1
D) lies between 1 and 1.5
21
An explanatory variable is said to be exogenous if it is correlated with the error term.
22
If two regressions use different sets of observations, then we can tell how the R-squareds will compare, even if one regression uses a subset of regressors.
23
When one randomly samples from a population, the total sample variation in xj decreases without bound as the sample size increases.
24
A larger error variance makes it difficult to estimate the partial effect of any of the independent variables on the dependent variable.