In a Simple Regression with an Intercept and a Single Explanatory Variable

Question 64

Essay

In a simple regression with an intercept and a single explanatory variable, the variation in Y $\left( TSS = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 \right)$ can be decomposed into the explained sum of squares $\left( ESS = \sum_{i=1}^{n} (\hat{Y}_i - \bar{Y})^2 \right)$ and the sum of squared residuals $\left( SSR = \sum_{i=1}^{n} \hat{u}_i^2 = \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2 \right)$ (see, for example, equation (4.35) in the textbook).
Consider any regression line, positively or negatively sloped, in {X, Y} space. Draw a horizontal line where, hypothetically, you consider the sample mean of Y $( = \bar{Y} )$ to be. Next, add a single actual observation of Y.
In this graph, indicate where you find the following distances:
(i) the residual
(ii) the actual value minus the mean of Y
(iii) the fitted value minus the mean of Y
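The decomposition described in the question can also be checked numerically. Below is a minimal sketch in Python (using NumPy with synthetic, hypothetical data, not taken from the textbook) that fits the simple regression by OLS, confirms that TSS = ESS + SSR up to rounding, and prints the three distances for one arbitrarily chosen observation: the residual $(Y_i - \hat{Y}_i)$, the actual value minus the mean $(Y_i - \bar{Y})$, and the fitted value minus the mean $(\hat{Y}_i - \bar{Y})$.

```python
import numpy as np

# Hypothetical data; any positively or negatively sloped sample behaves the same way.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 1.5 * x + rng.normal(scale=2.0, size=50)

# OLS slope and intercept for a simple regression with an intercept.
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# Decomposition of the variation in Y.
tss = np.sum((y - y.mean()) ** 2)       # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
ssr = np.sum((y - y_hat) ** 2)          # sum of squared residuals
print(f"TSS = {tss:.3f}, ESS + SSR = {ess + ssr:.3f}")  # equal up to rounding

# The three distances for a single (arbitrarily chosen) observation i:
i = 7
print("(i)   residual (actual minus fitted):  ", y[i] - y_hat[i])
print("(ii)  actual minus the mean of Y:      ", y[i] - y.mean())
print("(iii) fitted value minus the mean of Y:", y_hat[i] - y.mean())
```

The identity TSS = ESS + SSR holds exactly because, with an intercept included, the OLS residuals sum to zero and are uncorrelated with the fitted values; this is also why, in the graph, the vertical distance from the actual observation to $\bar{Y}$ splits cleanly into the residual plus the fitted-minus-mean segment.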
