
Multiple Regression

Activity 13.1   Justify the last statement.
$ \blacksquare$

Answer 13.1   The equation

$\displaystyle \sum_{i=1}^n \hat{\epsilon}_i = 0
$

may be written

$\displaystyle \sum_{i=1}^n \left[Y_i- A - \sum_{j=1}^pB_jx_{ij}\right]=0
$

which on summing through the bracket gives

$\displaystyle n\bar{Y} -nA -\sum_{i=1}^n\sum_{j=1}^pB_jx_{ij}=0
$

Interchanging the order of the sums,

$\displaystyle n\bar{Y} -nA -\sum_{j=1}^pB_j\sum_{i=1}^nx_{ij}=0
$

which gives

$\displaystyle n\bar{Y} -nA -n\sum_{j=1}^pB_j\bar{x}_j=0.
$

Dividing by $ n$ shows that

$\displaystyle \bar{Y} =A +\sum_{j=1}^pB_j\bar{x}_j.
$

The last equation is the result asked for.
$ \blacksquare$
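As an optional numerical check (not part of the original answer), a least squares fit can be computed on made-up data and the relation $ \bar{Y} =A +\sum_{j=1}^pB_j\bar{x}_j$ verified directly. The sketch below is in Python with NumPy; the data are hypothetical.

    import numpy as np

    # Hypothetical data: n = 6 observations, p = 2 explanatory variables.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(6, 2))
    y = 1.0 + x @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=6)

    # Least squares fit with an intercept column; coef = (A, B_1, B_2).
    X = np.column_stack([np.ones(len(y)), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    A, B = coef[0], coef[1:]

    # The fitted plane passes through the point of means:
    # Y-bar equals A + sum_j B_j x-bar_j, up to rounding error.
    print(np.isclose(y.mean(), A + B @ x.mean(axis=0)))  # True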

Activity 13.2   Prove the sum of squares identity.
$ \blacksquare$

Answer 13.2   For any choice of $ \alpha, \beta_1,\dots ,\beta_p$, we have

$\displaystyle \sum_{i=1}^n(Y_i-\alpha-\beta_1 x_{i1}-\dots-\beta_p x_{ip})^2
=\sum_{i=1}^n\bigl((Y_i-\hat{Y}_i)+(\hat{Y}_i-\alpha-\beta_1 x_{i1}-\dots-\beta_p x_{ip})\bigr)^2
=\sum_{i=1}^n\bigl(\hat{\epsilon}_i+(\hat{Y}_i-\alpha-\beta_1 x_{i1}-\dots-\beta_p x_{ip})\bigr)^2.
$

Now we can square out the two terms in the main bracket and carry the sum through, giving the residual sum of squares, the sum of squares of the second term, and a cross-product term. Since

$\displaystyle \hat{Y}_i-\alpha-\beta_1 x_{i1}-\dots-\beta_p x_{ip}=(A-\alpha)+\sum_{j=1}^p(B_j-\beta_j)x_{ij}
$

has terms which either do not vary with $ i$ or vary with $ i$ only through the presence of $ x_{ij}$, the least squares equations $ \sum_{i=1}^n\hat{\epsilon}_i=0$ and $ \sum_{i=1}^n\hat{\epsilon}_ix_{ij}=0$ give immediately that the cross-product term vanishes.
$ \blacksquare$
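The identity can also be illustrated numerically. The sketch below (hypothetical simulated data, arbitrary $ \alpha, \beta_1, \beta_2$) confirms that the sum of squares about an arbitrary plane equals the residual sum of squares plus the sum of squares of $ \hat{Y}_i-\alpha-\beta_1 x_{i1}-\dots-\beta_p x_{ip}$, i.e. that the cross-product term contributes nothing.

    import numpy as np

    # Hypothetical data: n = 8 observations, p = 2 explanatory variables.
    rng = np.random.default_rng(1)
    x = rng.normal(size=(8, 2))
    y = 2.0 + x @ np.array([1.5, 0.5]) + rng.normal(scale=0.3, size=8)

    # Least squares fit, fitted values and residuals.
    X = np.column_stack([np.ones(len(y)), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    resid = y - fitted

    # An arbitrary (non-least-squares) choice of alpha, beta_1, beta_2.
    other = np.array([0.7, -0.2, 3.0])

    lhs = np.sum((y - X @ other) ** 2)   # sum of squares about the arbitrary plane
    rhs = np.sum(resid ** 2) + np.sum((fitted - X @ other) ** 2)
    print(np.isclose(lhs, rhs))          # True: the cross-product term vanishes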

Activity 13.3   Suppose that $ n=4$ and the values for $ p=3$ explanatory variables are in the columns of the following table.

$ V_1$   $ V_2$   $ V_3$
   1        2        3
   2        3        5
   3        4        7
   4        5        9

Show that the fitted values given by $ V_1+V_2+V_3$ are the same as the fitted values given by $ 0.5V_1+0.5V_2+1.5V_3$.
$ \blacksquare$

Answer 13.3   One should just do the two calculations for each row of the table. For instance, in row 1, $ V_1+V_2+V_3=1+2+3=6=0.5+1+4.5=0.5V_1+0.5V_2+1.5V_3$. There is collinearity here, since $ V_1+V_2=V_3$.
$ \blacksquare$
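The row-by-row arithmetic can be reproduced in a few lines; the sketch below simply encodes the table and compares the two linear combinations.

    import numpy as np

    # The four rows of the table: columns V1, V2, V3.
    V = np.array([[1, 2, 3],
                  [2, 3, 5],
                  [3, 4, 7],
                  [4, 5, 9]], dtype=float)

    combo1 = V @ np.array([1.0, 1.0, 1.0])   # V1 + V2 + V3
    combo2 = V @ np.array([0.5, 0.5, 1.5])   # 0.5 V1 + 0.5 V2 + 1.5 V3

    print(combo1)                            # [ 6. 10. 14. 18.]
    print(np.allclose(combo1, combo2))       # True, because V1 + V2 = V3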

Activity 13.4   If the model fits, then the fitted values and the residuals from the model are independent of each other. What do you expect to see if the model fits when you plot residuals against fitted values?
$ \blacksquare$

Answer 13.4   If the model fits, one would expect to see a random scatter with no particular pattern.
$ \blacksquare$
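For illustration only, the sketch below simulates data from a model that does fit, computes the least squares fitted values and residuals, and plots one against the other; the result is the patternless scatter described above. The data and settings are hypothetical.

    import numpy as np
    import matplotlib.pyplot as plt

    # Simulated data for which the linear model is correct.
    rng = np.random.default_rng(2)
    x = rng.normal(size=(100, 2))
    y = 1.0 + x @ np.array([2.0, -1.0]) + rng.normal(scale=1.0, size=100)

    X = np.column_stack([np.ones(len(y)), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    resid = y - fitted

    # A model that fits gives a patternless horizontal band around zero.
    plt.scatter(fitted, resid)
    plt.axhline(0.0, linestyle="--")
    plt.xlabel("Fitted values")
    plt.ylabel("Residuals")
    plt.show()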


M.Knott 2002-09-12