Discovering Statistics

A den for Learning


JAM 2022 [ 51-60 ]

Consider the linear regression model y_i = \beta_0 + \beta_1 x_i + \epsilon_i , i = 1,2,\dots, 6, where \beta_0 and \beta_1 are unknown parameters and the \epsilon_i's are independent and identically distributed random variables having the N(0,1) distribution. The data on (x_i, y_i) are given in the following table:

x_i:  1.0  2.0  2.5  3.0  3.5  4.5
y_i:  2.0  3.0  3.5  4.2  5.0  5.4

If \hat{\beta_0} and \hat{\beta_1} are the least squares estimates of \beta_0 and \beta_1 , respectively, based on the above data, then \hat{\beta_0} + \hat{\beta_1} equals ______________________ (round off to 2 decimal places).

The following summary quantities can be calculated from the table (the variances and covariance below use the n - 1 divisor):

n = 6 , \\ 
\sum x_i = 16.5 , \sum y_i = 23.1 , \\
\sum x_i^2 = 52.75 , \sum y_i^2 = 97.05 \\
\sum x_i y_i = 71.15 \\
\bar{x} = 2.75 , \bar{y} = 3.85 \\
var(x) = 1.475 , var(y) = 1.623 \\
cov(x,y) = 1.525
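
For readers who want to double-check these numbers, here is a minimal NumPy sketch (the use of NumPy is my own choice, not part of the original solution):

```python
import numpy as np

# Data from the table
x = np.array([1.0, 2.0, 2.5, 3.0, 3.5, 4.5])
y = np.array([2.0, 3.0, 3.5, 4.2, 5.0, 5.4])

print(x.sum(), y.sum())               # 16.5  23.1
print((x**2).sum(), (y**2).sum())     # 52.75  97.05
print((x * y).sum())                  # 71.15
print(x.var(ddof=1), y.var(ddof=1))   # 1.475  1.623  (sample variances, n-1 divisor)
print(np.cov(x, y, ddof=1)[0, 1])     # 1.525  (sample covariance, n-1 divisor)
```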

Note that the regression model may be rewritten in centered form as

(y_i - \bar{y}) = \beta_1 (x_i - \bar{x}) + (\epsilon_i - \bar{\epsilon}) , i = 1,2,\dots, 6, \\
since \\
\bar{y} = \beta_0 + \beta_1 \bar{x} + \bar{\epsilon} .
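
For completeness, here is the standard least squares derivation behind the formulas used below (textbook material, not specific to this problem): minimizing the residual sum of squares S(\beta_0,\beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2 and setting the partial derivatives with respect to \beta_0 and \beta_1 to zero yields

\hat{\beta_1} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2} = \frac{cov(x,y)}{var(x)} , \qquad \hat{\beta_0} = \bar{y} - \hat{\beta_1}\bar{x}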

By the least squares principle, we therefore have

\hat{\beta_1} = \frac{cov(x,y)}{var(x)} = \frac{1.525}{1.475} = 1.034 \\
\hat{\beta_0} = \bar{y} - \hat{\beta_1}\bar{x} = 3.85 - 1.034 \times 2.75 = 1.007 \\
and so \\
\hat{\beta_0} + \hat{\beta_1} = 2.041 \approx 2.04 \text{ (rounded to 2 decimal places)}
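
As a quick sanity check (again using NumPy, which is my own addition rather than part of the original solution), np.polyfit reproduces the same estimates:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.5, 3.0, 3.5, 4.5])
y = np.array([2.0, 3.0, 3.5, 4.2, 5.0, 5.4])

# np.polyfit with deg=1 fits y = beta1*x + beta0 by least squares and
# returns the coefficients highest degree first: [beta1_hat, beta0_hat].
beta1_hat, beta0_hat = np.polyfit(x, y, deg=1)

print(round(beta1_hat, 3))              # approx. 1.034
print(round(beta0_hat, 3))              # approx. 1.007
print(round(beta0_hat + beta1_hat, 2))  # approx. 2.04
```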
