2D-Random Variables

Some Important Theorems

  • A necessary and sufficient condition for X and Y to be independent is that
F(x,y) = H(x)\,G(y) \quad \forall \, x, y

where F(.) is the joint distribution function and H(.) and G(.) are the marginal distribution functions of X and Y respectively. (A quick numerical check of this factorization is sketched below.)

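As a sanity check, here is a minimal numerical sketch of the factorization criterion. Everything in it (NumPy, the Uniform(0,1) samples, the sample size, the test points) is an illustrative choice of mine, not part of the theorem; for independent variables the empirical joint d.f. and the product of the empirical marginals should agree up to sampling error.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(size=n)  # X ~ Uniform(0, 1)
y = rng.uniform(size=n)  # Y ~ Uniform(0, 1), drawn independently of X

def F(a, b):
    """Empirical joint d.f.: P(X <= a, Y <= b)."""
    return np.mean((x <= a) & (y <= b))

def H(a):
    """Empirical marginal d.f. of X."""
    return np.mean(x <= a)

def G(b):
    """Empirical marginal d.f. of Y."""
    return np.mean(y <= b)

for a, b in [(0.3, 0.7), (0.5, 0.5), (0.9, 0.2)]:
    print(f"F({a},{b}) = {F(a, b):.4f}   H({a})G({b}) = {H(a) * G(b):.4f}")

Replacing y with a dependent sample (say y = x) makes the two columns visibly disagree, as the criterion demands.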
  • A necessary and sufficient condition for the independence of the discrete variables X and Y is that
p_{ij} = p_{i0} \, p_{0j} \quad \forall \, i, j
where p_{i0} = \sum_j p_{ij} and p_{0j} = \sum_i p_{ij} are the marginal probabilities.
  • The distribution function F(.) of an absolutely continuous two-dimensional random variable (X,Y) is uniquely determined by its probability density function. Conversely, the probability density function is uniquely determined by the distribution function except perhaps on a set of Lebesgue measure zero.
  • A necessary and sufficient condition for the independence of the absolutely continuous variables X and Y is that
f(x,y) = h(x)\,g(y) \quad \forall \, x, y
where f(.,.) is the joint p.d.f. and h(.) and g(.) are the marginal p.d.f.s of X and Y respectively.
  • If var(X) and var(Y) exist, then cov(X,Y) also exists and
[cov(X,Y)]^2 \leq var(X) \, var(Y)
(This and the next few covariance results are verified numerically in the first sketch after this list.)
  • If either X=c almost everywhere or Y=d almost everywhere (c and d being constants), then
cov(X,Y)=0
  • If U = a+bX and V = c+dY, and if cov(X,Y) exists, then cov(U,V) also exists and
cov(U,V) = bd \, cov(X,Y)
  • If \mu_X, \mu_Y as well as \mu'_{11} = E(XY) exist, then cov(X,Y) too exists, and
cov(X,Y) = \mu'_{11} - \mu_X \mu_Y
  • If var(X) and var(Y) both exist, then var(X+Y) also exists and
var(X+Y) = var(X) + var(Y) + 2 \, cov(X,Y)
  • If X and Y are independent, then
E(Y/X=x) = E(Y) \quad \text{and} \quad var(X/Y=y) = var(X)
  • If E(Y/X=x) exists for almost all values of x, then
E(Y) = E_X \left[ E(Y/X) \right]
  • If E(Y/X) and var(Y/X) exist for almost all values of x, then
var(Y) = E\left(var(Y/X)\right) + var\left(E(Y/X)\right)
(The second sketch after this list checks both conditioning results on a concrete model.)
  • The correlation coefficient necessarily satisfies the inequality
-1 \leq \rho \leq 1
  • The correlation coefficient takes the value -1 or +1 if and only if X and Y are linearly related almost everywhere, i.e. Y = a + bX with probability one for some constants a and b \neq 0.
  • If U = a+bX and V = c+dY with b, d \neq 0, then \rho_{UV} = \pm \rho_{XY}, the sign depending upon whether b and d are of the same sign or of opposite signs.
  • If X and Y are independent random variables, and if \rho is defined, then
\rho = 0
(The converse does not hold in general: uncorrelated variables need not be independent.)
  • If the regression of Y on X is linear, say E(Y/X=x) = \alpha + \beta x, and if \sigma_X^2, \sigma_Y^2 and \rho exist, then the constants \alpha, \beta are given by
\alpha = \mu_Y - \rho \frac{\sigma_Y}{\sigma_X} \mu_X , \quad \beta = \rho \frac{\sigma_Y}{\sigma_X}
  • If the regression of Y on X is linear and var(Y/X=x) does not depend on x, then
var(Y/X=x) = \sigma_Y^2 (1-\rho^2)
(The third sketch after this list recovers both regression results numerically.)
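
First, a minimal sketch checking four of the covariance results above: the Cauchy-Schwarz bound, the bilinearity rule cov(U,V) = bd cov(X,Y), the product-moment identity, and the variance of a sum. The correlated pair and the constants a, b, c, d are illustrative choices of mine; each printed pair of numbers should agree to within sampling error.

import numpy as np

rng = np.random.default_rng(1)
n = 500_000
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
x = z1                    # X ~ N(0, 1)
y = 0.6 * z1 + 0.8 * z2   # Y is correlated with X; var(Y) = 1, cov(X,Y) = 0.6

cov_xy = np.cov(x, y, ddof=0)[0, 1]

# 1. Cauchy-Schwarz: [cov(X,Y)]^2 <= var(X) var(Y)
print(cov_xy**2, "<=", np.var(x) * np.var(y))

# 2. cov(a+bX, c+dY) = b d cov(X,Y)   (a, b, c, d chosen arbitrarily)
a, b, c, d = 2.0, -3.0, 1.0, 4.0
print(np.cov(a + b * x, c + d * y, ddof=0)[0, 1], "vs", b * d * cov_xy)

# 3. cov(X,Y) = mu'_11 - mu_X mu_Y, i.e. E(XY) - E(X)E(Y)
print(cov_xy, "vs", np.mean(x * y) - np.mean(x) * np.mean(y))

# 4. var(X+Y) = var(X) + var(Y) + 2 cov(X,Y)
print(np.var(x + y), "vs", np.var(x) + np.var(y) + 2 * cov_xy)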
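
Second, a sketch of the two conditioning results, E(Y) = E_X[E(Y/X)] and var(Y) = E(var(Y/X)) + var(E(Y/X)), on a concrete two-stage model. The model (X ~ Exp(1), and Y given X = x distributed N(x, 4)) is my own choice for illustration; here E(Y/X) = X and var(Y/X) = 4, so the theorems give E(Y) = 1 and var(Y) = 4 + 1 = 5.

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.exponential(scale=1.0, size=n)  # X ~ Exp(1): E(X) = var(X) = 1
y = rng.normal(loc=x, scale=2.0)        # Y | X = x  ~  N(x, 2^2)

# E(Y/X) = X and var(Y/X) = 4, so the theorems predict
# E(Y) = E(X) = 1 and var(Y) = E(4) + var(X) = 4 + 1 = 5.
print("E(Y)   =", np.mean(y), "  (theory: 1)")
print("var(Y) =", np.var(y), "  (theory: 5)")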
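
Third, a sketch of the two regression results. For a bivariate normal pair the regression of Y on X is exactly linear and var(Y/X=x) is free of x, so both formulas apply; the particular parameter values below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
mu_x, mu_y = 1.0, 2.0
sigma_x, sigma_y, rho = 2.0, 3.0, 0.5

cov = [[sigma_x**2,              rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2             ]]
x, y = rng.multivariate_normal([mu_x, mu_y], cov, size=1_000_000).T

beta = rho * sigma_y / sigma_x   # theoretical slope:     0.75
alpha = mu_y - beta * mu_x       # theoretical intercept: 1.25

# A least-squares fit of y on x should recover (alpha, beta)...
b_hat, a_hat = np.polyfit(x, y, deg=1)
print("alpha:", a_hat, "vs", alpha)
print("beta: ", b_hat, "vs", beta)

# ...and the residual variance should be sigma_Y^2 (1 - rho^2) = 6.75.
resid = y - (a_hat + b_hat * x)
print("var(Y/X=x):", np.var(resid), "vs", sigma_y**2 * (1 - rho**2))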
