Random variables and their properties

Properties of a PMF:

  1. $p(x) \geq 0 \;\forall x$
  2. $\sum_{x \in D} p(x) = 1$
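A quick numeric check of both properties, using a hypothetical fair-die PMF (plain Python; the values are only for illustration):

```python
# Hypothetical PMF of a fair six-sided die.
p = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}

assert all(prob >= 0 for prob in p.values())   # p(x) >= 0 for all x
assert abs(sum(p.values()) - 1) < 1e-12        # sum over the support D equals 1
```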

Expected value $E[X] = \mu$:

  1. $E[X] = \sum x_i p(x_i)$
  2. $E[f(X)] = \sum f(x_i)p(x_i)$
  3. $E[aX+b] = aE[X] + b$
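A sketch of all three identities on a small hypothetical PMF:

```python
# Hypothetical PMF on {0, 1, 2}.
p = {0: 0.2, 1: 0.5, 2: 0.3}
a, b = 3.0, 1.0

E_X   = sum(x * px for x, px in p.items())       # E[X] = sum x_i p(x_i)
E_fX  = sum(x**2 * px for x, px in p.items())    # E[f(X)] with f(x) = x^2
E_aXb = sum((a*x + b) * px for x, px in p.items())

print(E_X, E_fX)
assert abs(E_aXb - (a * E_X + b)) < 1e-12        # linearity: E[aX+b] = aE[X] + b
```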

Variance Var[X]:

  1. $ Var[X] = E[(X-\mu)^2 ] = E[(X-E[X])^2] = \sigma ^2$
  2. $Var[X] = E[X^2] - E^2[X] = E[X^2] - \mu^2$
  3. $Var[X] = Var[E[X/Y]] + E[Var[X/Y]]$ (law of total variance)
  4. $Var[aX+b] = a^2Var[X]$
  5. $Var[aX+bY+c] = a^2Var[X] + b^2Var[Y] + 2abCov[X,Y]$
  6. If X and Y are independent random variables: $Var[XY] = Var[X]Var[Y] + Var[X]\,E^{2}[Y] + Var[Y]\,E^{2}[X]$
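A Monte Carlo sketch of identities 1, 2, 4, and 6 (NumPy assumed; the distributions and seed are arbitrary choices, and the product formula is checked only up to sampling error):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(2.0, 1.5, size=1_000_000)   # independent samples
Y = rng.normal(-1.0, 0.5, size=1_000_000)

# Var[X] = E[X^2] - E^2[X]
assert abs(X.var() - ((X**2).mean() - X.mean()**2)) < 1e-8

# Var[aX + b] = a^2 Var[X]
a, b = 3.0, 7.0
assert abs((a*X + b).var() - a**2 * X.var()) < 1e-6

# Independent X, Y: Var[XY] = Var[X]Var[Y] + Var[X]E^2[Y] + Var[Y]E^2[X]
lhs = (X * Y).var()
rhs = X.var()*Y.var() + X.var()*Y.mean()**2 + Y.var()*X.mean()**2
print(lhs, rhs)   # agree up to Monte Carlo error
```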

If X and Y are independent r.v’s:

  1. $f_{X,Y}(x,y) = f_X(x). f_Y(y)$
  2. $f_{X/Y}(x/y) = f_X(x)$
  3. $E[g(X)\,h(Y)] = E[g(X)]E[h(Y)]$ for any suitably integrable functions g and h.
  4. $M_{X+Y}(t) = M_X(t)M_Y(t)$
  5. $ Cov[X,Y]  = 0 $
  6. $Var(X+Y) = Var(X) + Var(Y) = Var(X-Y)$
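Simulated check of properties 3, 5, and 6 with independently drawn samples (distributions chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.exponential(2.0, n)   # independent of Y by construction
Y = rng.normal(0.0, 3.0, n)

print((X * Y).mean(), X.mean() * Y.mean())   # E[XY] ~ E[X]E[Y]
print(np.cov(X, Y)[0, 1])                    # Cov[X,Y] ~ 0
print((X + Y).var(), (X - Y).var(),          # both ~ Var(X) + Var(Y)
      X.var() + Y.var())
```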

Joint distributions:

  • Let X and Y be two discrete r.v’s with a joint p.m.f $f_{X,Y}(x,y) = P(X = x, Y = y)$. Note that the mass functions

– $f_{X}(x) = P(X=x)  = \sum_{y}f_{X,Y}(x,y)$ and

– $f_{Y}(y) = P(Y=y) = \sum_{x}f_{X,Y}(x,y)$

are the marginal distributions of X and Y, respectively.

  • If $f_Y(y) \neq 0$, the conditional p.m.f of X/Y=y is given by

$f_{X/Y}(x/y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}$

  • $E[X/Y=y] =  \sum_{x}xf_{X/Y}(x/y)$ and more generally

$E[g(X)/Y=y] =  \sum_{x}g(x)f_{X/Y}(x/y)$

  • $Var[X/Y=y] = E[X^2/Y=y] - E^2[X/Y=y]$
  • Note that $E[X/Y]$ is a random variable while $E[X/Y=y]$ is a number.
  • $E[E[X/Y]] = E[X]$   [Applies for any two r.v’s X and Y]
  • $E[E[g(X)/Y]] = E[g(X)] $
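A sketch with a hypothetical 3x2 joint PMF: marginals, the conditional PMF, and the tower property E[E[X/Y]] = E[X]:

```python
import numpy as np

# Hypothetical joint PMF f(x, y); rows index x in {0,1,2}, columns index y in {0,1}.
f = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])
xs = np.array([0, 1, 2])

fX = f.sum(axis=1)             # marginal of X: sum over y
fY = f.sum(axis=0)             # marginal of Y: sum over x

f_X_given_Y = f / fY           # column y holds f(x/y) = f(x,y) / fY(y)

E_X = xs @ fX
E_X_given_Y = xs @ f_X_given_Y               # one number per y: E[X/Y=y]
assert abs(E_X_given_Y @ fY - E_X) < 1e-12   # E[E[X/Y]] = E[X]
```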

Covariance Cov[X,Y] and correlation coefficient:

  1. $Cov[X,Y] = E[XY] - E[X]E[Y]$
  2. $Cov[aX+bY+c, dZ+eW+f] = ad\,Cov[X,Z] + ae\,Cov[X,W] + bd\,Cov[Y,Z] + be\,Cov[Y,W]$
  3. Correlation coefficient $\rho(X,Y) = \frac{Cov[X,Y]}{\sigma_X\sigma_Y}$
  4. $-1 \leq \rho(X,Y) \leq 1$
  5. Coefficient of variation = $\frac{\sigma}{\mu}$
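Numeric sketch: a deliberately correlated pair, with $\rho$ computed from the covariance formula and cross-checked against np.corrcoef:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
X = rng.normal(size=n)
Y = 0.8 * X + rng.normal(scale=0.6, size=n)   # correlated with X by construction

cov = (X * Y).mean() - X.mean() * Y.mean()    # Cov[X,Y] = E[XY] - E[X]E[Y]
rho = cov / (X.std() * Y.std())
print(rho, np.corrcoef(X, Y)[0, 1])           # agree; always within [-1, 1]
```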

Inequalities:

Chebyshev’s inequality:

$Var(X) \geq c^2\, P(|X-\mu| \geq c)$, equivalently $P(|X-\mu| \geq c) \leq \frac{Var(X)}{c^2}$
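Monte Carlo sketch of the bound for an exponential r.v. (mean 1, variance 1), checked at a few values of c:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.exponential(1.0, 1_000_000)
mu, var = X.mean(), X.var()

for c in (1.0, 2.0, 3.0):
    tail = (np.abs(X - mu) >= c).mean()              # estimate of P(|X - mu| >= c)
    print(c, tail, var / c**2, tail <= var / c**2)   # bound holds each time
```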


Moment Functions:

  1. $ E[X^k] = k^{th}$ moment of X around 0
  2. $ E[(X-\mu)^k] = k^{th}$ moment of X about the mean $\mu$ = $ k^{th}$ central moment of X
  3. $E[(\frac{X-\mu}{\sigma})^3]$ = measure of lack of symmetry = skewness
    • Positively skewed => $E[(\frac{X-\mu}{\sigma})^3] > 0$  => Skewed to the right
    • Negatively skewed => $E[(\frac{X-\mu}{\sigma})^3] < 0$ => Skewed to the left
    • Symmetric distribution => $E[(\frac{X-\mu}{\sigma})^3] = 0$
  4. $E[(\frac{X-\mu}{\sigma})^4]$ = measure of peakedness = kurtosis
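Sample skewness and kurtosis for a right-skewed and a symmetric distribution (a sketch; the distributions are chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

def standardize(s):
    return (s - s.mean()) / s.std()   # (X - mu) / sigma

expo   = standardize(rng.exponential(1.0, 1_000_000))   # right-skewed
normal = standardize(rng.normal(0.0, 1.0, 1_000_000))   # symmetric

print((expo**3).mean())     # ~ 2 > 0: positively skewed, skewed to the right
print((normal**3).mean())   # ~ 0: symmetric
print((normal**4).mean())   # ~ 3: kurtosis of a normal distribution
```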

Moment Generating Functions:

  1. $M_X(t) = E[e^{tX}] = \sum_x e^{tx}p(x)$ (discrete case) or $\int_{-\infty}^{\infty} e^{tx}f(x)\,dx$ (continuous case)
  2. MGF is a function of t
  3. ${M}'(t) = E[Xe^{tX}]$
  4. $M(0) = 1$
  5. ${M}'(0) = E[X]$, ${M}''(0) = E[X^2]$, …
  6. $M^{(n)}(t) = E[X^n e^{tX}]$
  7. If X has MGF $M_X(t)$ and $Y = aX + b$, then $M_Y(t) = e^{bt}M_X(at)$
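A sketch using a hypothetical two-point PMF: M(0) = 1, the derivative properties checked by finite differences, and the Y = aX + b rule:

```python
import numpy as np

p = {0: 0.7, 1: 0.3}   # hypothetical PMF

def M(t):
    return sum(np.exp(t * x) * px for x, px in p.items())   # M(t) = sum e^{tx} p(x)

E_X  = sum(x * px for x, px in p.items())
E_X2 = sum(x**2 * px for x, px in p.items())

h = 1e-5
assert abs(M(0.0) - 1.0) < 1e-12                 # M(0) = 1
print((M(h) - M(-h)) / (2*h), E_X)               # M'(0)  ~ E[X]
print((M(h) - 2*M(0.0) + M(-h)) / h**2, E_X2)    # M''(0) ~ E[X^2]

# Y = aX + b  =>  M_Y(t) = e^{bt} M_X(at)
a, b, t = 2.0, 5.0, 0.4
M_Y = sum(np.exp(t * (a*x + b)) * px for x, px in p.items())
assert abs(M_Y - np.exp(b*t) * M(a*t)) < 1e-9
```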

Variance and Covariance:

  1. $Var(X\pm Y) = Var(X)+Var(Y)\pm 2Cov(X,Y)$
  2. $Var(X+Y) = Var(X-Y)$ if X and Y are independent
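Simulated check with a deliberately correlated pair, so the 2Cov(X,Y) term is visible and Var(X+Y) differs from Var(X-Y):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
X = rng.normal(size=n)
Y = 0.5 * X + rng.normal(size=n)   # correlated with X by construction

cov = ((X - X.mean()) * (Y - Y.mean())).mean()
print((X + Y).var(), X.var() + Y.var() + 2*cov)   # agree
print((X - Y).var(), X.var() + Y.var() - 2*cov)   # agree, and differ from above
```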