Properties of PMF:
- $ p(x) \geq 0 \;\; \forall x $
- $ \sum_{x \in D} p(x) = 1 $
Expected value $E[X] = \mu$:
- $ E[X] = \sum x_ip(x_i)$
- $ E[f(x)] = \sum f(x_i)p(x_i)$
- $E[aX+b] = aE[X] + b$
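The expectation formulas above can be checked numerically. This is a minimal sketch: the fair-die PMF and the `expectation` helper are illustrative assumptions, not part of the notes.

```python
# Sketch: E[X], E[f(X)], and linearity E[aX+b] = a E[X] + b for a fair die.
# The die PMF is a made-up example for illustration.

def expectation(pmf, f=lambda x: x):
    """E[f(X)] = sum of f(x_i) * p(x_i) over a PMF given as {x: p(x)}."""
    return sum(f(x) * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}       # fair six-sided die

mu = expectation(die)                        # E[X] = 3.5
lhs = expectation(die, lambda x: 2 * x + 1)  # E[2X + 1]
rhs = 2 * mu + 1                             # 2 E[X] + 1, should match lhs
```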
Variance Var[x]:
- $ Var[X] = E[(X-\mu)^2 ] = E[(X-E[X])^2] = \sigma ^2$
- $ Var[X] = E[X^2] - E^2[X] = E[X^2] - \mu^2$
- $ Var[X] = Var[E[X/Y]] + E[Var[X/Y]]$ (law of total variance)
- $Var[aX+b] = a^2Var[X]$
- $Var[aX+bY+c] = a^2Var[X] + b^2Var[Y] + 2abCov[X,Y]$
- If X and Y are independent random variables: $Var[XY] = Var[X]Var[Y] + Var[X]E^{2}[Y] + Var[Y]E^{2}[X]$
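The variance identities above, including the product formula for independent variables, can be verified on small PMFs. The two PMFs below are arbitrary illustrative choices.

```python
# Sketch: check Var[aX+b] = a^2 Var[X] and, for independent X and Y,
# Var[XY] = Var[X]Var[Y] + Var[X]E^2[Y] + Var[Y]E^2[X].

def E(pmf, f=lambda x: x):
    return sum(f(x) * p for x, p in pmf.items())

def Var(pmf):
    mu = E(pmf)
    return E(pmf, lambda x: (x - mu) ** 2)

X = {0: 0.2, 1: 0.5, 2: 0.3}   # made-up PMF
Y = {1: 0.4, 3: 0.6}           # made-up PMF

# Var[aX+b] = a^2 Var[X]: transform each outcome x -> a*x + b
a, b = 3, 7
aXb = {a * x + b: p for x, p in X.items()}
assert abs(Var(aXb) - a ** 2 * Var(X)) < 1e-12

# PMF of the product XY under independence: f(x,y) = fX(x) fY(y)
XY = {}
for x, px in X.items():
    for y, py in Y.items():
        XY[x * y] = XY.get(x * y, 0.0) + px * py

lhs = Var(XY)
rhs = Var(X) * Var(Y) + Var(X) * E(Y) ** 2 + Var(Y) * E(X) ** 2
```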
If X and Y are independent r.v’s:
- $f_{X,Y}(x,y) = f_X(x). f_Y(y)$
- $f_{X/Y}(x/y) = f_X(x)$
- $E[g(X). h(Y)] = E[g(X)].E[h(Y)]$ For any suitably integrable functions g and h.
- $M_{X+Y}(t) = M_X(t)M_Y(t)$
- $ Cov[X,Y] = 0 $
- $Var(X+Y) = Var(X) + Var(Y) = Var(X-Y)$
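The independence consequences listed above are easy to confirm by building the joint PMF as the product of the marginals. The two marginal PMFs here are arbitrary examples.

```python
# Sketch: for independent X and Y, Cov[X,Y] = 0 and
# Var(X+Y) = Var(X) + Var(Y) = Var(X-Y).

X = {0: 0.3, 1: 0.7}      # made-up PMF
Y = {-1: 0.5, 2: 0.5}     # made-up PMF

# Joint PMF under independence: f(x,y) = fX(x) * fY(y)
joint = {(x, y): px * py for x, px in X.items() for y, py in Y.items()}

def Ej(f):
    """E[f(X,Y)] over the joint PMF."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

def Vj(f):
    """Var[f(X,Y)] over the joint PMF."""
    return Ej(lambda x, y: f(x, y) ** 2) - Ej(f) ** 2

cov = Ej(lambda x, y: x * y) - Ej(lambda x, y: x) * Ej(lambda x, y: y)
vX = Vj(lambda x, y: x)
vY = Vj(lambda x, y: y)
v_sum = Vj(lambda x, y: x + y)
v_diff = Vj(lambda x, y: x - y)
```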
Joint distributions:
- Let X and Y be two discrete r.v's with a joint p.m.f $f_{X,Y}(x,y) = P(X = x, Y = y)$. The mass functions
  - $f_{X}(x) = P(X=x) = \sum_{y}f_{X,Y}(x,y)$ and
  - $f_{Y}(y) = P(Y=y) = \sum_{x}f_{X,Y}(x,y)$
  are the marginal distributions of X and Y respectively.
- If $f_Y(y) \neq 0$, the conditional p.m.f of X given Y=y is
$f_{X/Y}(x/y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}$
- $E[X/Y=y] = \sum_{x}xf_{X/Y}(x/y)$ and more generally
$E[g(X)/Y=y] = \sum_{x}g(x)f_{X/Y}(x/y)$
- $Var[X/Y=y] = E[X^2/Y=y] - E^2[X/Y=y]$
- Note that $E[X/Y]$ is a random variable while $E[X/Y=y]$ is a number.
- $E[E[X/Y]] = E[X]$ [Applies for any two r.v’s X and Y]
- $E[E[g(X)/Y]] = E[g(X)] $
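The marginals, the conditional p.m.f, and the tower rule $E[E[X/Y]] = E[X]$ can all be computed directly from a joint table. The joint PMF below is a made-up example whose entries simply sum to 1.

```python
# Sketch: marginals, conditional expectation E[X | Y = y], and the
# tower rule E[E[X|Y]] = E[X] on a small made-up joint PMF.

joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginals: sum the joint PMF over the other variable
fY = {}
for (x, y), p in joint.items():
    fY[y] = fY.get(y, 0.0) + p

def cond_E_X_given(y):
    """E[X | Y = y] = sum_x x * f(x,y) / fY(y)."""
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / fY[y]

EX = sum(x * p for (x, y), p in joint.items())          # E[X]
tower = sum(cond_E_X_given(y) * fY[y] for y in fY)      # E[E[X|Y]]
```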
Covariance Cov[X,Y] and correlation coefficient:
- $Cov[X,Y] = E[XY] - E[X]E[Y]$
- $ Cov[aX+bY+c, dZ+eW+f] = adCov[X,Z] + aeCov[X,W] + bdCov[Y,Z] + beCov[Y,W]$
- Correlation coefficient $\rho(X,Y) = \frac{Cov[X,Y]}{\sigma_X\sigma_Y}$
- $-1 \leq \rho (X,Y) \leq 1 $
- Coefficient of variation = $\frac{\sigma}{\mu}$
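Covariance and the correlation coefficient can be read straight off a joint PMF; the table below is an arbitrary illustrative example with positively associated X and Y.

```python
import math

# Sketch: Cov[X,Y] = E[XY] - E[X]E[Y] and rho = Cov / (sigma_X * sigma_Y)
# on a made-up joint PMF.

joint = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

def E(f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY
sigma_x = math.sqrt(E(lambda x, y: x ** 2) - EX ** 2)
sigma_y = math.sqrt(E(lambda x, y: y ** 2) - EY ** 2)
rho = cov / (sigma_x * sigma_y)   # lies in [-1, 1]
```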
Inequalities:
Chebyshev's:
$P(\left| X-\mu \right| \geq c) \leq \frac{Var(X)}{c^2}$
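A quick simulation illustrates the bound. The uniform draws, seed, and threshold are all arbitrary choices for this sketch.

```python
import random
import statistics

# Sketch: empirically check Chebyshev, P(|X - mu| >= c) <= Var(X) / c^2,
# on simulated Uniform(0, 1) draws. Seed and c are arbitrary.
random.seed(0)
xs = [random.uniform(0, 1) for _ in range(100_000)]

mu = statistics.fmean(xs)
var = statistics.pvariance(xs)

c = 0.4
tail = sum(abs(x - mu) >= c for x in xs) / len(xs)  # empirical tail prob.
bound = var / c ** 2                                # Chebyshev upper bound
```

For Uniform(0, 1) the true tail probability at $c = 0.4$ is 0.2, while the bound is roughly 0.52, which shows the inequality is valid but often loose.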
Moment Functions:
- $ E[X^k] = k^{th}$ moment of X around 0
- $ E[(X-\mu)^k] = k^{th}$ moment of X about the mean $\mu$ = $ k^{th}$ central moment of X
- $E[(\frac{X-\mu}{\sigma})^3]$ = measure of lack of symmetry = skewness
- Positively skewed => $E[(\frac{X-\mu}{\sigma})^3] > 0$ => Skewed to the right
- Negatively skewed => $E[(\frac{X-\mu}{\sigma})^3] < 0$ => Skewed to the left
- Symmetric distribution => $E[(\frac{X-\mu}{\sigma})^3] = 0$
- $E[(\frac{X-\mu}{\sigma})^4]$ = measure of peakedness = kurtosis
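The central-moment definitions of skewness and kurtosis translate directly into sums over a PMF. The long-right-tail PMF below is an invented example chosen to come out positively skewed.

```python
# Sketch: standardized third and fourth moments (skewness, kurtosis)
# of a made-up right-skewed PMF.

def E(pmf, f=lambda x: x):
    return sum(f(x) * p for x, p in pmf.items())

pmf = {0: 0.6, 1: 0.3, 10: 0.1}   # long right tail -> positive skew

mu = E(pmf)
sigma = E(pmf, lambda x: (x - mu) ** 2) ** 0.5

skew = E(pmf, lambda x: ((x - mu) / sigma) ** 3)
kurt = E(pmf, lambda x: ((x - mu) / sigma) ** 4)
```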
Moment Generating Functions:
- $ M_X(t) = E[e^{tX}] = \sum_{x} e^{tx}p(x) $ (discrete) $ = \int_{-\infty }^{\infty }e^{tx}f(x)\,dx $ (continuous)
- MGF is a function of t
- $M'(t) = E[Xe^{tX}]$
- $M(0) = 1$
- $M'(0) = E[X]$, $M''(0) = E[X^2]$, ...
- $M^{(n)}(t) = E[X^n e^{tX}]$, so $M^{(n)}(0) = E[X^n]$
- If X has MGF $M_X(t)$ and $Y = aX+b$, then Y has MGF $M_Y(t) = e^{bt}M_X(at)$
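The facts $M(0) = 1$ and $M'(0) = E[X]$ can be checked numerically, approximating the derivative at 0 with a central finite difference. The PMF and step size are illustrative choices.

```python
import math

# Sketch: MGF of a small made-up PMF; check M(0) = 1 and M'(0) ~ E[X]
# via a central finite difference.

pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def M(t):
    """M_X(t) = E[e^{tX}] = sum over x of e^{tx} p(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

EX = sum(x * p for x, p in pmf.items())   # E[X] = 1.0 for this PMF

h = 1e-6
deriv0 = (M(h) - M(-h)) / (2 * h)         # approximates M'(0) = E[X]
```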
Variance and Covariance:
- $Var(X\pm Y) = Var(X)+Var(Y)\pm 2Cov(X,Y)$
- $Var(X+Y) = Var(X-Y)$ if X and Y are independent