Expectation

Definition (Expectation of R.V.): The expectation (or mean) of a discrete r.v. $X$ with distinct possible values $x_1, x_2, \dots$ is $E(X) = \sum_j x_j P(X = x_j)$.

Theorem (Monotonicity): Let $X$ and $Y$ be r.v.s such that $X \geq Y$ with probability 1.
Then $E(X)\geq E(Y)$.

Theorem (Expectation via Survival Function): Let $X$ be a nonnegative integer-valued r.v. Let $F$ be the CDF of $X$, and define the survival function of $X$ as $G(x) = 1-F(x) = P(X>x)$. Then
$$E(X) = \sum_{n=0}^\infty G(n).$$

Theorem (Law Of The Unconscious Statistician (LOTUS)): If $X$ is a discrete r.v. and $g$ is a function from $\mathbb{R}$ to $\mathbb{R}$, then
$$E(g(X)) = \sum_x g(x) P(X = x),$$
where the sum is over all possible values $x$ of $X$.
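As a numeric illustration (an added sketch, not from the notes), the following Python snippet applies LOTUS to a fair six-sided die, an assumed example distribution: $E(X^2)$ is computed directly from the PMF of $X$, without first finding the distribution of $X^2$.

```python
# LOTUS sketch: E(g(X)) = sum over x of g(x) * P(X = x).
# Example r.v. (assumed for illustration): X is a fair six-sided die roll.

support = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in support}  # P(X = x) = 1/6 for each face

def expectation(g, pmf):
    """E(g(X)) via LOTUS: no need for the distribution of g(X)."""
    return sum(g(x) * p for x, p in pmf.items())

print(expectation(lambda x: x, pmf))       # E(X)   = 3.5
print(expectation(lambda x: x ** 2, pmf))  # E(X^2) = 91/6 ≈ 15.1667
```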

Properties of Expectation

  • $E(X+Y) = E(X) + E(Y)$ (linearity; this holds even when $X$ and $Y$ are dependent)
  • $E(cX) = c E(X)$
  • If $X$ and $Y$ are independent, $E(XY) = E(X) E(Y)$ (all three properties are checked numerically in the sketch after this list)
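A minimal simulation sketch of these three properties (illustrative only; the choice of distributions for $X$ and $Y$ is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10 ** 6

# Two independent discrete r.v.s (the distributions are arbitrary examples).
X = rng.integers(1, 7, size=n)      # fair die, E(X) = 3.5
Y = rng.binomial(10, 0.3, size=n)   # Bin(10, 0.3), E(Y) = 3

print(np.mean(X + Y), np.mean(X) + np.mean(Y))   # linearity: both ~6.5
print(np.mean(5 * X), 5 * np.mean(X))            # scaling: both ~17.5
print(np.mean(X * Y), np.mean(X) * np.mean(Y))   # independence: both ~10.5
```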

Inequalities of Expectation


Variance

Definition (Variance and Standard Deviation): The variance of an r.v. $X$ is
$$Var(X) = E\big((X - E(X))^2\big).$$

The square root of the variance is the standard deviation (SD):
$$SD(X) = \sqrt{Var(X)}.$$

Properties of Variance

  • For any r.v. $X$, $Var(X) = E(X^2) - (EX)^2$ (see the derivation after this list)
  • $Var(X + c ) = Var(X)$
  • $Var(c X ) = c^2Var(X)$
  • If $X$ and $Y$ are independent, then $Var(X+Y) = Var(X) + Var(Y)$
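The first identity is worth a short derivation (standard, added for completeness): expand the square and apply linearity, treating $EX$ as a constant.
$$Var(X) = E\big((X-EX)^2\big) = E\big(X^2 - 2X\,EX + (EX)^2\big) = E(X^2) - 2(EX)^2 + (EX)^2 = E(X^2) - (EX)^2.$$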

Geometric and Negative Binomial

Definition (Geometric Distribution): Consider a sequence of independent Bernoulli trials, each with the same success probability $p\in (0,1)$, performed until a success occurs. Let $X$ be the number of failures before the first successful trial. Then $X$ has the Geometric distribution with parameter $p$, denoted by $X\sim Geom(p)$.

Theorem (Geometric PMF): If $X\sim Geom(p)$, then the PMF of $X$ is
$$P(X = k) = q^k p, \quad k = 0, 1, 2, \dots,$$
where $q = 1-p$.
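As a one-line sanity check (added here), this is a valid PMF because the geometric series sums to 1:
$$\sum_{k=0}^\infty q^k p = p \cdot \frac{1}{1-q} = \frac{p}{p} = 1.$$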

Theorem (Memoryless Property): If $X\sim Geom(p)$, then for any nonnegative integers $n$ and $k$,
$$P(X \geq n + k \mid X \geq n) = P(X \geq k).$$
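A two-line proof sketch (standard; it uses $P(X \geq k) = q^k$, which holds because $X \geq k$ exactly when the first $k$ trials are all failures):
$$P(X \geq n+k \mid X \geq n) = \frac{P(X \geq n+k)}{P(X \geq n)} = \frac{q^{n+k}}{q^n} = q^k = P(X \geq k).$$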

Definition (First Success Distribution): In the same Bernoulli-trials setup as the Geometric, let $Y$ be the number of trials up to and including the first success, so that $Y = X + 1$ where $X\sim Geom(p)$ counts the failures. Then $Y$ has the First Success distribution, denoted by $Y\sim FS(p)$.

Definition (Negative Binomial Distribution): In a sequence of independent Bernoulli trials with success probability $p$, if $X$ is the number of failures before the $r^{th}$ success, then $X$ has the Negative Binomial distribution with parameters $r$ and $p$, denoted by $X\sim NBin(r, p)$.

Theorem (Negative Binomial PMF): If $X\sim NBin(r,p)$, then the PMF of $X$ is
$$P(X = n) = \binom{n+r-1}{r-1} p^r q^n, \quad n = 0, 1, 2, \dots,$$
where $q = 1-p$.
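A quick numeric cross-check of this formula (an added sketch; scipy.stats.nbinom uses the same failures-before-the-$r$th-success convention, so the two columns should agree):

```python
from math import comb
from scipy.stats import nbinom

r, p = 3, 0.4            # arbitrary example parameters
q = 1 - p

for n in range(6):
    manual = comb(n + r - 1, r - 1) * p ** r * q ** n  # PMF from the theorem
    library = nbinom.pmf(n, r, p)                      # SciPy's nbinom PMF
    print(n, round(manual, 6), round(library, 6))      # the two values match
```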

Theorem (Geometric & Negative Binomial): Let $X\sim NBin(r,p)$, and let $X_1, \dots, X_r$ be i.i.d. $Geom(p)$. Then
$$X = X_1+\dotsb + X_r,$$
that is, $X$ has the same distribution as a sum of $r$ i.i.d. Geometrics (each $X_i$ counts the failures between consecutive successes).
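One payoff of this representation (an added note; it uses the standard Geometric mean $E(X_i) = q/p$, which is not derived in these notes): linearity immediately gives the Negative Binomial mean,
$$E(X) = E(X_1) + \dotsb + E(X_r) = \frac{rq}{p}.$$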


Indicator R.V.

Definition (Indicator R.V.): The indicator r.v. of an event $A$ is the r.v. $I_A$ that equals $1$ if $A$ occurs and $0$ otherwise.

Properties of Indicator R.V.

  • $(I_A)^k = I_A$ for any positive integer $k$
  • $I_{A^c} = 1- I_A$
  • $I_{A\cap B} = I_A I_B$
  • $I_{A\cup B} = I_A + I_B - I_A I_B$

Theorem (Bridge between Probability & Expectation): For any event $A$, $P(A) = E(I_A)$.
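For instance (an added consequence, combining the bridge with the last indicator identity above): taking expectations of $I_{A\cup B} = I_A + I_B - I_A I_B$ recovers inclusion-exclusion for two events, $P(A\cup B) = P(A) + P(B) - P(A\cap B)$.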

Example 1: An urn contains red, green, and blue balls. Balls are drawn repeatedly, and each draw is red, green, or blue with probabilities $r$, $g$, $b$ respectively ($r+g+b = 1$). What is the expected number of different colors seen before getting the first red ball?
Let $I_g$ be $1$ if a green ball appears before any red ball (and $0$ otherwise), and define $I_b$ similarly. The number of colors seen before the first red ball is $I_g + I_b$, so by linearity and the bridge, the answer is $E(I_g) + E(I_b) = P(\text{green before red}) + P(\text{blue before red})$.

Since “green before red” means that the first non-blue ball is green, this probability is $\frac{g}{g+r}$; by the same argument, $P(\text{blue before red}) = \frac{b}{b+r}$. The final result is
$$E(I_g) + E(I_b) = \frac{g}{g+r} + \frac{b}{b+r}.$$
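A Monte Carlo sanity check of this answer (an added sketch; the values of $r$, $g$, $b$ below are arbitrary):

```python
import random

r, g, b = 0.5, 0.3, 0.2   # arbitrary example probabilities, r + g + b = 1
trials = 200_000
total_colors = 0

for _ in range(trials):
    seen_green = seen_blue = False
    while True:
        u = random.random()
        if u < r:             # red drawn: stop
            break
        elif u < r + g:       # green drawn before red
            seen_green = True
        else:                 # blue drawn before red
            seen_blue = True
    total_colors += seen_green + seen_blue

print(total_colors / trials)        # simulated answer
print(g / (g + r) + b / (b + r))    # formula: 0.375 + 0.2857... ≈ 0.6607
```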

Moments & Indicators

Given $n$ events $A_1,\dotsb, A_n$ with indicator r.v.s $I_j = I_{A_j}$, $j = 1, \dotsb, n$:

  • $X = \sum_{j=1}^n I_j$: the number of the events $A_j$ that occur
  • $\binom{X}{2} = \sum_{i<j} I_i I_j$: the number of pairs of events that occur
  • Expanding $X^2 = \sum_j I_j^2 + 2\sum_{i<j} I_i I_j = X + 2\sum_{i<j} I_i I_j$ and taking expectations (checked numerically in the sketch after this list):
    • $E(X^2) = 2\sum_{i<j} P(A_iA_j) + E(X)$
    • $Var(X) = 2\sum_{i<j} P(A_iA_j) + E(X) - (E(X))^2$
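A simulation sketch of the two identities above (added for illustration; it assumes the events are independent, so that $P(A_iA_j) = p_i p_j$):

```python
import numpy as np

rng = np.random.default_rng(1)
n_events, trials = 5, 10 ** 6
p = rng.uniform(0.1, 0.5, size=n_events)   # arbitrary event probabilities

# Independent events A_j = {U_j < p_j}; each row of I holds the n indicators.
I = rng.random((trials, n_events)) < p
X = I.sum(axis=1)                          # X = number of events that occur

# Under independence, P(A_i A_j) = p_i * p_j.
pair_sum = sum(p[i] * p[j] for i in range(n_events) for j in range(i + 1, n_events))
EX = p.sum()

print(np.mean(X ** 2), 2 * pair_sum + EX)        # E(X^2): should agree
print(np.var(X), 2 * pair_sum + EX - EX ** 2)    # Var(X): should agree
```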

Poisson

Definition (Poisson Distribution): An r.v. $X$ has the Poisson distribution with parameter $\lambda > 0$, denoted by $X\sim Pois(\lambda)$, if its PMF is
$$P(X = k) = \frac{e^{-\lambda}\lambda^k}{k!}, \quad k = 0, 1, 2, \dots$$

Properties of Poisson

  • $E(X) = \lambda$ (derived after this list)
  • $E(X^2) = \lambda(1+\lambda)$
  • $Var(X) = \lambda$
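The first property follows directly from the PMF and the Taylor series of $e^\lambda$ (a standard derivation, sketched here):
$$E(X) = \sum_{k=0}^\infty k\,\frac{e^{-\lambda}\lambda^k}{k!} = \lambda e^{-\lambda}\sum_{k=1}^\infty \frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda}e^{\lambda} = \lambda.$$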

Poisson Approximation: Let $A_1,A_2,\dotsb,A_n$ be events with $p_j = P(A_j)$, where $n$ is large, each $p_j$ is small, and the $A_j$ are independent or weakly dependent. Let $X$ count how many of the $A_j$ occur. Then $X$ is approximately $Pois(\lambda)$, with $\lambda = \sum_{j=1}^n p_j$.

Theorem (Sum of Independent Poissons): If $X\sim Pois(\lambda_1)$ and $Y\sim Pois(\lambda_2)$, and $X$ is independent of $Y$, then $X+Y \sim Pois(\lambda_1 + \lambda_2)$.

Theorem (Poisson Approximation to Binomial): If $X\sim Bin(n,p)$ and we let $n \rightarrow \infty$ and $p\rightarrow 0$ such that $\lambda = np$ stays fixed, then the PMF of $X$ converges to the $Pois(\lambda)$ PMF.
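A numeric illustration of this convergence (an added sketch using SciPy; the parameter values are arbitrary):

```python
from scipy.stats import binom, poisson

lam = 3.0
for n in (10, 100, 10_000):
    p = lam / n                    # keep lambda = n * p fixed as n grows
    # Largest PMF gap over k = 0..5; it shrinks toward 0 as n increases.
    gap = max(abs(binom.pmf(k, n, p) - poisson.pmf(k, lam)) for k in range(6))
    print(n, gap)
```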