Conditional Expectation Given An Event

If $Y$ is a discrete r.v., then $E(Y\mid A) = \sum_y y\,P(Y=y\mid A)$

If $Y$ is a continuous r.v., then $E(Y\mid A) = \int_{-\infty}^{\infty} y\,f(y\mid A)\,dy$, where $f(y\mid A)$ is the conditional PDF of $Y$ given $A$

Approximation

Imagine a large number $n$ of replications of the experiment, yielding observations $y_1,…,y_n$

If $I_j$ is the indicator of $A$ occurring in the $j^{th}$ replication, then $E(Y\mid A) \approx \dfrac{\sum_{j=1}^n y_j I_j}{\sum_{j=1}^n I_j}$, the average of the $y_j$ over just those replications in which $A$ occurred
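A simulation sketch of this approximation, using a hypothetical example not from the text: $Y$ is a fair die roll and $A$ is the event that the roll is even, so the exact answer is $E(Y\mid A) = (2+4+6)/3 = 4$.

```python
import random

random.seed(0)
n = 100_000
num = 0.0  # running sum of y_j * I_j
den = 0    # running sum of I_j
for _ in range(n):
    y = random.randint(1, 6)    # one replication: a fair die roll
    i = 1 if y % 2 == 0 else 0  # indicator of A = "roll is even"
    num += y * i
    den += i
print(num / den)  # approx E(Y | A) = 4
```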

Example (Life Expectancy)

Yang is 24 years old. He hears that the average life expectancy is $80$ years. Should he conclude that he has $80-24=56$ years of life left?

Of course not: the unconditional average of $80$ includes people who died before age $24$. Conditional on having already survived to age $24$, his expected total lifespan exceeds $80$, so he should expect more than $56$ remaining years

Law of Total Expectation

Let $A_1,…,A_n$ be a partition of a sample space and let $Y$ be a random variable on that sample space. Then

$$E(Y) = \sum_{i=1}^n E(Y\mid A_i)\,P(A_i)$$

Example (Geometric Expectation Redux)

Let $X\sim Geom(p)$ be the number of Tails before the first Heads in a sequence of independent coin flips, each with probability $p$ of Heads and $q = 1-p$ of Tails. Instead of summing a series for $E(X)$, we can condition on the outcome of the first toss: if it lands Heads, then $X=0$ and we are done; if it lands Tails, then we have wasted one toss and, by memorylessness, are back where we started. Therefore

$$E(X) = p\cdot 0 + q\,(1 + E(X)),$$

which gives $E(X) = q/p$, where $q = 1-p$.
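A simulation sketch of this result; the choice $p = 0.3$ is arbitrary, giving $q/p = 7/3 \approx 2.33$.

```python
import random

random.seed(1)
p = 0.3
n = 100_000
total = 0
for _ in range(n):
    x = 0
    while random.random() >= p:  # count Tails until the first Heads
        x += 1
    total += x
print(total / n)  # approx q/p = 7/3
```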

Example (Time until HH vs. HT)

You toss a fair coin repeatedly. What is the expected number of tosses until the pattern HT (respectively HH) appears for the first time?

Time until HT

  • $W_{HT}$: number of tosses until HT appears
  • $W_1$: waiting time for the first H
  • $W_2$: additional waiting time for the first subsequent T

    Then $W_{HT} = W_1 + W_2$, with

    $W_1\sim FS(\frac{1}{2}),\ E[W_1] = 2$

    $W_2\sim FS(\frac{1}{2}),\ E[W_2] = 2$

    so $E[W_{HT}] = E[W_1] + E[W_2] = 4$

Time until HH

Here the wait does not split into independent pieces, so we condition on the first toss:

$$E[W_{HH}] = \tfrac{1}{2}\,E[W_{HH}\mid \text{first toss H}] + \tfrac{1}{2}\,E[W_{HH}\mid \text{first toss T}]$$

where

$$E[W_{HH}\mid \text{first toss T}] = 1 + E[W_{HH}]$$

and

$$E[W_{HH}\mid \text{first toss H}] = \tfrac{1}{2}\cdot 2 + \tfrac{1}{2}\,(2 + E[W_{HH}])$$

Thus we get $E[W_{HH}] = 6$
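Both answers can be checked by simulation. This sketch (the helper `waiting_time` is hypothetical, not from the text) estimates the expected waiting times for HT and HH:

```python
import random

random.seed(2)

def waiting_time(pattern, trials=50_000):
    """Average number of fair-coin tosses until `pattern` first appears."""
    total = 0
    for _ in range(trials):
        last2, tosses = "", 0
        while last2 != pattern:
            last2 = (last2 + random.choice("HT"))[-2:]  # keep last two tosses
            tosses += 1
        total += tosses
    return total / trials

w_ht = waiting_time("HT")
w_hh = waiting_time("HH")
print(w_ht, w_hh)  # approx 4 and 6
```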

As the HH calculation shows, memorylessness lets conditioning place the target quantity $E[W_{HH}]$ on both sides of an equation, which we then solve for the target


Conditional Expectation Given An R.V.

Let $g(x) = E(Y|X=x)$. Then the conditional expectation of $Y$ given $X$, denoted $E(Y|X)$, is defined to be the random variable $g(X)$

Example (Stick Length)

Suppose we have a stick of length $1$ and break it at a point $X$ chosen uniformly at random. Given that $X=x$, we then choose another breakpoint $Y$ uniformly on the interval $[0,x]$. Find $E(Y|X)$, and its mean and variance
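Since $Y\mid X=x \sim Unif(0,x)$ has mean $x/2$, we get $E(Y|X)=X/2$, whose mean is $E(X)/2 = 1/4$ and whose variance is $Var(X)/4 = 1/48$. A simulation sketch of that mean and variance:

```python
import random

random.seed(3)
n = 200_000
g_vals = []
for _ in range(n):
    x = random.random()    # X ~ Unif(0, 1)
    g_vals.append(x / 2)   # E(Y | X = x) = x / 2, since Y | X=x ~ Unif(0, x)
mean = sum(g_vals) / n
var = sum((g - mean) ** 2 for g in g_vals) / n
print(mean, var)  # approx 1/4 and 1/48
```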


Properties of Conditional Expectation

Theorem (Dropping What's Independent)

If $X$ and $Y$ are independent, then $E(Y|X)=E(Y)$

Theorem (Taking Out What's Known)

For any function $h$, $E(h(X)\,Y\mid X) = h(X)\,E(Y\mid X)$

Theorem (Linearity)

$E(Y_1 + Y_2\mid X) = E(Y_1\mid X) + E(Y_2\mid X)$

Theorem (Adam’s Law)

For any r.v.s $X$ and $Y$,

$$E(E(Y\mid X)) = E(Y)$$

Proof by LOTP

For $X$ discrete, we let $E(Y\mid X=x) = g(x)$, so that $E(Y\mid X) = g(X)$. Then, by LOTUS,

$$E(E(Y\mid X)) = E(g(X)) = \sum_x g(x)\,P(X=x) = \sum_x E(Y\mid X=x)\,P(X=x)$$

So the right-hand side equals $E(Y)$ by the law of total expectation

Theorem (Adam’s Law with Extra Conditioning)

For any r.v.s $X,Y,Z$,

$$E(E(Y\mid X,Z)\mid Z) = E(Y\mid Z)$$

Definition (Conditional Variance)

$$Var(Y\mid X) = E\big((Y - E(Y\mid X))^2 \mid X\big)$$

This is equivalent to

$$Var(Y\mid X) = E(Y^2\mid X) - \big(E(Y\mid X)\big)^2$$

Theorem (Eve's Law)

$$Var(Y) = E(Var(Y\mid X)) + Var(E(Y\mid X))$$

Example (Random Sum)

A store receives $N$ customers in a day, where $N$ is an r.v. with finite mean and variance. Let $X_j$ be the amount spent by the $j^{th}$ customer, where each $X_j$ has mean $\mu$ and variance $\sigma^2$, and $N$ and the $X_j$ are all independent of one another. Find the mean and variance of the random sum $X = \sum_{j=1}^N X_j$ in terms of $\mu,\sigma^2,E(N)$, and $Var(N)$

For $E(X)$

Conditioning on $N$ and taking out what's known,

$$E(X\mid N) = E\Big(\sum_{j=1}^N X_j \,\Big|\, N\Big) = N\mu$$

Finally, by Adam's Law,

$$E(X) = E(E(X\mid N)) = E(N\mu) = \mu\,E(N)$$

For $Var(X)$

We condition on $N$: given $N$, the sum has $N$ i.i.d. terms of variance $\sigma^2$, so

$$Var(X\mid N) = N\sigma^2$$

Eve's Law then gives the unconditional variance of $X$:

$$Var(X) = E(Var(X\mid N)) + Var(E(X\mid N)) = \sigma^2 E(N) + \mu^2\,Var(N)$$
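A simulation sketch of both formulas under concrete (arbitrary) choices: $N$ uniform on $\{0,\dots,10\}$ and $X_j\sim Unif(0,2)$, so $\mu=1$, $\sigma^2=1/3$, $E(N)=5$, $Var(N)=10$.

```python
import random

random.seed(4)
mu, sigma2 = 1.0, 1.0 / 3.0  # mean and variance of X_j ~ Unif(0, 2)
e_n, var_n = 5.0, 10.0       # E(N) and Var(N) for N uniform on {0, ..., 10}
trials = 100_000
samples = []
for _ in range(trials):
    n_cust = random.randint(0, 10)  # N customers today
    samples.append(sum(random.uniform(0, 2) for _ in range(n_cust)))
mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print(mean, var)  # approx mu*E(N) = 5 and sigma2*E(N) + mu^2*Var(N) = 35/3
```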


Prediction and Estimation

Theorem (Projection Interpretation)

For any function $h$, the r.v. $Y-E(Y|X)$ is uncorrelated with $h(X)$:

$$Cov(Y - E(Y\mid X),\, h(X)) = 0,$$

equivalently,

$$E\big((Y - E(Y\mid X))\,h(X)\big) = 0$$
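A simulation sketch of this orthogonality, reusing the stick-breaking setup ($X\sim Unif(0,1)$, $Y\mid X=x\sim Unif(0,x)$, so $E(Y\mid X)=X/2$) with the arbitrary choice $h(x)=x^2$:

```python
import random

random.seed(5)
n = 100_000
resid, hx = [], []
for _ in range(n):
    x = random.random()       # X ~ Unif(0, 1)
    y = random.uniform(0, x)  # Y | X = x ~ Unif(0, x)
    resid.append(y - x / 2)   # Y - E(Y | X), using E(Y | X) = X / 2
    hx.append(x ** 2)         # h(X) for an arbitrary function h
m_r = sum(resid) / n
m_h = sum(hx) / n
cov = sum((r - m_r) * (h - m_h) for r, h in zip(resid, hx)) / n
print(cov)  # approx 0
```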
