Discrete Multivariate R.V.s

Definition (Joint CDF) The Joint CDF of r.v.s X and Y is the function F_{X,Y} given by

F_{X,Y}(x,y) = P(X \le x, Y \le y)

Definition (Joint PMF) The Joint PMF of discrete r.v.s X and Y is the function pX,Y given by

p_{X,Y}(x,y) = P(X = x, Y = y)

Definition (Marginal PMF) For discrete r.v.s X and Y, the Marginal PMF of X is

P(X = x) = \sum_y P(X = x, Y = y)

Definition (Conditional PMF) For discrete r.v.s X and Y, the Conditional PMF of X given Y=y is

p_{X|Y}(x|y) = P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)}
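The joint, marginal, and conditional PMFs above translate directly into code. A minimal Python sketch, using a hypothetical toy joint PMF (the dict `joint` and the helper names are illustrative, not from the text):

```python
from fractions import Fraction

# Hypothetical joint PMF of (X, Y): maps (x, y) -> P(X = x, Y = y).
joint = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 6), (1, 1): Fraction(1, 3),
}

def marginal_x(joint, x):
    # p_X(x) = sum over y of p_{X,Y}(x, y)
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def marginal_y(joint, y):
    # p_Y(y) = sum over x of p_{X,Y}(x, y)
    return sum(p for (xi, yi), p in joint.items() if yi == y)

def conditional_x_given_y(joint, x, y):
    # p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y)
    return joint.get((x, y), Fraction(0)) / marginal_y(joint, y)
```

Using exact fractions makes the identities checkable without floating-point tolerance, e.g. the marginal of X at 0 is 1/4 + 1/4 = 1/2 here.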

Definition (Independence of Discrete R.V.s) Random variables X and Y are independent if for all x and y

F_{X,Y}(x,y) = F_X(x) F_Y(y)

This is equivalent to the condition

P(Y = y \mid X = x) = P(Y = y)

for all y and all x with P(X = x) > 0
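For discrete r.v.s with finite support, independence is equivalent to the joint PMF factoring into the product of the marginals, which gives a direct computational check. A sketch under that assumption (the example PMFs are hypothetical):

```python
from fractions import Fraction
from itertools import product

def is_independent(joint):
    # X and Y independent iff p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x, y.
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    px = {x: sum(joint.get((x, y), Fraction(0)) for y in ys) for x in xs}
    py = {y: sum(joint.get((x, y), Fraction(0)) for x in xs) for y in ys}
    return all(joint.get((x, y), Fraction(0)) == px[x] * py[y]
               for x, y in product(xs, ys))

# Independent: joint built as a product of marginals.
indep = {(x, y): Fraction(1, 2) * p
         for x in (0, 1)
         for y, p in [(0, Fraction(1, 3)), (1, Fraction(2, 3))]}

# Dependent: X = Y with probability 1, so the joint cannot factor.
dep = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}
```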

Continuous Multivariate R.V.s

Definition (Joint PDF) If X and Y are continuous with joint CDF FX,Y then

f_{X,Y}(x,y) = \frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x,y)

Definition (Marginal PDF) If X and Y are continuous with joint PDF fX,Y then

f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dy

Definition (Conditional PDF) For continuous r.v.s X and Y with joint PDF f_{X,Y}, the Conditional PDF of Y given X = x is

f_{Y|X}(y|x) = \frac{f_{X,Y}(x,y)}{f_X(x)}

Definition (Independence of Continuous R.V.s) Random variables X and Y are independent if for all x and y

F_{X,Y}(x,y) = F_X(x) F_Y(y)

If X and Y are continuous with joint PDF f_{X,Y}, this is equivalent to

f_{X,Y}(x,y) = f_X(x) f_Y(y)

for all x and y

Theorem (2D LOTUS) Let g be a function from \mathbb{R}^2 to \mathbb{R}

If X and Y are discrete

E(g(X,Y)) = \sum_x \sum_y g(x,y) P(X = x, Y = y)

If X and Y are continuous

E(g(X,Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) f_{X,Y}(x,y) \, dx \, dy
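In the discrete case, 2D LOTUS is just a double sum over the joint PMF, so it is one line of code. A sketch with a hypothetical toy joint PMF:

```python
from fractions import Fraction

def lotus_2d(joint, g):
    # E(g(X, Y)) = sum over (x, y) of g(x, y) * P(X = x, Y = y)
    return sum(g(x, y) * p for (x, y), p in joint.items())

# Hypothetical joint PMF for illustration.
joint = {(1, 1): Fraction(1, 2), (1, 2): Fraction(1, 4), (2, 2): Fraction(1, 4)}

e_xy = lotus_2d(joint, lambda x, y: x * y)   # E(XY)
e_sum = lotus_2d(joint, lambda x, y: x + y)  # E(X + Y)
```

Note that E(X + Y) computed this way agrees with E(X) + E(Y) (linearity), even though X and Y here are dependent.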

General Bayes’ Rule

By the same argument as in the discrete case, for continuous r.v.s X and Y

f_{Y|X}(y|x) = \frac{f_{X|Y}(x|y) f_Y(y)}{f_X(x)}

and the analogous identity holds with any mix of PMFs and PDFs when one r.v. is discrete and the other continuous.

Covariance and Correlation

Covariance

  • Measures the tendency of two r.v.s X and Y to go up or down together
  • Positive covariance: when X goes up, Y tends to go up
  • Negative covariance: when X goes up, Y tends to go down

Definition (Covariance) The covariance between r.v.s X and Y is

Cov(X,Y) = E((X - EX)(Y - EY)) = E(XY) - E(X)E(Y)

Theorem (Uncorrelated) If X and Y are independent, then they are uncorrelated (Cov(X,Y) = 0). The converse does not hold in general.

Properties of Covariance

  • Cov(X, X) = Var(X)
  • Cov(X, Y) = Cov(Y, X)
  • Cov(X, c) = 0 for any constant c
  • Cov(aX, Y) = a Cov(X, Y)
  • Cov(X + Y, Z) = Cov(X, Z) + Cov(Y, Z)
  • Cov(X + Y, W + Z) = Cov(X, W) + Cov(X, Z) + Cov(Y, W) + Cov(Y, Z)
  • Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
  • For n r.v.s X_1, \dots, X_n: Var(X_1 + \cdots + X_n) = Var(X_1) + \cdots + Var(X_n) + 2 \sum_{i<j} Cov(X_i, X_j)
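Since Cov and Var are just expectations under 2D LOTUS, several of these properties can be verified exactly on a small joint PMF. A sketch (the joint PMF is a hypothetical example):

```python
from fractions import Fraction

# Hypothetical joint PMF with negative covariance: X and Y tend to disagree.
joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
         (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}

def expect(joint, g):
    # E(g(X, Y)) via 2D LOTUS
    return sum(g(x, y) * p for (x, y), p in joint.items())

def cov(joint, f, g):
    # Cov(f, g) = E(fg) - E(f)E(g), where f and g are functions of (x, y)
    return (expect(joint, lambda x, y: f(x, y) * g(x, y))
            - expect(joint, f) * expect(joint, g))

X = lambda x, y: x
Y = lambda x, y: y
cxy = cov(joint, X, Y)                                   # Cov(X, Y)
var_x = cov(joint, X, X)                                 # Cov(X, X) = Var(X)
var_sum = cov(joint, lambda x, y: x + y,
              lambda x, y: x + y)                        # Var(X + Y)
```

The exact values confirm the identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on this example.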

Definition (Correlation) The Correlation between r.v.s X and Y is

Corr(X,Y) = \frac{Cov(X,Y)}{\sqrt{Var(X) Var(Y)}}

Shifting and scaling X and Y have no effect on the correlation

Theorem (Correlation Bounds) For any r.v.s X and Y

-1 \le Corr(X,Y) \le 1
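Both the bounds and the shift/scale invariance can be checked on simulated data. Below, correlation is estimated from samples of a hypothetical Gaussian pair, and a positive affine change of both variables leaves the estimate unchanged:

```python
import math
import random

def corr(xs, ys):
    # Sample correlation: Cov(X, Y) / sqrt(Var(X) Var(Y))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    c = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return c / math.sqrt(vx * vy)

random.seed(0)
# Hypothetical example: X ~ N(0,1), Y = X + Z with Z ~ N(0,1) independent,
# so the true correlation is 1/sqrt(2).
xs = [random.gauss(0, 1) for _ in range(10_000)]
ys = [x + random.gauss(0, 1) for x in xs]
r = corr(xs, ys)

# Shifting and scaling (with positive scale factors) leaves correlation unchanged.
r_scaled = corr([3 * x + 7 for x in xs], [0.5 * y - 2 for y in ys])
```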

Change of Variables

Theorem (Change of Variables in One Dimension) Let X be a continuous r.v. with PDF fX, and let Y=g(X), where g is differentiable and strictly increasing. Then the PDF of Y is given by

f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right|

where x = g^{-1}(y)

Proof:

F_Y(y) = P(Y \le y) = P(g(X) \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y)) = F_X(x)

The result then follows by differentiating both sides with respect to y and applying the chain rule
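The one-dimensional formula can be sanity-checked by simulation. As a hypothetical example, take X ~ Unif(0,1) and Y = X^2; g is strictly increasing on (0,1), so the formula gives f_Y(y) = f_X(√y)·|d√y/dy| = 1/(2√y), and integrating gives F_Y(y) = √y, which we compare with the empirical CDF:

```python
import random

random.seed(1)

# Simulate Y = X^2 with X ~ Unif(0, 1); the change-of-variables formula
# predicts F_Y(y) = sqrt(y) on (0, 1).
ys = [random.random() ** 2 for _ in range(100_000)]

def empirical_cdf(samples, t):
    # Fraction of samples at or below t
    return sum(s <= t for s in samples) / len(samples)

# Largest deviation between the empirical CDF and the analytic sqrt(t).
err = max(abs(empirical_cdf(ys, t) - t ** 0.5) for t in (0.1, 0.25, 0.5, 0.9))
```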

Theorem (Change of Variables) Let X = (X_1, \dots, X_n) be a continuous random vector with joint PDF f_X(x), and let Y = g(X), where g is an invertible function from \mathbb{R}^n to \mathbb{R}^n. Write x = g^{-1}(y) and form the Jacobian matrix

\frac{\partial x}{\partial y} = \begin{pmatrix} \frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} & \cdots & \frac{\partial x_1}{\partial y_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial x_n}{\partial y_1} & \frac{\partial x_n}{\partial y_2} & \cdots & \frac{\partial x_n}{\partial y_n} \end{pmatrix}

Then the joint PDF of Y is

f_Y(y) = f_X(x) \left| \det \frac{\partial x}{\partial y} \right|

where the bars denote the absolute value of the Jacobian determinant

Convolutions

Theorem (Convolution Sums and Integrals)
If X and Y are independent discrete r.v.s, then the PMF of their sum T=X+Y is

P(T = t) = \sum_x P(Y = t - x) P(X = x) = \sum_y P(X = t - y) P(Y = y)

If X and Y are independent continuous r.v.s, then the PDF of their sum T=X+Y is

f_T(t) = \int_{-\infty}^{\infty} f_Y(t - x) f_X(x) \, dx = \int_{-\infty}^{\infty} f_X(t - y) f_Y(y) \, dy
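The convolution sum is easy to implement for PMFs with finite support. A sketch using two fair dice as a hypothetical example:

```python
from fractions import Fraction

# PMF of a fair die: k -> P(X = k).
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(px, py):
    # P(T = t) = sum over x of P(Y = t - x) P(X = x)
    support = range(min(px) + min(py), max(px) + max(py) + 1)
    return {t: sum(py.get(t - x, Fraction(0)) * p for x, p in px.items())
            for t in support}

total = convolve(die, die)  # PMF of the sum of two independent dice
```

The result recovers the familiar triangular distribution on {2, …, 12}, e.g. P(T = 7) = 1/6 and P(T = 2) = 1/36.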

Order Statistics

Definition (Order Statistics) For r.v.s X_1, X_2, \dots, X_n, the order statistics are the random variables X_{(1)}, \dots, X_{(n)}, where

  • X_{(1)} = \min(X_1, \dots, X_n)
  • X_{(2)} is the second-smallest of X_1, \dots, X_n
  • X_{(n)} = \max(X_1, \dots, X_n)

The order statistics are dependent: for example, if X_{(1)} = 100, then X_{(n)} is forced to be at least 100

We focus on the case where X_1, \dots, X_n are i.i.d. continuous r.v.s with CDF F and PDF f

Theorem (CDF of Order Statistics) Let X_1, \dots, X_n be i.i.d. continuous r.v.s with CDF F. Then the CDF of the j-th order statistic X_{(j)} is

P(X_{(j)} \le x) = \sum_{k=j}^{n} \binom{n}{k} F(x)^k (1 - F(x))^{n-k}

Proof:


Let’s start with the special case j = n, X_{(n)} = \max(X_1, \dots, X_n):

F_{X_{(n)}}(x) = P(\max(X_1, \dots, X_n) \le x) = P(X_1 \le x) \cdots P(X_n \le x) = F(x)^n

Then, consider another special case, j = 1, X_{(1)} = \min(X_1, \dots, X_n):

F_{X_{(1)}}(x) = P(\min(X_1, \dots, X_n) \le x) = 1 - P(X_1 > x) \cdots P(X_n > x) = 1 - (1 - F(x))^n

This result can be rewritten as \sum_{k=1}^{n} \binom{n}{k} F(x)^k (1 - F(x))^{n-k}, which follows by expanding [F(x) + (1 - F(x))]^n = 1 with the binomial theorem and moving the k = 0 term to the other side


Finally, let’s consider the general case 1 < j < n. The event X_{(j)} \le x means that at least j of the X_i fall to the left of x

Let N denote the number of X_i landing to the left of x. Each X_i lands to the left of x with probability P(X_i \le x) = F(x), so N \sim \mathrm{Bin}(n, F(x)) and

P(X_{(j)} \le x) = P(N \ge j) = \sum_{k=j}^{n} \binom{n}{k} F(x)^k (1 - F(x))^{n-k}
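The binomial-tail identity in this proof can be checked by simulation for Unif(0,1) draws, where F(x) = x. The parameters n, j, x below are a hypothetical example:

```python
import random
from math import comb

random.seed(2)
n, j, x = 5, 2, 0.3  # hypothetical: n Unif(0,1) draws, look at the j-th smallest

# Binomial tail from the proof: P(X_(j) <= x) = P(N >= j), N ~ Bin(n, F(x)).
F = x  # CDF of Unif(0, 1) evaluated at x
exact = sum(comb(n, k) * F ** k * (1 - F) ** (n - k) for k in range(j, n + 1))

# Monte Carlo estimate of P(X_(j) <= x) by sorting the draws directly.
trials = 100_000
hits = 0
for _ in range(trials):
    draws = sorted(random.random() for _ in range(n))
    hits += draws[j - 1] <= x
mc = hits / trials
```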

Theorem (PDF of Order Statistic) Let X1,,Xn be i.i.d. continuous r.v.s with CDF F and PDF f. Then the marginal PDF of jth order statistic X(j) is

f_{X_{(j)}}(x) = n \binom{n-1}{j-1} f(x) F(x)^{j-1} (1 - F(x))^{n-j}

Theorem (Joint PDF) Let X_1, \dots, X_n be i.i.d. continuous r.v.s with PDF f. Then the joint PDF of all the order statistics is

f_{X_{(1)}, \dots, X_{(n)}}(x_1, \dots, x_n) = n! \prod_{i=1}^{n} f(x_i), \quad x_1 < x_2 < \cdots < x_n

Example 1 (Order Statistics of Uniforms) Let U_1, U_2, \dots, U_n be i.i.d. Unif(0,1) r.v.s with CDF F and PDF f

For 0 \le x \le 1, f(x) = 1 and F(x) = x. Then

f_{U_{(j)}}(x) = n \binom{n-1}{j-1} x^{j-1} (1 - x)^{n-j}

F_{U_{(j)}}(x) = \sum_{k=j}^{n} \binom{n}{k} x^k (1 - x)^{n-k} = \int_0^x f_{U_{(j)}}(t) \, dt = \frac{n!}{(j-1)!\,(n-j)!} \int_0^x t^{j-1} (1 - t)^{n-j} \, dt
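The PDF above is the Beta(j, n − j + 1) density, so U_{(j)} has mean j/(n + 1). A quick Monte Carlo check (n and j are a hypothetical example):

```python
import random

random.seed(3)
n, j = 4, 2  # hypothetical: 2nd smallest of 4 Unif(0,1) draws

# U_(j) ~ Beta(j, n - j + 1), so E(U_(j)) = j / (n + 1) = 0.4 here.
trials = 100_000
total = 0.0
for _ in range(trials):
    total += sorted(random.random() for _ in range(n))[j - 1]
mean = total / trials
```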