
1. Expectation Operator Rules

$$\begin{aligned}
E[aX] &= aE[X] \\
E[X + c] &= E[X] + c \\
E[X + Y] &= E[X] + E[Y] && \text{whether or not $X$ and $Y$ are independent} \\
E[XY] &= E[X]\,E[Y] && \text{only if $X$ and $Y$ are independent}
\end{aligned}$$

Here is a proof of the last rule:

Suppose the joint PDF of $X$ and $Y$ is $j(x, y)$. Then

$$E[XY] = \iint xy\, j(x, y)\, dx\, dy$$

If $X$ and $Y$ are independent, then by definition $j(x, y) = f(x)\,g(y)$, where $f$ and $g$ are the marginal PDFs of $X$ and $Y$. Then

$$E[XY] = \iint xy\, f(x)\,g(y)\, dy\, dx = \left[\int x f(x)\, dx\right]\left[\int y g(y)\, dy\right] = E[X]\,E[Y]$$
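To see the independence rule in action, here is a quick NumPy sketch (the distributions and sample sizes are arbitrary illustration choices) comparing $E[XY]$ with $E[X]\,E[Y]$ for an independent pair and a dependent pair:

```python
import numpy as np

# Monte Carlo sanity check: E[XY] = E[X]E[Y] holds for independent X, Y
# but generally fails when they are dependent.
rng = np.random.default_rng(0)
n = 1_000_000

x = rng.normal(2.0, 1.0, n)   # X ~ N(2, 1)
y = rng.exponential(3.0, n)   # Y ~ Exp with mean 3, independent of X

print(np.mean(x * y))           # ~ 6.0
print(np.mean(x) * np.mean(y))  # ~ 6.0, matches under independence

z = x + rng.normal(0.0, 0.1, n)   # Z is strongly dependent on X
print(np.mean(x * z))             # ~ 5.0 = E[X^2] = Var(X) + E[X]^2
print(np.mean(x) * np.mean(z))    # ~ 4.0 = E[X]^2; the rule fails
```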

2. Covariance Again

$$\begin{aligned}
\mathrm{Cov}(X, Y) &= E[(X - E[X])(Y - E[Y])] \\
&= E[XY - X\,E[Y] - E[X]\,Y + E[X]\,E[Y]] \\
&= E[XY] - E[X]\,E[Y] - E[X]\,E[Y] + E[X]\,E[Y] \\
&= E[XY] - E[X]\,E[Y]
\end{aligned}$$

From a sample of $n$ paired observations, the covariance is estimated as

$$\widehat{\mathrm{Cov}}(X, Y) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n - 1}$$

If $X$ and $Y$ are independent, then $E[XY] = E[X]\,E[Y]$ and therefore $\mathrm{Cov}(X, Y) = 0$ (the converse does not hold in general).
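As a quick check of the $n - 1$ formula, the following sketch (simulated data with arbitrary parameters) computes the sample covariance by hand and compares it with NumPy's `np.cov`:

```python
import numpy as np

# Sample covariance via the n-1 formula, checked against np.cov.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 10_000)
y = 0.5 * x + rng.normal(0.0, 1.0, 10_000)  # correlated with x

n = len(x)
cov_manual = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)
print(cov_manual)          # ~ 0.5 = 0.5 * Var(X)
print(np.cov(x, y)[0, 1])  # same value from NumPy

w = rng.normal(0.0, 1.0, 10_000)  # independent of x
print(np.cov(x, w)[0, 1])         # ~ 0, as expected under independence
```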

3. Standard Error of the Mean

If $X_1, X_2, \ldots, X_n$ are $n$ independent observations from a population with mean $\mu$ and standard deviation $\sigma$, then $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is itself a random variable and satisfies

  • $E[\bar{X}] = \mu$
  • $\mathrm{Var}(\bar{X}) = \dfrac{\sigma^2}{n}$

The standard error of the mean (SEM) is the standard deviation of the sample mean viewed as an estimator of the population mean, i.e.

$$\mathrm{SE}_{\bar{x}} = \sqrt{\mathrm{Var}(\bar{X})} = \frac{\sigma}{\sqrt{n}}$$
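A short simulation (with arbitrary population parameters) confirms that the spread of the sample mean shrinks like $\sigma/\sqrt{n}$:

```python
import numpy as np

# Empirical check that the standard deviation of the sample mean
# is close to sigma / sqrt(n).
rng = np.random.default_rng(2)
mu, sigma, n, trials = 10.0, 4.0, 25, 100_000

samples = rng.normal(mu, sigma, size=(trials, n))
sample_means = samples.mean(axis=1)

print(sample_means.std(ddof=1))  # empirical SD of the mean, ~ 0.8
print(sigma / np.sqrt(n))        # theoretical SEM = 4 / 5 = 0.8
```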

4. Proof of $\mathrm{Var}(\bar{X}) = \sigma^2/n$

4.1 Proof I

Suppose $T = X_1 + X_2 + \cdots + X_n$; then

$$\begin{aligned}
\mathrm{Var}(T) &= E[T^2] - E[T]^2 = \mathrm{Var}(X_1 + X_2 + \cdots + X_n) = n\,\mathrm{Var}(X) = n\sigma^2 \\
\mathrm{Var}(\bar{X}) &= \mathrm{Var}\!\left(\frac{T}{n}\right) = E\!\left[\left(\frac{T}{n}\right)^2\right] - E\!\left[\frac{T}{n}\right]^2 = \frac{1}{n^2}\left(E[T^2] - E[T]^2\right) = \frac{1}{n^2}\,n\sigma^2 = \frac{\sigma^2}{n}
\end{aligned}$$

where $\mathrm{Var}(X_1 + \cdots + X_n) = n\,\mathrm{Var}(X)$ uses the independence of the $X_i$.
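The key step here, $\mathrm{Var}(T) = n\sigma^2$, is easy to verify empirically; the sketch below uses an arbitrarily chosen normal population:

```python
import numpy as np

# Check that Var(T) = n * sigma^2 for a sum of n independent draws.
rng = np.random.default_rng(3)
sigma, n, trials = 2.0, 10, 200_000

T = rng.normal(0.0, sigma, size=(trials, n)).sum(axis=1)
print(T.var(ddof=1))  # ~ 40
print(n * sigma**2)   # 40
```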

4.2 Proof II

Suppose $T = X_1 + X_2 + \cdots + X_n$; then

$$\begin{aligned}
E[T^2] &= E\!\left[\sum_{i=1}^{n} X_i^2 + \sum_{\substack{i,j = 1,\ldots,n \\ i \neq j}} X_i X_j\right] \\
&= n\,E[X^2] + (n^2 - n)\,E[X]^2 \qquad \text{since the $X_i$ are independent, } E[X_i X_j] = E[X_i]\,E[X_j] \\
&= n\,E[X^2] + (n^2 - n)\,\mu^2
\end{aligned}$$

Since $E[X^2] = \sigma^2 + E[X]^2 = \sigma^2 + \mu^2$,

$$E[T^2] = n\sigma^2 + n\mu^2 + (n^2 - n)\,\mu^2 = n\sigma^2 + n^2\mu^2$$

With $E[T] = n\mu$,

$$\mathrm{Var}(\bar{X}) = \frac{1}{n^2}\left(E[T^2] - E[T]^2\right) = \frac{1}{n^2}\left(n\sigma^2 + n^2\mu^2 - n^2\mu^2\right) = \frac{\sigma^2}{n}$$
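For a concrete $n$, the moment bookkeeping in this proof can be checked symbolically; the sketch below uses SymPy and hard-codes $n = 4$ purely as an illustration:

```python
import sympy as sp

# Symbolic check of Proof II for n = 4: expand T^2, replace
# E[X_i^2] -> sigma^2 + mu^2 and E[X_i X_j] -> mu^2 (independence),
# and confirm E[T^2] = n*sigma^2 + n^2*mu^2.
n = 4
mu, sigma = sp.symbols('mu sigma', positive=True)
X = sp.symbols(f'X1:{n + 1}')  # X1, ..., X4

expanded = sp.expand(sum(X) ** 2)
moments = {Xi**2: sigma**2 + mu**2 for Xi in X}
moments.update({Xi * Xj: mu**2
                for i, Xi in enumerate(X) for Xj in X[i + 1:]})

ET2 = expanded.subs(moments)
print(sp.simplify(ET2))             # 16*mu**2 + 4*sigma**2
print(n * sigma**2 + n**2 * mu**2)  # matches: n*sigma^2 + n^2*mu^2
```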
