UMD Probability Qualifying Exams/Jan2010Probability

Solution
We will show it converges in distribution to a Poisson distribution with parameter $$\lambda$$. The characteristic function of the Poisson distribution is $$e^{\lambda (e^{it}-1)}$$. We show that the characteristic function $$E[\exp(it\sum_{k=1}^{r_n} X_{nk})]$$ converges to $$e^{\lambda (e^{it}-1)}$$, which implies the result by Lévy's continuity theorem.

$$\log E[\exp(it\sum_{k=1}^{r_n} X_{nk})]=\sum_{k=1}^{r_n} \log((1-p_{nk})+p_{nk}e^{it}) =\sum_{k=1}^{r_n} \log(1-p_{nk}(1-e^{it})) = \sum_{k=1}^{r_n} (-p_{nk}(1-e^{it})+O(p_{nk}^2))$$. By our assumptions, $$\sum_{k=1}^{r_n} p_{nk} \rightarrow \lambda$$, so the first term converges to $$\lambda(e^{it}-1)$$, while the error term is bounded by a constant times $$(\max_k p_{nk})\sum_{k=1}^{r_n} p_{nk} \rightarrow 0$$. Hence the whole expression converges to $$\lambda (e^{it}-1)$$.
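As a numerical sanity check (illustration only, not part of the proof), take the simplest triangular array meeting these assumptions: $$r_n=n$$ rows with $$p_{nk}=\lambda/n$$, so $$\sum_k p_{nk}=\lambda$$ and $$\max_k p_{nk}\rightarrow 0$$. The particular values of $$\lambda$$, $$n$$, and the number of trials below are arbitrary choices.

```python
# Illustration only: row sums of a Bernoulli triangular array with
# p_nk = lambda/n should be approximately Poisson(lambda) for large n.
import math
import random

random.seed(0)
lam, n, trials = 2.0, 200, 20_000
counts = [0] * 25
for _ in range(trials):
    # One row: n independent Bernoulli(lambda/n) indicators, summed.
    s = sum(1 for _ in range(n) if random.random() < lam / n)
    counts[min(s, 24)] += 1

for k in range(5):
    empirical = counts[k] / trials
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    print(f"P[S={k}]: empirical {empirical:.4f}, Poisson {poisson:.4f}")
```

The empirical frequencies track the Poisson($$\lambda$$) mass function closely even at moderate $$n$$.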

Solution
Let $$Y_n=(X_1X_2\cdots X_n)^{1/n}$$.

$$\log (Y_n) =\frac{1}{n} \sum_{j=1}^n \log(X_j)$$.

The random variables $$\log(X_j)$$ are i.i.d. with finite mean,

$$E[\log (X_j)]=\int_0^1 \log(t)dt = -1$$.

Therefore, the strong law of large numbers implies $$\frac{1}{n} \sum_{j=1}^n \log(X_j)$$ converges with probability one to $$-1$$.

So almost surely, $$\log (Y_n)$$ converges to $$-1$$ and $$Y_n$$ converges to $$e^{-1}$$.
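This conclusion is easy to check by simulation (illustration only; the sample size is an arbitrary choice):

```python
# Illustration only: the geometric mean of n i.i.d. Uniform(0,1) samples
# should be close to e^{-1} ~ 0.3679 for large n.
import math
import random

random.seed(0)
n = 100_000
# Use 1 - random.random() so the sample lies in (0, 1] and log is defined.
log_sum = sum(math.log(1.0 - random.random()) for _ in range(n))
y_n = math.exp(log_sum / n)
print(f"Y_n = {y_n:.4f}, e^-1 = {math.exp(-1):.4f}")
```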

Solution
Since $$X_n$$ is a martingale, $$|X_n|$$ is a non-negative submartingale by Jensen's inequality (applied to the convex function $$x \mapsto |x|$$), and $$E[|X_n|^2]<\infty$$ since $$X_n$$ is square integrable. Thus $$|X_n|$$ satisfies the hypotheses of Doob's martingale inequality, and the result follows.
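As an illustration (assuming the inequality invoked is the form $$P[\max_{k\leq n}|X_k| \geq c] \leq E[X_n^2]/c^2$$), a simple symmetric random walk is a square-integrable martingale on which the bound can be checked numerically; the walk length, threshold, and trial count below are arbitrary choices.

```python
# Illustration only: check P[max_{k<=n} |X_k| >= c] <= E[X_n^2] / c^2
# on a simple symmetric random walk, a square-integrable martingale.
import random

random.seed(1)
n, c, trials = 100, 25, 10_000
hits = 0
sum_sq = 0.0
for _ in range(trials):
    s, peak = 0, 0
    for _ in range(n):
        s += 1 if random.random() < 0.5 else -1
        peak = max(peak, abs(s))
    if peak >= c:
        hits += 1
    sum_sq += s * s

lhs = hits / trials               # empirical P[max |X_k| >= c]
rhs = (sum_sq / trials) / c ** 2  # empirical E[X_n^2] / c^2
print(f"P[max|X_k| >= {c}] ~ {lhs:.4f} <= bound {rhs:.4f}")
```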

Solution
$$E[(X-E[X|\mathcal{G}_1])^2]=E[((X-E[X|\mathcal{G}_2])+(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1]))^2]$$ $$=E[(X-E[X|\mathcal{G}_2])^2]+E[(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])^2] + 2E[(X-E[X|\mathcal{G}_2])(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])]$$

We will show that the third term vanishes. Then since the second term is nonnegative, the result follows.

$$E[(X-E[X|\mathcal{G}_2])(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])]=E[E[(X-E[X|\mathcal{G}_2])(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])|\mathcal{G}_2]]$$ by the tower property of conditional expectation.

$$E[(X-E[X|\mathcal{G}_2])(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])|\mathcal{G}_2]=(E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1])E[(X-E[X|\mathcal{G}_2])|\mathcal{G}_2]$$, since $$E[X|\mathcal{G}_2]-E[X|\mathcal{G}_1]$$ is $$\mathcal{G}_2$$-measurable (here we use that $$\mathcal{G}_1 \subseteq \mathcal{G}_2$$, so $$E[X|\mathcal{G}_1]$$ is $$\mathcal{G}_2$$-measurable).

Finally, $$E[(X-E[X|\mathcal{G}_2])|\mathcal{G}_2]=E[X|\mathcal{G}_2]-E[E[X|\mathcal{G}_2]|\mathcal{G}_2]=E[X|\mathcal{G}_2]-E[X|\mathcal{G}_2]=0$$.
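The projection interpretation can be seen concretely on a finite toy example (illustration only; the sample space, random variable, and partitions below are invented for the demonstration). On a finite space, $$E[X|\mathcal{G}]$$ for a $$\sigma$$-field generated by a partition is just the block-wise average of $$X$$, and the mean squared error should shrink as the partition refines.

```python
# Toy example: Omega = {0,...,7}, uniform probability, X(w) = w.
# G1 is generated by the partition {0..3},{4..7}; G2 refines it with
# blocks of size 2, so G1 is a sub-sigma-field of G2.
omega = list(range(8))
X = {w: float(w) for w in omega}

def cond_exp(block_size):
    """E[X | G] for the sigma-field of blocks of the given size:
    constant on each block, equal to the block average of X."""
    out = {}
    for w in omega:
        block = [v for v in omega if v // block_size == w // block_size]
        out[w] = sum(X[v] for v in block) / len(block)
    return out

def mse(ce):
    """E[(X - E[X|G])^2] under the uniform measure."""
    return sum((X[w] - ce[w]) ** 2 for w in omega) / len(omega)

mse_g1 = mse(cond_exp(4))  # coarser sigma-field G1
mse_g2 = mse(cond_exp(2))  # finer sigma-field G2
print(f"MSE given G1: {mse_g1}, MSE given G2: {mse_g2}")
```

As the inequality predicts, conditioning on the finer $$\sigma$$-field gives the smaller mean squared error (here $$1.25$$ versus $$0.25$$).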

Solution
We show $$P[X_n=1 \text{ only finitely often}]=0$$. If $$X_n=1$$ for only finitely many $$n$$, then there is an index $$T$$ with $$X_n=0$$ for all $$n\geq T$$; that is, the event is contained in $$\bigcup_{T\geq 1}[X_n=0 \text{ for all }n\geq T]$$. It therefore suffices to show that for every $$T$$, $$P[X_n=0 \text{ for all }n\geq T]=0$$.

First notice that $$P[X_1=0] \leq 1-\alpha$$, and for $$T>1$$, $$P[X_T=0]=E[P[X_T=0\mid X_1,X_2,\ldots,X_{T-1}]] \leq 1-\alpha$$.

Now let $$A_n^{(T)}$$ be the event $$[X_T=X_{T+1}=\cdots=X_{T+n-1}=0]$$. These events decrease in $$n$$, so by continuity from above, $$P[X_n=0 \text{ for all }n\geq T]=\lim_{n\rightarrow\infty} P[A_n^{(T)}]$$.

Notice $$P[A_n^{(T)}]=P[X_{T+n-1}=0\mid A_{n-1}^{(T)}]\,P[A_{n-1}^{(T)}] \leq (1-\alpha)P[A_{n-1}^{(T)}]$$ for $$n=2,3,\ldots$$, and $$P[A_{1}^{(T)}]=P[X_T=0] \leq 1-\alpha$$. By induction, $$P[A_n^{(T)}] \leq (1-\alpha)^n$$, so $$\lim_{n\rightarrow \infty} P[A_n^{(T)}] =0$$. Hence $$P[X_n=0 \text{ for all }n\geq T]=0$$, and we reach the desired conclusion.
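As a quick sanity check of the geometric bound (illustration only), take the simplest process satisfying the hypothesis: $$X_n$$ i.i.d. Bernoulli($$\alpha$$), for which $$P[X_n=1\mid \text{past}]=\alpha$$ and $$P[X_1=\cdots=X_n=0]=(1-\alpha)^n$$ exactly. The values of $$\alpha$$, $$n$$, and the trial count are arbitrary choices.

```python
# Illustration only: for i.i.d. Bernoulli(alpha), the probability of an
# all-zero run of length n should match the bound (1 - alpha)^n.
import random

random.seed(2)
alpha, n, trials = 0.3, 10, 50_000
all_zero = sum(
    all(random.random() >= alpha for _ in range(n)) for _ in range(trials)
)
empirical = all_zero / trials
bound = (1 - alpha) ** n
print(f"empirical {empirical:.4f}, bound (1-alpha)^n = {bound:.4f}")
```

Since the bound decays geometrically in $$n$$, arbitrarily long all-zero runs starting at any fixed $$T$$ have probability zero, which is the heart of the argument above.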