UMD Probability Qualifying Exams/Jan2011Probability

Solution
(i): Define the player's game as the infinite sequence $$\omega=(\omega_1,\omega_2,\dots)$$ where each $$\omega_k$$ equals either 1 (corresponding to a win) or 0 (corresponding to a loss).

Define the random variable $$\tau_k:\Omega\to\mathbb{N}$$ by

$$\tau_k(\omega)=\#\{\, 2\leq j\leq k : \omega_j=\omega_{j-1}=1 \,\};$$ that is, $$\tau_k$$ counts how many times the player received two consecutive wins in his first $$k$$ games. Thus, the player will win $$\tau_k$$ dollars in the first $$k$$ games. Clearly, $$\tau_k$$ is measurable. Moreover, since the games are independent and the $$j$$th game is won with probability $$1/\sqrt{j}$$, we can compute the expectation:

$$E[\tau_k(\omega)]=\sum_{j=2}^k \frac{1}{\sqrt{j}} \frac{1}{\sqrt{j-1}}.$$

Now observe what happens as we send $$k\to\infty$$. Since the $$\tau_k$$ are nonnegative and nondecreasing in $$k$$, monotone convergence gives

$$ E[\lim_{k\to\infty}\tau_k(\omega)]=\lim_{k\to\infty}\sum_{j=2}^k \frac{1}{\sqrt{j}} \frac{1}{\sqrt{j-1}}=\infty, $$

since each term dominates $$1/j$$ and the harmonic series diverges.

Hence the expected winnings of the infinite game are infinite. In fact more is true: the events $$\{\omega_{2j-1}=\omega_{2j}=1\}$$, $$j\geq 1$$, are independent and their probabilities also sum to $$\infty$$, so by the second Borel–Cantelli lemma the player receives infinitely many double wins almost surely. This implies that the player will surpass any fixed amount $$A$$ in winnings almost surely.
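As a purely numerical illustration (not part of the proof; the helper name below is invented), the partial sums $$E[\tau_k]$$ can be computed directly and seen to grow without bound, at roughly the logarithmic rate of the harmonic series:

```python
import math

def expected_winnings(k):
    """Partial sum E[tau_k] = sum_{j=2}^{k} 1/sqrt(j(j-1))."""
    return sum(1.0 / math.sqrt(j * (j - 1)) for j in range(2, k + 1))

# Each term 1/sqrt(j(j-1)) exceeds 1/j, so these partial sums dominate
# the harmonic series and diverge, albeit logarithmically slowly.
for k in (100, 10000, 100000):
    print(k, expected_winnings(k))
```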

(ii): Define everything as before except this time $$\tau_k(\omega)=\#\{\, 3\leq j\leq k : \omega_j=\omega_{j-1}=\omega_{j-2}=1 \,\}.$$

Then $$E[\tau_k(\omega)]=\sum_{j=3}^k \frac{1}{\sqrt{j}} \frac{1}{\sqrt{j-1}}\frac{1}{\sqrt{j-2}},$$ and since the terms are comparable to $$j^{-3/2}$$, this gives $$E\left[\lim_{k\to\infty}\tau_k(\omega)\right]<\infty.$$ In particular $$\lim_{k\to\infty}\tau_k<\infty$$ almost surely, so the total winnings are finite almost surely and we cannot assert that the probability of surpassing any given winnings will equal 1.
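By contrast, the same kind of numerical check (illustrative only) shows the triple-win partial sums stabilizing at a finite value:

```python
import math

def expected_triple_wins(k):
    """Partial sum E[tau_k] = sum_{j=3}^{k} 1/sqrt(j(j-1)(j-2))."""
    return sum(1.0 / math.sqrt(j * (j - 1) * (j - 2)) for j in range(3, k + 1))

# The terms behave like j**(-3/2), a convergent p-series, so going from
# k = 1000 to k = 100000 barely moves the partial sum.
print(expected_triple_wins(1000))
print(expected_triple_wins(100000))
```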

Solution
This is just a direct application of Bayes' theorem. Let $$N$$ denote the event that you pulled a normal coin. Let $$T$$ denote the event that you see a tail.

By Bayes,

$$P(N|T)=\frac{P(N\cap T)}{P(T)}=\frac{5/20}{13/20}=5/13.$$

Here $$P(N\cap T)$$, the probability of pulling a normal coin and seeing a tail, is 5/20, since five of the 20 faces are tails on normal coins. The probability of seeing a tail is 13 out of 20 (5 tails on normal coins plus $$2\times 4=8$$ tails on the double-tailed coins).
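The arithmetic can be double-checked with exact rational arithmetic, using only the face counts quoted above (the variable names are mine):

```python
from fractions import Fraction

tails_on_normal_coins = 5   # tail faces belonging to normal coins
tails_total = 13            # all tail faces: 5 normal + 2*4 double-tailed
total_faces = 20

p_N_and_T = Fraction(tails_on_normal_coins, total_faces)
p_T = Fraction(tails_total, total_faces)
p_N_given_T = p_N_and_T / p_T

print(p_N_given_T)  # 5/13
```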

Solution
(i) Let $$P$$ be the Markov transition matrix, and write $$E[\mu]$$ for the mean of a probability distribution $$\mu$$ on the state space. I claim that for any initial probability distribution $$\mu$$ we have $$E[\mu]\leq E[\mu P]$$.

Proof of claim: It is sufficient to consider the case where the initial distribution is a point mass at a single state, i.e. $$\mu=\chi_n$$, for which $$E[\chi_n]=n$$. Then $$E[\chi_n P]=2$$ if $$n=1$$, and for $$n\geq 2$$ we have $$E[\chi_n P]=\frac{1}{2}(n-1)+\frac{1}{2}n^2\geq n$$, since this last inequality is equivalent to $$n^2-n-1\geq 0$$, which holds for all $$n\geq 2$$.
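The claim's inequality can be checked mechanically for a range of states, taking the one-step means $$E[\chi_1 P]=2$$ and $$E[\chi_n P]=\frac{1}{2}(n-1)+\frac{1}{2}n^2$$ exactly as computed above (the function name is invented):

```python
def one_step_mean(n):
    """Conditional mean E[X_{t+1} | X_t = n], as computed in the claim."""
    if n == 1:
        return 2.0
    return 0.5 * (n - 1) + 0.5 * n ** 2

# The chain drifts upward in expectation from every state: this is the
# content of the claim E[mu] <= E[mu P] for point masses mu = chi_n.
for n in range(1, 51):
    assert one_step_mean(n) >= n
print("claim verified for n = 1..50")
```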

Now let $$f(n)=1/n$$. We want to compute $$E[f(X_t)-f(X_s)|\mathcal{F}_s]$$ for $$t>s$$.

$$E[f(X_t)-f(X_s)\mid\mathcal{F}_s]=E[f(X_t)\mid\mathcal{F}_s]-f(X_s) \leq 0, $$ where the inequality comes from our claim above: conditional on $$\mathcal{F}_s$$, the chain started from $$X_s$$ does not decrease in expectation, while $$f$$ is decreasing. Since a supermartingale is characterized by $$E[f(X_t)\mid\mathcal{F}_s]\leq f(X_s)$$, this shows that $$f(X_t)$$ is a supermartingale.

Solution
(i) Notice that

$$\left|\sum_{n=1}^\infty e^{-n} \xi_n\right| \leq \sum_{n=1}^\infty e^{-n}=\frac{1}{e-1}.$$ So the partial sums are bounded. Moreover, the sequence of partial sums is Cauchy. Indeed, for any $$\epsilon>0$$ we can select $$N$$ sufficiently large so that for every $$m>n>N$$, $$\left|\sum_{k=n}^m e^{-k}\xi_k\right|\leq\sum_{k=n}^m e^{-k} <\epsilon.$$ Hence, the series $$\sum_{n=1}^\infty e^{-n} \xi_n$$ converges almost surely.
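Numerically (an illustration assuming, as above, $$|\xi_n|\leq 1$$), the geometric series and the tail bound used in the Cauchy argument look like this:

```python
import math

# The full geometric series sum_{n>=1} e^{-n} equals 1/(e-1), which
# bounds |sum e^{-n} xi_n| whenever |xi_n| <= 1.
total = sum(math.exp(-n) for n in range(1, 200))
print(total, 1.0 / (math.e - 1.0))

# Tail used in the Cauchy argument: sum_{k>=n} e^{-k} is already tiny
# for moderate n, so partial sums beyond n differ by less than epsilon.
tail = sum(math.exp(-k) for k in range(20, 200))
print(tail)
```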

(ii) To show that $$\xi$$ is supported on a set of Lebesgue measure zero, first recall some facts about the Cantor set.

The Cantor set $$C$$ is the set of all $$x\in[0,1]$$ with ternary expansion $$x=.a_1a_2\cdots,\, a_j=0 \text{ or }2$$ (in base 3). This corresponds to the usual Cantor set, which can be thought of as the perfect symmetric set with contraction ratio 1/3.

Instead, consider the set $$E$$ consisting of all $$x\in[0,1]$$ with expansion $$x=.a_1a_2\cdots,\, a_j=0 \text{ or }2$$ in base $$e$$. There is an obvious bijection between the elements of $$E$$ and the possible values of $$\xi$$. Fixing the first $$n$$ digits confines $$x$$ to one of $$2^n$$ intervals of length comparable to $$e^{-n}$$, so the Lebesgue measure of $$E$$ is bounded by a constant times $$\left(\frac{2}{e}\right)^n$$ for every $$n$$, and $$\lim_{n\to\infty}\left(\frac{2}{e}\right)^n=0$$. Hence $$\xi$$ has support on a set of Lebesgue measure zero.
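The geometric decay of the covering lengths is immediate to confirm (a trivial numerical sketch):

```python
import math

# At stage n there are 2**n admissible digit strings, each confining x
# to an interval of length on the order of e**(-n), so the covering of
# E has total length on the order of (2/e)**n.
covering_lengths = [(2.0 / math.e) ** n for n in range(1, 51)]
print(covering_lengths[0], covering_lengths[-1])
```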

Solution
This is a direct application of the Central Limit Theorem under the Lindeberg condition.

We know that each random variable $$\xi_i$$ has mean $$i^2/2$$ and variance $$i^4/12$$.

Then $$a_n=\sum_{i=1}^n i^2/2=\frac{n(n+1)(2n+1)}{12}$$ and $$b_n^2=\sum_{i=1}^n i^4/12$$, so $$\left(\sum_{i=1}^n \xi_i-a_n\right)/b_n$$ converges in distribution to the standard normal provided the Lindeberg condition holds.

Hence we want to check $$\lim_{n\to\infty} \frac{1}{b_n^2} \sum_{i=1}^n \int_{\{x:|x-m_i|\geq \epsilon b_n\}} (x-m_i)^2 \, dF_i(x)=0,$$ where $$m_i=i^2/2$$ and $$F_i$$ is the distribution function of $$\xi_i$$.

Since $$|\xi_i-m_i|\leq i^2/2\leq n^2/2$$ for all $$i\leq n$$, while $$b_n\sim n^{5/2}/\sqrt{60}$$ grows faster than $$n^2$$, for sufficiently large $$n$$ the domain of each integral is empty. Hence the above expression is eventually 0 and goes to 0 as $$n\to\infty$$. Thus the Lindeberg condition is satisfied and the CLT holds.
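Both computational ingredients can be sanity-checked (an illustration; it assumes, consistently with the stated mean and variance, that $$\xi_i$$ takes values in $$[0,i^2]$$): the closed form for $$a_n$$ is exact, and the ratio $$(n^2/2)/b_n$$ controlling the Lindeberg domains tends to 0:

```python
import math
from fractions import Fraction

def a_closed(n):
    """Closed form a_n = n(n+1)(2n+1)/12."""
    return Fraction(n * (n + 1) * (2 * n + 1), 12)

def a_direct(n):
    """Direct sum a_n = sum_{i=1}^{n} i^2/2, in exact arithmetic."""
    return sum(Fraction(i * i, 2) for i in range(1, n + 1))

assert all(a_closed(n) == a_direct(n) for n in range(1, 100))

def b(n):
    """b_n = sqrt(sum_{i=1}^{n} i^4/12)."""
    return math.sqrt(sum(i ** 4 for i in range(1, n + 1)) / 12.0)

# Max possible deviation |xi_i - m_i| <= n^2/2 shrinks relative to b_n,
# since b_n grows like n^{5/2}; hence the Lindeberg domains empty out.
ratios = [(n * n / 2.0) / b(n) for n in (10, 100, 1000, 10000)]
print(ratios)
```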

Solution
(i) Let $$A=\left\{\omega\in\Omega\mid\lim_{t\to 0} X_t(\omega)=X(\omega)\right\}$$. By assumption $$P(A)=1$$. Now we compute the $$L^2$$ norm:

$$\lim_{t\to 0} E|X_t-X|^2=\lim_{t\to 0} \int_A |X_t-X|^2\,d\omega +\lim_{t\to 0} \int_{\Omega\setminus A} |X_t-X|^2\,d\omega.$$

Let us evaluate the first integral on the right-hand side. We can write $$\lim_{t\to 0} \int_A |X_t-X|^2\,d\omega= \lim_{t\to 0} \int_A |X_t|^2\,d\omega+ \int_A |X|^2\,d\omega-2\lim_{t\to 0} \int_A X X_t\,d\omega$$

$$\quad \leq \lim_{t\to 0} E|X_t|^2 +E|X|^2 -2\int_A \lim_{t\to 0} X X_t\,d\omega$$  by Fatou's lemma

$$=E|X|^2+E|X|^2-2E|X|^2=0 $$ (since $$E|X_t|^2=E|X|^2\, \forall t$$, and $$X X_t\to |X|^2$$ pointwise on $$A$$).

Now the second term: since $$P(\Omega\setminus A)=0$$ and $$X, X_t$$ all have finite second moments, each integral $$\int_{\Omega\setminus A}|X_t-X|^2\,d\omega$$ is an integral of an integrable function over a null set, hence equals 0. Therefore $$\lim_{t\to 0} \int_{\Omega\setminus A} |X_t-X|^2\,d\omega = 0.$$

Thus we have just shown that under the above assumptions, almost sure convergence implies convergence in mean square.
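As a toy illustration of the statement (an invented example, not from the problem): take $$X, Z$$ independent standard normals and $$X_t=\sqrt{1-t}\,X+\sqrt{t}\,Z$$. Then $$E|X_t|^2=1=E|X|^2$$ for every $$t$$, $$X_t\to X$$ pointwise as $$t\to 0$$, and the mean-square error $$E|X_t-X|^2=(1-\sqrt{1-t})^2+t$$ indeed tends to 0:

```python
import math
import random

random.seed(0)

def l2_error(t, n=200000):
    """Monte Carlo estimate of E|X_t - X|^2 for X_t = sqrt(1-t) X + sqrt(t) Z."""
    a = math.sqrt(1.0 - t) - 1.0
    b = math.sqrt(t)
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        z = random.gauss(0.0, 1.0)
        total += (a * x + b * z) ** 2
    return total / n

# The estimates track the exact value (1 - sqrt(1-t))^2 + t, which
# vanishes as t -> 0: mean-square convergence, as the proof asserts.
for t in (0.1, 0.01, 0.001):
    exact = (1.0 - math.sqrt(1.0 - t)) ** 2 + t
    print(t, l2_error(t), exact)
```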

(ii)