UMD Probability Qualifying Exams/Aug2009Probability

Solution
(a) $$ \begin{align} P[X_1>a]=&\int_{X_1>a} 1\, dF =\int_{X_1>a} \frac{\exp(ta)}{\exp(ta)}\, dF \\ =& e^{-at}\int_{X_1>a}e^{at}\,dF \leq e^{-at}\int_{X_1>a}e^{X_1t}\,dF\\ \leq& e^{-at}\int_{\Omega}e^{X_1t}\,dF=e^{-at}E[\exp(tX_1)] \end{align}$$

For the inequality in the second line we need $$t\ge 0$$, so that $$X_1>a$$ implies $$e^{tX_1}\ge e^{ta}$$. The bound therefore holds for every $$t\ge 0$$, and optimizing over such $$t$$ (taking the infimum of the right-hand side, equivalently the supremum of $$ta-\log E[e^{tX_1}]$$ in the exponent) gives the desired result.
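As a quick illustration (not part of the exam statement), take $$X_1\sim N(0,\sigma^2)$$, so that $$E[e^{tX_1}]=e^{\sigma^2 t^2/2}$$; optimizing over $$t>0$$ then gives a Gaussian tail bound:

$$P[X_1>a]\;\le\;\inf_{t>0} e^{-at}e^{\sigma^2 t^2/2}\;=\;\exp\!\Big(-\frac{a^2}{2\sigma^2}\Big),$$

with the infimum attained at $$t=a/\sigma^2$$ (for $$a>0$$).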

(b) $$ \begin{align} P[\tilde{X}_n>a]=&\int_{\tilde{X}_n>a} 1\, dF =\int_{\tilde{X}_n>a} \frac{\exp(nta)}{\exp(nta)}\, dF \\ =& e^{-nat}\int_{\sum_{i=1}^nX_i>an}e^{nat}\,dF \leq e^{-nat}\int_{\sum_{i=1}^nX_i>an}e^{t\sum_{i=1}^nX_i}\,dF\\ \leq & e^{-nat}\int_{\Omega}e^{t\sum_{i=1}^nX_i}\,dF=e^{-nat}\left(\int_{\Omega}e^{tX_1}\,dF\right)^n=\left(e^{-at}E[e^{tX_1}]\right)^n, \end{align}$$ where the first inequality again requires $$t\ge 0$$ and the last equality follows from the fact that the $$X_i$$ are independent and identically distributed.
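Since the bound holds for every $$t\ge 0$$, it can be optimized over $$t$$ exactly as in part (a); this is just a restatement of the result in the usual Chernoff/Cramér form:

$$P[\tilde{X}_n>a]\;\le\;\Big(\inf_{t\ge 0}\, e^{-at}E[e^{tX_1}]\Big)^{n}\;=\;\exp\!\Big(-n\,\sup_{t\ge 0}\big(at-\log E[e^{tX_1}]\big)\Big),$$

so the tail of the sample mean decays exponentially fast in $$n$$.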

(c)

Show $$Z$$ is exponentially distributed
Let $$\tau$$ be the first time that a Poisson process $$N(t)$$ with rate $$\lambda$$ jumps; the claim for $$Z$$ follows by applying the computation below to $$N=N_1+N_2$$, which is shown in the next part to be a Poisson process with rate $$\lambda_1+\lambda_2$$.

$$\begin{align} p_\tau(x)=\lim_{\epsilon\to 0}\frac{F_\tau(x)-F_\tau(x-\epsilon)}{\epsilon}&=\lim_{\epsilon\to 0} \frac{P(\{N(x)>0\} \cap \{N(x-\epsilon)=0\})}{\epsilon}\\ &=\lim_{\epsilon\to 0} \frac{1}{\epsilon}\, P(N(x-\epsilon)=0) \cdot P(N(x)-N(x-\epsilon)>0)\\ &=\lim_{\epsilon\to 0} \frac{1}{\epsilon}\, e^{-\lambda (x-\epsilon)}\left(1-e^{-\lambda \epsilon}\right)=\lambda e^{-\lambda x},\end{align}$$ where the second equality uses the independence of the increments $$N(x-\epsilon)$$ and $$N(x)-N(x-\epsilon)$$, and the last limit uses $$1-e^{-\lambda\epsilon}=\lambda\epsilon+o(\epsilon)$$. Thus $$\tau$$ has the $$\mathrm{Exp}(\lambda)$$ density.
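Alternatively (a shorter route to the same conclusion), one can compute the survival function directly: the first jump occurs after time $$x$$ exactly when there are no jumps in $$[0,x]$$,

$$P(\tau>x)=P(N(x)=0)=e^{-\lambda x}\quad (x\ge 0),\qquad\text{so}\qquad F_\tau(x)=1-e^{-\lambda x},\quad p_\tau(x)=\lambda e^{-\lambda x}.$$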

$$N_1(t)+N_2(t)$$ is a Poisson Process with parameter $$\lambda_1+\lambda_2$$
Proof: There are three conditions to check:

(i) $$N_1(0)+N_2(0)=0$$ almost surely, since $$N_1(0)=0$$ and $$N_2(0)=0$$ almost surely.

(ii) For $$t>s$$, is $$N_1(t)+N_2(t)-N_1(s)-N_2(s)$$ independent of $$N_1(s)+N_2(s)$$? This holds because each increment $$N_i(t)-N_i(s)$$ is independent of the corresponding past $$N_i(s)$$ (both are Poisson processes), and $$N_1$$ and $$N_2$$ are independent of each other, so the pair of increments is independent of the pair $$(N_1(s),N_2(s))$$.

(iii) For $$t>s$$, is $$N_1(t)+N_2(t)-N_1(s)-N_2(s)$$ distributed Poisson with parameter $$(\lambda_1+\lambda_2)(t-s)$$? This is true because the increments $$N_1(t)-N_1(s)\sim \mathrm{Poisson}(\lambda_1(t-s))$$ and $$N_2(t)-N_2(s)\sim \mathrm{Poisson}(\lambda_2(t-s))$$ are independent, and the sum of independent Poisson random variables is again Poisson with the sum of the parameters (a short verification follows this list).
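For completeness, here is the computation behind that last fact: if $$U\sim\mathrm{Poisson}(\mu)$$ and $$V\sim\mathrm{Poisson}(\nu)$$ are independent, then for every integer $$k\ge 0$$,

$$P(U+V=k)=\sum_{j=0}^{k}\frac{\mu^j e^{-\mu}}{j!}\cdot\frac{\nu^{k-j}e^{-\nu}}{(k-j)!}=\frac{e^{-(\mu+\nu)}}{k!}\sum_{j=0}^{k}\binom{k}{j}\mu^j\nu^{k-j}=\frac{(\mu+\nu)^k\,e^{-(\mu+\nu)}}{k!},$$

i.e. $$U+V\sim\mathrm{Poisson}(\mu+\nu)$$; apply this with $$\mu=\lambda_1(t-s)$$ and $$\nu=\lambda_2(t-s)$$.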

Joint distribution of (J,Z)
For $$x>0$$:

$$P(J=1,\,Z\in dx)=\lambda_1 e^{-(\lambda_1+\lambda_2)x}\,dx$$

$$P(J=2,\,Z\in dx)=\lambda_2 e^{-(\lambda_1+\lambda_2)x}\,dx$$
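A sketch of where these densities come from, assuming (as is standard for this problem) that $$Z$$ is the time of the first jump of $$N_1+N_2$$ and $$J\in\{1,2\}$$ records which process produced that jump: if $$\tau_1\sim\mathrm{Exp}(\lambda_1)$$ and $$\tau_2\sim\mathrm{Exp}(\lambda_2)$$ are the independent first jump times, then

$$P(J=1,\,Z>x)=P(x<\tau_1<\tau_2)=\int_x^\infty \lambda_1 e^{-\lambda_1 u}\,e^{-\lambda_2 u}\,du=\frac{\lambda_1}{\lambda_1+\lambda_2}\,e^{-(\lambda_1+\lambda_2)x},$$

and differentiating in $$x$$ gives the stated density; the case $$J=2$$ is symmetric. In particular $$J$$ and $$Z$$ are independent, with $$P(J=j)=\lambda_j/(\lambda_1+\lambda_2)$$ and $$Z\sim\mathrm{Exp}(\lambda_1+\lambda_2)$$.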

Solution
Define $$Z_k:=X_k/k^\gamma$$. Then $$E[Z_k]=0$$ and $$V[Z_k]=\frac{\sigma^2}{k^{2\gamma}}$$. We check the three conditions of Kolmogorov's three-series theorem to conclude that $$\sum_{k=1}^\infty Z_k$$ converges almost surely.
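A sketch of that check with truncation level $$A=1$$, assuming (as the problem presumably stipulates) that $$\gamma>1/2$$, so that $$\sum_k k^{-2\gamma}<\infty$$; the first bound is Chebyshev's inequality, and the second uses $$E[Z_k]=0$$ together with $$|Z_k|\le Z_k^2$$ on $$\{|Z_k|>1\}$$:

$$\begin{align} \text{(i)}\quad & \sum_k P(|Z_k|>1)\le \sum_k E[Z_k^2]=\sum_k \frac{\sigma^2}{k^{2\gamma}}<\infty,\\ \text{(ii)}\quad & \big|E[Z_k\mathbf{1}_{\{|Z_k|\le 1\}}]\big|=\big|E[Z_k\mathbf{1}_{\{|Z_k|>1\}}]\big|\le E[Z_k^2]=\frac{\sigma^2}{k^{2\gamma}},\quad\text{so }\sum_k E[Z_k\mathbf{1}_{\{|Z_k|\le 1\}}]\text{ converges absolutely},\\ \text{(iii)}\quad & \sum_k V[Z_k\mathbf{1}_{\{|Z_k|\le 1\}}]\le \sum_k E[Z_k^2]=\sum_k\frac{\sigma^2}{k^{2\gamma}}<\infty. \end{align}$$

All three series converge, so the three-series theorem gives almost sure convergence of $$\sum_{k=1}^\infty Z_k$$.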