Probability/Joint Distributions and Independence

Motivation
Suppose we are given the pmf of a discrete random variable $$X$$ and the pmf of a discrete random variable $$Y$$. For example, $$ f_X(x)=(\mathbf 1\{x=0\}+\mathbf 1\{x=1\})/2\quad\text{and}\quad f_Y(y)=(\mathbf 1\{y=0\}+\mathbf 1\{y=2\})/2 $$ We cannot tell the relationship between $$X$$ and $$Y$$ from such information alone: they may or may not be related.

For example, the random variable $$X$$ may be defined as $$X=1$$ if head comes up and $$X=0$$ otherwise from tossing a fair coin, and the random variable $$Y$$ may be defined as $$Y=2$$ if head comes up and $$Y=0$$ otherwise from tossing the coin another time. In this case, $$X$$ and $$Y$$ are unrelated.

Another possibility is that the random variable $$Y$$ is defined as $$Y=2X$$ if head comes up in the first coin tossing, and $$Y=0$$ otherwise. In this case, $$X$$ and $$Y$$ are related.

Yet, in the above two examples, the pmfs of $$X$$ and of $$Y$$ are exactly the same.
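To make this concrete, here is a minimal sketch (assuming the fair coin from the examples above) that enumerates both scenarios: the two joint pmfs differ, yet they produce identical marginal pmfs for $$X$$.

```python
from itertools import product

# Scenario 1: X from a first fair coin toss, Y from an independent second
# toss; each of the four equally likely outcome pairs has probability 1/4.
joint_indep = {(x, y): 0.25 for x, y in product([0, 1], [0, 2])}

# Scenario 2: Y = 2X, so (X, Y) is (0, 0) or (1, 2), each with probability 1/2.
joint_dep = {(0, 0): 0.5, (1, 2): 0.5}

def marginal_x(joint):
    """Sum the joint pmf over y to obtain the marginal pmf of X."""
    pmf = {}
    for (x, _y), p in joint.items():
        pmf[x] = pmf.get(x, 0) + p
    return pmf

print(marginal_x(joint_indep))   # {0: 0.5, 1: 0.5}
print(marginal_x(joint_dep))     # {0: 0.5, 1: 0.5}
print(joint_indep == joint_dep)  # False: the joint pmfs differ
```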

Therefore, to capture the relationship between $$X$$ and $$Y$$, we define the joint cumulative distribution function, or joint cdf.

Joint distributions
Sometimes, we may want to know the random behaviour of one of the random variables involved in a joint cdf. We can do this by computing the marginal cdf from the joint cdf. The definition of the marginal cdf is as follows: given a joint cdf $$F_{X,Y}$$, the marginal cdf of $$X$$ is $$F_X(x)=\lim_{y\to\infty}F_{X,Y}(x,y),$$ and the marginal cdf of $$Y$$ is defined analogously.

Similar to the one-variable case, we have joint pmf and joint pdf. Also, analogously, we have marginal pmf and marginal pdf.
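As a sketch of marginalization in the continuous case, the code below assumes the textbook density $$f(x,y)=x+y$$ on the unit square (a hypothetical example chosen for illustration, not one from this section) and recovers the marginal pdf of $$X$$ by numerically integrating out $$y$$.

```python
# Hypothetical joint pdf f(x, y) = x + y on the unit square [0, 1]^2.
def joint_pdf(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_pdf_x(x, n=10_000):
    """Approximate f_X(x) = integral of f(x, y) dy with a midpoint sum."""
    h = 1.0 / n
    return sum(joint_pdf(x, (k + 0.5) * h) * h for k in range(n))

# Analytically, f_X(x) = x + 1/2 for 0 <= x <= 1.
print(round(marginal_pdf_x(0.3), 4))  # 0.8
```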

{{colored exercise| Suppose there are two red balls and one blue ball in a box, and we draw two balls one by one from the box with replacement. Let $$X=1$$ if the ball from the first draw is red, and $$X=0$$ otherwise. Let $$Y=1$$ if the ball from the second draw is red, and $$Y=0$$ otherwise.

(a) Calculate the marginal pmf of $$X$$.

(b) Calculate the joint pmf of $$(X,Y)$$.

Solution: (a) Since two of the three balls are red, $$\mathbb P(X=1)=2/3$$ and $$\mathbb P(X=0)=1/3$$, so $$f_X(x)=(\mathbf 1\{x=0\}+2\cdot\mathbf 1\{x=1\})/3.$$

(b) Because the draws are made with replacement, the two draws are independent, and each joint probability is the product of the marginal probabilities: $$f(x,y)=(1/9)(\mathbf 1\{(x,y)=(0,0)\}+2\cdot\mathbf 1\{(x,y)=(0,1)\}+2\cdot\mathbf 1\{(x,y)=(1,0)\}+4\cdot\mathbf 1\{(x,y)=(1,1)\}).$$ }}
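The answers in this exercise can be checked by brute-force enumeration: label the balls, list all nine equally likely ordered pairs of draws (with replacement), and tally the joint pmf.

```python
from itertools import product

# Two red balls (R1, R2) and one blue ball (B); draws are with
# replacement, so all 9 ordered pairs are equally likely.
balls = ["R1", "R2", "B"]
joint = {}
for first, second in product(balls, repeat=2):
    x = 1 if first.startswith("R") else 0
    y = 1 if second.startswith("R") else 0
    joint[(x, y)] = joint.get((x, y), 0) + 1 / 9

marg_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
print(marg_x)  # X = 1 with probability 2/3, X = 0 with probability 1/3
print(joint)   # (0,0): 1/9, (0,1): 2/9, (1,0): 2/9, (1,1): 4/9
```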

For continuous random variables, the definition is a generalized version of the one for the univariate continuous case.

Independence
Recall that multiple events are independent if, by definition, the probability of their intersection equals the product of the probabilities of each event. Since $$\{X\in A\}$$ is also an event, we have a natural definition of independence for random variables as follows: $$X$$ and $$Y$$ are independent if $$\mathbb P(X\in A,\,Y\in B)=\mathbb P(X\in A)\,\mathbb P(Y\in B)$$ for every pair of sets $$A$$ and $$B$$.
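For discrete random variables, independence is equivalent to the joint pmf factoring as $$f(x,y)=f_X(x)f_Y(y)$$ at every point. The sketch below checks this factorization numerically for the two coin-toss scenarios from the motivation.

```python
# Check whether a discrete joint pmf factors into its marginals,
# i.e. whether f(x, y) = f_X(x) * f_Y(y) everywhere.
def is_independent(joint, tol=1e-12):
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    fx = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}
    fy = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0) - fx[x] * fy[y]) <= tol
               for x in xs for y in ys)

# Two independent fair-coin indicators vs. the Y = 2X coupling.
print(is_independent({(0, 0): .25, (0, 2): .25, (1, 0): .25, (1, 2): .25}))  # True
print(is_independent({(0, 0): .5, (1, 2): .5}))                              # False
```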

Sum of independent random variables (optional)
In general, we use the joint cdf, pdf or pmf to determine the distribution of a sum of independent random variables by first principles. In particular, there are some interesting results related to the distribution of sums of independent random variables.
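For independent discrete random variables on the non-negative integers, the pmf of the sum is the convolution of the individual pmfs. The sketch below uses this to illustrate one classic result: a sum of independent Poisson random variables is again Poisson, with the rates added.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def convolve(f, g, support):
    """pmf of X + Y for independent X ~ f, Y ~ g on non-negative integers."""
    return {s: sum(f(k) * g(s - k) for k in range(s + 1)) for s in support}

# Poisson(2) + Poisson(3) has the same pmf as Poisson(5).
lam1, lam2 = 2.0, 3.0
conv = convolve(lambda k: poisson_pmf(lam1, k),
                lambda k: poisson_pmf(lam2, k),
                range(15))
print(all(abs(conv[s] - poisson_pmf(lam1 + lam2, s)) < 1e-12
          for s in range(15)))  # True
```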

Poisson process
There are several important properties of the Poisson process.
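One such property is that in a Poisson process of rate $$\lambda$$, the interarrival times are i.i.d. $$\text{Exponential}(\lambda)$$, and the number of arrivals in $$[0,t]$$ is $$\text{Poisson}(\lambda t)$$ with mean $$\lambda t$$. A minimal simulation sketch of this (assuming the exponential-interarrival description):

```python
import random

def arrivals_in_interval(lam, t, rng):
    """Count arrivals in [0, t] by summing exponential interarrival times."""
    clock, count = 0.0, 0
    while True:
        clock += rng.expovariate(lam)  # Exponential(lam) interarrival time
        if clock > t:
            return count
        count += 1

# Empirical mean of the arrival count should be close to lam * t.
rng = random.Random(0)
lam, t, trials = 4.0, 1.0, 20_000
mean = sum(arrivals_in_interval(lam, t, rng) for _ in range(trials)) / trials
print(round(mean, 1))  # close to lam * t = 4.0
```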