Statistics/Distributions/Continuous

A continuous random variable is a random variable whose cumulative distribution function is continuous: there is no point at which the variable takes a particular value with positive probability. In other words, $$P(X = x) = 0$$ for every individual value x.

Cumulative Distribution Function
A continuous random variable, like a discrete random variable, has a cumulative distribution function (cdf). The cdf is non-decreasing and approaches 1; depending on the random variable, it may reach 1 at a finite value, or only in the limit. The cdf is represented by a capital F.
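As a concrete sketch (the exponential distribution and the function name here are illustrative choices, not from the text above), a cdf can be coded directly and seen to increase toward 1:

```python
import math

def exponential_cdf(x, rate=1.0):
    """Illustrative cdf of an Exponential(rate) variable: F(x) = 1 - exp(-rate*x) for x >= 0."""
    if x < 0:
        return 0.0
    return 1.0 - math.exp(-rate * x)

# F is 0 below the support, non-decreasing, and approaches 1 as x grows.
values = [exponential_cdf(x) for x in (0, 1, 2, 5, 10)]
```

This cdf never actually reaches 1 at any finite x; it only approaches 1 in the limit, illustrating the second case described above.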

Probability Density Function
Unlike a discrete random variable, a continuous random variable has a probability density function instead of a probability mass function. The difference is that the former must integrate to 1 over the support, while the latter must sum to 1. The two are very similar otherwise. The pdf is represented by a lowercase f.
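To make the "integrates to 1" condition concrete, here is a minimal numerical check (the normal density and the simple trapezoidal integrator are illustrative choices):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of a normal distribution with mean mu and standard deviation sigma."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def trapezoid(f, a, b, n=10000):
    """Simple trapezoidal-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# The standard normal density is essentially zero outside [-10, 10],
# so integrating over that interval approximates the full integral, which is 1.
area = trapezoid(normal_pdf, -10.0, 10.0)
```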

Special Values
Let R be the support of the distribution, i.e. the set of points where the density is positive.

The expected value of a continuous random variable X with probability density function f is defined as $$E[X] = \int_{R} x f(x)\,dx$$.

More generally, the expected value of a transformed variable g(X), where X has probability density function f, is defined as $$E[g(X)] = \int_{R} g(x) f(x)\,dx$$.
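The two definitions above can be checked numerically. This sketch uses the Exponential(1) distribution, whose mean is 1 and whose second moment $$E[X^2]$$ is 2 (the distribution choice and helper names are illustrative):

```python
import math

def exp_pdf(x, rate=1.0):
    """Density of an Exponential(rate) variable: rate * exp(-rate*x) for x >= 0."""
    return rate * math.exp(-rate * x) if x >= 0 else 0.0

def expectation(g, pdf, a, b, n=100000):
    """Approximate E[g(X)] = integral of g(x) * pdf(x) by the trapezoidal rule on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (g(a) * pdf(a) + g(b) * pdf(b))
    for i in range(1, n):
        x = a + i * h
        total += g(x) * pdf(x)
    return total * h

# Taking g(x) = x recovers E[X]; g(x) = x**2 gives the second moment E[X**2].
# The density is negligible beyond x = 50, so that upper limit stands in for infinity.
mean = expectation(lambda x: x, exp_pdf, 0.0, 50.0)
second_moment = expectation(lambda x: x * x, exp_pdf, 0.0, 50.0)
```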

The mean of a continuous or discrete distribution is defined as $$E[X]$$.

The variance of a continuous or discrete distribution is defined as $$\operatorname{Var}(X) = E[(X-E[X])^2]$$, which expands to the convenient computational form $$E[X^2] - (E[X])^2$$.
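Applying the definition of variance to a concrete case: for the uniform distribution on [0, 1], the mean is 1/2 and the variance is 1/12 (the helper names below are illustrative):

```python
def uniform_pdf(x):
    """Density of the uniform distribution on [0, 1]."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def expectation(g, pdf, a, b, n=100000):
    """Approximate E[g(X)] by the trapezoidal rule on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (g(a) * pdf(a) + g(b) * pdf(b))
    for i in range(1, n):
        x = a + i * h
        total += g(x) * pdf(x)
    return total * h

mean = expectation(lambda x: x, uniform_pdf, 0.0, 1.0)          # 1/2
# Variance as the expected squared deviation from the mean.
variance = expectation(lambda x: (x - mean) ** 2, uniform_pdf, 0.0, 1.0)  # 1/12
```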

Moments can also be derived by producing the moment generating function (MGF) for the distribution in question. This is done by finding the expected value $$M_X(t) = E[\exp(tX)]$$. Once the MGF has been created, its $$n$$th derivative evaluated at $$t = 0$$ gives the $$n$$th raw moment of the distribution:

$$\left.\frac{d M_X(t)}{dt}\right|_{t=0} = E[X]$$ (the mean), $$\left.\frac{d^2 M_X(t)}{dt^2}\right|_{t=0} = E[X^2]$$, $$\left.\frac{d^3 M_X(t)}{dt^3}\right|_{t=0} = E[X^3]$$, $$\left.\frac{d^4 M_X(t)}{dt^4}\right|_{t=0} = E[X^4]$$.

Quantities such as the variance, skewness, and kurtosis are then computed from these raw moments; for example, $$\operatorname{Var}(X) = E[X^2] - (E[X])^2$$.
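To illustrate, the MGF of an Exponential(rate) variable is $$\frac{\text{rate}}{\text{rate} - t}$$ for t < rate; differentiating it at t = 0 (approximated below with central finite differences, an illustrative numerical choice) recovers the first two raw moments:

```python
def mgf_exp(t, rate=1.0):
    """MGF of an Exponential(rate) variable, valid for t < rate."""
    return rate / (rate - t)

h = 1e-4  # step size for the central finite differences
# First derivative at t = 0 approximates the first raw moment E[X] = 1.
first_moment = (mgf_exp(h) - mgf_exp(-h)) / (2 * h)
# Second derivative at t = 0 approximates the second raw moment E[X**2] = 2.
second_moment = (mgf_exp(h) - 2 * mgf_exp(0.0) + mgf_exp(-h)) / h ** 2
# The variance follows from the raw moments: Var(X) = E[X**2] - E[X]**2 = 1.
variance = second_moment - first_moment ** 2
```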