Engineering Analysis/Expectation and Entropy

Expectation
The expectation (or expected value) of a random variable is defined as:


 * $$E[x] = \int_{-\infty}^\infty x f_X(x)dx$$

This operator is very useful, and we can use it to derive the moments of the random variable.
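
As a quick numerical sketch (not from the original text), the expectation integral can be approximated with a midpoint Riemann sum. The exponential pdf and the value of λ below are just illustrative choices; for the exponential distribution, E[X] = 1/λ.

```python
import math

def expectation(pdf, a, b, n=100_000):
    """Approximate the integral of x * f_X(x) over [a, b] with a midpoint Riemann sum."""
    dx = (b - a) / n
    return sum((a + (i + 0.5) * dx) * pdf(a + (i + 0.5) * dx) for i in range(n)) * dx

lam = 2.0
exp_pdf = lambda x: lam * math.exp(-lam * x)  # exponential pdf on [0, infinity)

# Truncate the infinite upper limit at 50; the remaining tail mass is negligible.
mean = expectation(exp_pdf, 0.0, 50.0)  # approximately 1/lam = 0.5
```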

Moments
A moment is a quantity that summarizes some information about the random variable. The nth moment of a random variable is defined as:


 * $$E[x^n] = \int_{-\infty}^\infty x^n f_X(x)dx$$
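
The same numerical approach generalizes to any moment. This sketch (an illustration, not part of the original text) uses the exponential distribution again, for which E[Xⁿ] = n!/λⁿ.

```python
import math

def moment(pdf, n, a, b, steps=100_000):
    """Approximate E[X^n] = integral of x^n * f_X(x) over [a, b] via a midpoint Riemann sum."""
    dx = (b - a) / steps
    return sum(((a + (i + 0.5) * dx) ** n) * pdf(a + (i + 0.5) * dx)
               for i in range(steps)) * dx

lam = 2.0
exp_pdf = lambda x: lam * math.exp(-lam * x)

m1 = moment(exp_pdf, 1, 0.0, 50.0)  # approximately 1!/lam   = 0.5
m2 = moment(exp_pdf, 2, 0.0, 50.0)  # approximately 2!/lam^2 = 0.5
```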

Mean
The mean value, or "average value," of a random variable is defined as its first moment:


 * $$E[x] = \mu_X = \int_{-\infty}^\infty x f_X(x)dx$$

We will use the Greek letter μ to denote the mean of a random variable.
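
The mean is also what the average of many independent samples approaches. This short sketch (illustrative only; the distribution parameters are arbitrary) draws samples from a normal distribution with μ = 3 and checks that the sample average lands near it.

```python
import random

random.seed(0)
# Draw 100,000 samples from a normal distribution with mean 3 and std 1.
samples = [random.gauss(3.0, 1.0) for _ in range(100_000)]
sample_mean = sum(samples) / len(samples)  # close to mu = 3
```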

Central Moments
A central moment is similar to a moment, but it is taken about the mean of the random variable:


 * $$E[(x - \mu_X)^n] = \int_{-\infty}^\infty (x - \mu_X)^n f_X(x)dx$$

The first central moment is always zero.
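
This fact is easy to confirm numerically. The sketch below (an illustration, using a standard normal pdf as an arbitrary example) evaluates the first central moment by a midpoint Riemann sum and finds it is essentially zero.

```python
import math

def central_moment(pdf, mu, n, a, b, steps=100_000):
    """Approximate E[(x - mu)^n] over [a, b] via a midpoint Riemann sum."""
    dx = (b - a) / steps
    return sum(((a + (i + 0.5) * dx) - mu) ** n * pdf(a + (i + 0.5) * dx)
               for i in range(steps)) * dx

# Standard normal pdf: mu = 0, sigma = 1.
normal_pdf = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

first = central_moment(normal_pdf, 0.0, 1, -10.0, 10.0)  # essentially 0
```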

Variance
The variance of a random variable is defined as the second central moment:


 * $$E[(x - \mu_X)^2] = \sigma^2$$

The square root of the variance, σ, is known as the standard deviation of the random variable.
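
As a numerical check (illustrative, not from the original text), the second central moment of a normal pdf with μ = 1 and σ = 2 should come out to σ² = 4, and its square root to σ = 2.

```python
import math

mu, sigma = 1.0, 2.0
# Normal pdf with the chosen mean and standard deviation.
pdf = lambda x: math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def second_central_moment(f, m, a, b, steps=100_000):
    """Approximate E[(x - m)^2] over [a, b] via a midpoint Riemann sum."""
    dx = (b - a) / steps
    return sum(((a + (i + 0.5) * dx) - m) ** 2 * f(a + (i + 0.5) * dx)
               for i in range(steps)) * dx

var = second_central_moment(pdf, mu, -20.0, 22.0)  # approximately sigma^2 = 4
std = math.sqrt(var)                               # approximately sigma   = 2
```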

Mean and Variance
The mean and variance of a random variable are related by:


 * $$\sigma^2 = E[x^2] - \mu^2$$

This is an important identity, and we will use it later.
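
The identity σ² = E[x²] − μ² can be checked on simulated data. This sketch (the distribution and seed are arbitrary choices for illustration) estimates both sides from samples and compares them to the direct second-central-moment estimate.

```python
import random

random.seed(1)
# Samples from a normal distribution with mu = 2 and sigma = 3 (so sigma^2 = 9).
xs = [random.gauss(2.0, 3.0) for _ in range(200_000)]

n = len(xs)
mu_hat = sum(xs) / n                                  # estimate of mu
ex2_hat = sum(x * x for x in xs) / n                  # estimate of E[x^2]
var_from_identity = ex2_hat - mu_hat ** 2             # sigma^2 via the identity
var_direct = sum((x - mu_hat) ** 2 for x in xs) / n   # sigma^2 directly
```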

Entropy
The entropy of a random variable $$X$$ is defined as:


 * $$H[X] = E \left[ \log \frac{1}{p(X)} \right]$$
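
For a discrete random variable this expectation becomes −Σ p log p. The sketch below (an illustration; base-2 logarithms give entropy in bits) computes it for two coins: a fair coin has exactly 1 bit of entropy, and a biased coin has less.

```python
import math

def entropy(probs, base=2):
    """H[X] = E[log(1/p(X))] = -sum of p * log(p), skipping zero-probability outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

h_fair = entropy([0.5, 0.5])    # 1 bit: the most uncertain two-outcome case
h_biased = entropy([0.9, 0.1])  # roughly 0.469 bits: less uncertainty
```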