Introduction to Numerical Methods/Numerical Differentiation

=Numerical Differentiation=

Objectives:
 * explain the definitions of the forward, backward, and center divided difference methods for numerical differentiation
 * find approximate values of the first derivative of continuous functions
 * reason about the accuracy of the computed derivative approximations
 * find approximate values of the first derivative of discrete functions (given at discrete data points)

Resources
 * numpy
 * Numerical Differentiation

Forward Divided Difference Method
$$f^{'}(x) = \frac{f(x+h)-f(x)}{h} + O(h)$$

The following code implements this method:
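A minimal Python sketch is given below; the test function $$e^x$$ and the step size h are illustrative choices, not part of the original.

```python
import math

def forward_diff(f, x, h):
    """Approximate f'(x) with the forward divided difference; the error is O(h)."""
    return (f(x + h) - f(x)) / h

# Example (illustrative): derivative of e^x at x = 0.5; the exact value is e^0.5
approx = forward_diff(math.exp, 0.5, 1e-5)
print(approx)
```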

Backward Divided Difference Method
$$f^{'}(x) = \frac{f(x)-f(x-h)}{h} + O(h)$$

The following code implements this method:
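As with the forward difference, a minimal Python sketch (the test function and step size are illustrative):

```python
import math

def backward_diff(f, x, h):
    """Approximate f'(x) with the backward divided difference; the error is O(h)."""
    return (f(x) - f(x - h)) / h

# Example (illustrative): derivative of e^x at x = 0.5
approx = backward_diff(math.exp, 0.5, 1e-5)
print(approx)
```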

Center Divided Difference Method
$$f^{'}(x) = \frac{f(x+h)-f(x-h)}{2h} + O(h^{2})$$

The following code implements this method:
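A minimal Python sketch (test function and step size again illustrative); note the division by 2h rather than h:

```python
import math

def center_diff(f, x, h):
    """Approximate f'(x) with the center divided difference; the error is O(h^2)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Example (illustrative): derivative of e^x at x = 0.5
approx = center_diff(math.exp, 0.5, 1e-5)
print(approx)
```

Because the error is O(h²) instead of O(h), halving h reduces the error by roughly a factor of four rather than two.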

Second Derivative
$$f^{''}(x) = \frac{f(x+h) - 2 f(x) + f(x-h)}{h^2} + O(h^2)$$
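This three-point central formula can be sketched in Python in the same style as the first-derivative methods (the test function $$\sin(x)$$ and step size are illustrative):

```python
import math

def second_diff(f, x, h):
    """Approximate f''(x) with the central three-point formula; the error is O(h^2)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

# Example (illustrative): second derivative of sin(x) at x = 1, which is -sin(1)
approx = second_diff(math.sin, 1.0, 1e-3)
print(approx)
```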

Taylor Series
A Taylor series expands a function into an infinite series. If the function is infinitely differentiable at a point x, we can use its Taylor series to approximate the function near x. We can derive the backward, forward, and center divided difference methods from the Taylor series, which also gives a quantitative estimate of the error in each approximation.

$$ f(x+h) = f(x) + f^{'}(x)h + \frac{f^{(2)}(x)}{2!}h^{2}+\frac{f^{(3)}(x)}{3!}h^{3}+ \cdots \quad $$

For instance, the $$\sin(x)$$ function can be approximated by the truncated Taylor series:

$$\sin\left( x \right) \approx x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!}$$
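The truncated series above can be evaluated directly; a short Python sketch (the number of terms and the sample point are illustrative):

```python
import math

def taylor_sin(x, terms=4):
    """Truncated Taylor series for sin(x): x - x^3/3! + x^5/5! - x^7/7! + ..."""
    return sum((-1)**k * x**(2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(terms))

# Example (illustrative): for small x the four-term series is already very close
print(taylor_sin(0.5), math.sin(0.5))
```

For an alternating series like this, the truncation error is bounded by the first omitted term, so the approximation is excellent for small x.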



Effect of Step Size
The following program calculates the first derivative of $$e^x$$ at $$x = 0.5$$ from the center divided difference formula using different values of h. The result shows that the approximation becomes more accurate (more correct significant digits) as the step size decreases, but when the step size becomes too small, round-off error becomes significant and the accuracy degrades.
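A Python sketch of such a program; the particular range of step sizes (10⁻¹ down to 10⁻¹¹) is an illustrative choice:

```python
import math

def center_diff(f, x, h):
    """Center divided difference approximation of f'(x); the error is O(h^2)."""
    return (f(x + h) - f(x - h)) / (2 * h)

exact = math.exp(0.5)  # the derivative of e^x is e^x

# Shrink h by a factor of 10 each step and watch the error: it first decreases
# (truncation error ~ h^2), then grows again once round-off dominates.
for k in range(1, 12):
    h = 10.0**(-k)
    approx = center_diff(math.exp, 0.5, h)
    print(f"h = {h:.0e}   approximation = {approx:.12f}   error = {abs(approx - exact):.2e}")
```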