Analytic Combinatorics/Cauchy-Hadamard theorem and Cauchy's inequality

Introduction
Two of the most basic means of estimating coefficients of generating functions are the Cauchy-Hadamard theorem and Cauchy's inequality.

We also include some background knowledge which will be useful for future chapters.

Limit superior
One key concept in analysis is a sequence of numbers. In our case, the sequence of numbers could be the coefficients of the generating function we are interested in, written $$\{ a_n \}$$.

A point of accumulation of a sequence is a number $$t$$ such that, given $$\epsilon > 0$$, there are infinitely many $$n$$ such that
 * $$|a_n - t| < \epsilon$$.

For example, the sequence of coefficients of $$\frac{1}{1 - z}$$ ($$\{1, 1, 1, \cdots \}$$) has point of accumulation $$1$$. $$e^z$$ ($$\{ 1, 1, \frac{1}{2!}, \frac{1}{3!}, \cdots \}$$) has point of accumulation $$0$$. $$\frac{1}{1 - z^2}$$ ($$\{ 1, 0, 1, 0, \cdots \}$$) has two points of accumulation, $$0$$ and $$1$$.

One useful property of a sequence of numbers is its limit superior, written $$\limsup a_n$$. This is the least upper bound of the set of points of accumulation of the sequence $$\{a_n\}$$.

In our above examples, these would be $$1$$, $$0$$ and $$1$$ respectively.
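The limit superior of each example can be checked numerically. A rough sketch (the helper `limsup_estimate` and the tail length are illustrative choices, not from the text): approximate $$\limsup a_n$$ by the supremum of a long tail of the sequence.

```python
import math

# Approximate limsup a_n by the supremum of a long tail of the sequence.
# limsup_estimate is a hypothetical helper; the tail length is arbitrary.
def limsup_estimate(seq, tail_start=100):
    return max(seq[tail_start:])

ones = [1.0] * 150                                              # 1/(1 - z)
exp_coeffs = [1.0 / math.factorial(n) for n in range(150)]      # e^z
alternating = [1.0 if n % 2 == 0 else 0.0 for n in range(150)]  # 1/(1 - z^2)

print(limsup_estimate(ones))         # -> 1.0
print(limsup_estimate(exp_coeffs))   # tiny, tending to 0
print(limsup_estimate(alternating))  # -> 1.0
```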

Convergence
The series expansion $$\sum a_n z^n$$ of $$f(z)$$ is said to converge at a point $$z$$ if its partial sums tend to a finite limit there.

It may only do so for particular values of $$z$$. There are various tests for whether or not a series converges and for which values of $$z$$.

For example, $$\frac{1}{1 - z}$$ has series expansion $$\sum z^n$$. We can test this series for convergence with D'Alembert's ratio test, which states that a series with terms $$b_n$$ converges if


 * $$\lim_{n \to \infty} \left| \frac{b_{n+1}}{b_n} \right| < 1$$

In our example, the ratio is $$\left| \frac{z^{n+1}}{z^n} \right| = |z|$$, which is less than $$1$$ only if $$|z| < 1$$. Therefore, the series converges for $$|z| < 1$$.

The radius of convergence of $$f(z)$$ is the value $$R$$ such that the series expansion converges for $$|z| < R$$ and diverges for $$|z| > R$$.

In our example, the radius of convergence of $$\frac{1}{1 - z}$$ is $$1$$.
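The ratio-test reasoning above can be turned into a quick numerical estimate of the radius of convergence: for large $$n$$, $$R \approx |a_n / a_{n+1}|$$ whenever that ratio settles down. A sketch (the function name and the second example, $$\frac{1}{1 - 2z}$$, are illustrative):

```python
# Estimate the radius of convergence as |a_n / a_{n+1}| for large n.
# This assumes the ratio converges; it fails when coefficients vanish.
def radius_from_ratio(coeffs):
    return abs(coeffs[-2] / coeffs[-1])

# 1/(1 - z): coefficients 1, 1, 1, ..., so R = 1.
print(radius_from_ratio([1.0] * 50))                   # -> 1.0

# 1/(1 - 2z): coefficients 2^n, so R = 1/2.
print(radius_from_ratio([2.0**n for n in range(50)]))  # -> 0.5
```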

It should be noted that the radius of convergence equals the distance from the origin to the singularity of $$f(z)$$ nearest the origin. We will read about singularities later.

Theorem
If $$f(z) = \sum a_n z^n$$ and $$R$$ is its radius of convergence, then:
 * $$\frac{1}{R} = \limsup_{n \to \infty} |a_n|^{1/n}$$

One consequence of this theorem is that, for every $$\epsilon > 0$$:
 * $$|a_n| \leq \left( \frac{1}{R} + \epsilon \right)^n$$ for all sufficiently large $$n$$, and $$|a_n| \geq \left( \frac{1}{R} - \epsilon \right)^n$$ for infinitely many $$n$$.
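The theorem also gives a numerical recipe that works even when the ratio test does not: estimate $$1/R$$ as the largest value of $$|a_n|^{1/n}$$ over a late stretch of indices. A sketch (the helper name and tail length are illustrative choices):

```python
# Cauchy-Hadamard: 1/R = limsup |a_n|^{1/n}.  Approximate the limsup by
# the maximum of |a_n|^{1/n} over a tail of the sequence.
def radius_cauchy_hadamard(coeffs, tail_start=100):
    roots = [abs(c) ** (1.0 / n)
             for n, c in enumerate(coeffs)
             if n >= tail_start and c != 0]
    return 1.0 / max(roots)

# 1/(1 - z^2): coefficients 1, 0, 1, 0, ...  The ratio test breaks down
# on the zero coefficients, but the root test still gives R = 1.
a = [1.0 if n % 2 == 0 else 0.0 for n in range(400)]
print(radius_cauchy_hadamard(a))  # -> 1.0
```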

Proof
Proof due to Wilf and Lang.

The statement that $$R$$ is the radius of convergence of $$f(z) = \sum a_n z^n$$ means that the series converges whenever $$|z| < R$$ and diverges whenever $$|z| > R$$.

Take $$f(z) = \sum a_n z^n$$, let $$R$$ be its radius of convergence and let $$t = \limsup_{n \to \infty} |a_n|^{1/n}$$. Fix $$\epsilon > 0$$. By the definition of $$\limsup$$, for all but a finite number of $$n$$


 * $$|a_n| \le (t + \epsilon)^n$$.

If $$|z| < \frac{1}{t + \epsilon}$$ then $$|a_n z^n| \le ((t + \epsilon)|z|)^n$$ for all but finitely many $$n$$, and since $$(t + \epsilon)|z| < 1$$ the series converges by comparison with a geometric series. Therefore $$R \ge \frac{1}{t + \epsilon}$$.

By the definition of $$\limsup$$, there exist infinitely many $$n$$ such that


 * $$|a_n| \ge (t - \epsilon)^n$$.

If $$|z| = \frac{1}{t - \epsilon}$$ then $$|a_n z^n| \ge 1$$ for infinitely many $$n$$, so the terms of the series do not tend to $$0$$ and $$f(z)$$ does not converge. Therefore $$R \le \frac{1}{t - \epsilon}$$.

Since $$\frac{1}{t + \epsilon} \le R \le \frac{1}{t - \epsilon}$$ for every $$\epsilon > 0$$, letting $$\epsilon \to 0$$ gives $$R = \frac{1}{t}$$, that is, $$\frac{1}{R} = \limsup_{n \to \infty} |a_n|^{1/n}$$.

Now, we prove the consequence of the theorem.

Since $$\frac{1}{R} = \limsup_{n \to \infty} |a_n|^{1/n}$$, the definition of $$\limsup$$ gives, for all but a finite number of $$n$$,


 * $$|a_n| \le \left (\frac{1}{R} + \epsilon \right )^n$$

and there exist infinitely many $$n$$ such that


 * $$|a_n| \ge \left (\frac{1}{R} - \epsilon \right)^n$$.

Complex numbers
A complex number is a number $$z = x + iy$$ where $$x$$ and $$y$$ are both real numbers and $$i$$ is the imaginary unit where $$i^2 = -1$$. $$x$$ is called the real component and $$y$$ the imaginary component (even though $$y$$ is itself a real number).

Contour integration
Because a complex number has two components, real and imaginary, complex integration involves integrating around a curve in the two-dimensional plane. This is called contour integration.

We denote this:


 * $$\int_C f(z) dz$$

where $$C$$ denotes the contour.

It is not necessary to know how to compute contour integrals in order to understand the later material in this book.

Analytic functions
A function $$f(z)$$ is analytic at a point $$z_0$$ if it is defined, single-valued and has a derivative at every point at and around $$z_0$$.

We say a function $$f(z)$$ is analytic on a set of points $$U$$ if it is analytic at every point of $$U$$.

One property of an analytic function is that when performing contour integration on a closed contour $$C$$ we can continuously deform the contour $$C$$ into another closed contour $$C'$$ without changing the value of the integral (as long as in deforming the contour we do not pass through any singularities).

Cauchy's integral formula
Cauchy's integral formula states:


 * $$f(z_0) = \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z - z_0}$$

where $$C$$ is a contour, $$z_0$$ is a point inside $$C$$ and $$f(z)$$ is analytic on and inside the contour.

Proof: Because $$f(z)$$ is analytic, we can replace the integral around $$C$$ with an integral around a small circle $$\gamma$$ with centre $$z_0$$ and radius $$\rho$$


 * $$\frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z - z_0} = \frac{1}{2 \pi i} \int_\gamma f(z) \frac{dz}{z - z_0}$$

As $$f(z)$$ is analytic it is also continuous. This means that for any $$\epsilon > 0$$ there exists a $$\delta > 0$$ such that $$|z - z_0| < \delta \implies |f(z) - f(z_0)| < \epsilon$$. By choosing $$\rho \leq \delta$$, we ensure $$|f(z) - f(z_0)| < \epsilon$$ everywhere on $$\gamma$$.


 * $$\int_\gamma f(z) \frac{dz}{z - z_0} = f(z_0) \int_\gamma \frac{dz}{z - z_0} + \int_\gamma \frac{f(z) - f(z_0)}{z - z_0} dz$$


 * $$f(z_0) \int_\gamma \frac{dz}{z - z_0} = 2 \pi i f(z_0)$$


 * $$\left| \int_\gamma \frac{f(z) - f(z_0)}{z - z_0} dz \right| \leq \frac{\epsilon}{\rho} 2 \pi \rho = 2 \pi \epsilon$$

Finally,


 * $$\left| \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z - z_0} - f(z_0) \right| = \left| \frac{1}{2 \pi i} \int_\gamma \frac{f(z) - f(z_0)}{z - z_0} dz \right| \leq \epsilon$$ and, since $$\epsilon$$ can be made arbitrarily small, $$\frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z - z_0} = f(z_0)$$.
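The formula can be checked numerically by discretising the contour. A sketch, taking the unit circle as $$C$$ and $$f(z) = e^z$$ (both choices are illustrative, not from the text):

```python
import cmath

# Approximate (1/2πi) ∮_C f(z)/(z - z0) dz over the unit circle,
# sampling N points z_k = e^{iθ_k} with dz ≈ i z_k Δθ.
def cauchy_integral(f, z0, N=2000):
    total = 0j
    for k in range(N):
        z = cmath.exp(2j * cmath.pi * k / N)
        dz = 1j * z * (2 * cmath.pi / N)
        total += f(z) / (z - z0) * dz
    return total / (2j * cmath.pi)

# The formula says this recovers f(z0) for z0 inside the contour.
z0 = 0.3
print(abs(cauchy_integral(cmath.exp, z0) - cmath.exp(z0)) < 1e-9)  # -> True
```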

Taylor series
If $$f(z)$$ is analytic inside and on a contour $$C$$, the Taylor series expansion of $$f(z)$$ around a point $$z_0$$ inside $$C$$ is:


 * $$f(z) = f(z_0) + f'(z_0) (z - z_0) + \frac{f''(z_0) (z - z_0)^2}{2!} + \cdots$$

Cauchy's coefficient formula
Cauchy's coefficient formula states that if $$f(z) = \sum a_n z^n$$ is analytic inside and on a contour $$C$$ encircling the origin, then:


 * $$a_n = \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z^{n+1}}$$

Proof: Cauchy's integral formula states:


 * $$f(z_0) = \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z - z_0}$$

Differentiating both sides $$n$$ times with respect to $$z_0$$ and dividing by $$n!$$ gives:


 * $$\frac{f^{(n)}(z_0)}{n!} = \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{(z - z_0)^{n+1}}$$

The Taylor series expansion of $$f(z)$$ around $$0$$ is:


 * $$f(z) = f(0) + f'(0)z + \frac{f''(0)z^2}{2!} + \cdots$$

Therefore:


 * $$a_n = \frac{f^{(n)}(0)}{n!} = \frac{1}{2 \pi i} \int_C f(z) \frac{dz}{z^{n+1}}$$
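The coefficient formula can be evaluated numerically: parametrise $$C$$ as the circle $$z = r e^{i\theta}$$, so $$dz = iz \, d\theta$$ and $$a_n = \frac{1}{2\pi} \int_0^{2\pi} f(re^{i\theta}) (re^{i\theta})^{-n} \, d\theta$$. A sketch, with $$f(z) = \frac{1}{1-z}$$ and $$r = \frac{1}{2}$$ as illustrative choices:

```python
import cmath

# a_n = (1/2π) ∫ f(r e^{iθ}) (r e^{iθ})^{-n} dθ, approximated by an
# N-point average over equally spaced angles on the circle |z| = r.
def coefficient(f, n, r=0.5, N=4096):
    total = 0j
    for k in range(N):
        z = r * cmath.exp(2j * cmath.pi * k / N)
        total += f(z) / z**n
    return (total / N).real

# 1/(1 - z) = 1 + z + z^2 + ..., so every coefficient is 1.
f = lambda z: 1.0 / (1.0 - z)
print(round(coefficient(f, 5), 6))  # -> 1.0
```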

Theorem
Theorem due to Titchmarsh.

If $$R$$ is the radius of convergence of $$f(z) = \sum a_n z^n$$, then for all $$n \geq 0$$ and $$0 < r < R$$
 * $$|a_n| \leq \frac{\max_{|z| = r} |f(z)|}{r^n}$$

Proof
Proof due to Titchmarsh.

By Cauchy's coefficient formula:


 * $$a_n = \frac{1}{2 \pi i} \int_{|z| = r} f(z) \frac{dz}{z^{n+1}}$$

We have:


 * $$|a_n| = \left|\frac{1}{2 \pi i} \int_{|z| = r} f(z) \frac{dz}{z^{n+1}}\right| \leq \frac{1}{2 \pi} \int_{|z| = r} |f(z)| \frac{|dz|}{r^{n+1}}$$

and


 * $$\int_{|z| = r} |f(z)| \, |dz| \leq \max_{|z| = r} |f(z)| \cdot 2 \pi r$$

Therefore:


 * $$|a_n| \leq \frac{1}{2 \pi r^{n+1}} \int_{|z| = r} |f(z)| \, |dz| \leq \frac{1}{2 \pi r^{n+1}} \max_{|z| = r} |f(z)| \cdot 2 \pi r = \frac{\max_{|z| = r} |f(z)|}{r^n}$$

Pictorially, we are estimating the contour integral by replacing $$|f(z)|$$ with its maximum value around the entire contour.
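The bound can be checked numerically by sampling $$|f(z)|$$ on the circle. A sketch for $$f(z) = \frac{1}{1-z}$$ with $$r = \frac{1}{2}$$ (both illustrative choices):

```python
import cmath

# Estimate max |f(z)| on |z| = r by sampling N points of the circle.
def max_on_circle(f, r, N=1000):
    return max(abs(f(r * cmath.exp(2j * cmath.pi * k / N))) for k in range(N))

f = lambda z: 1.0 / (1.0 - z)
r = 0.5
M = max_on_circle(f, r)      # maximum attained at z = r: 1/(1 - r) = 2
for n in range(10):
    assert 1.0 <= M / r**n   # a_n = 1 for 1/(1 - z); the bound holds
print(M)  # -> 2.0
```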