Calculus/Multivariate optimisation: The Lagrangian

The problem
In previous sections, we've discussed how, using calculus, we can find the optima of a single-variable function $$y = f(x)$$ by finding all points where $$f'(x)=0$$. But what if we're given a bivariate function - for example, $$z = f(x, y)$$? More importantly, what if we're given constraints to follow? The single-variable approach does not scale at all.

One variable, one constraint
Consider the optimisation problem $$\min{f(x)}$$ given a constraint $$g(x) = h$$.

First write the constraint in such a way that it's equal to 0 - so $$g(x) - h = 0$$. Then the Lagrangian of the system is defined by $$L(x, \lambda) = f(x) + \lambda (g(x) - h)$$. We have two variables to optimise - $$x$$ and $$\lambda$$. Then find the derivatives with respect to the variables:

$$\frac{\partial L}{\partial x} = f'(x) + \lambda g'(x)$$

$$\frac{\partial L}{\partial \lambda} = g(x) - h$$

Set them to 0. Then the optimal set $$\{x, \lambda\}$$ is the solution to $$f'(x) + \lambda g'(x) = 0$$ and $$g(x) = h$$.
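As a quick sanity check, the two conditions can be verified numerically. A minimal sketch, assuming the hypothetical problem of minimising $$f(x) = x^2$$ subject to $$x = 3$$ (so $$g(x) = x$$ and $$h = 3$$):

```python
# Hypothetical illustration: minimise f(x) = x^2 subject to g(x) = x = 3.
# The conditions are f'(x) + lambda * g'(x) = 0 and g(x) = h.
def f_prime(x):
    return 2 * x   # derivative of x^2

def g_prime(x):
    return 1.0     # derivative of x

# Solved by hand: the constraint forces x = 3, and 2x + lambda = 0
# then gives lambda = -6.
x, lam = 3.0, -6.0
stationarity = f_prime(x) + lam * g_prime(x)   # should be 0
feasibility = x - 3.0                          # g(x) - h, should be 0
print(stationarity, feasibility)
```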

A simple univariate example
Consider minimising $$f(x) = 5x + 3$$ subject to the constraint $$x^2 = 25$$. Then the Lagrangian of the system is $$L(x, \lambda) = 5x + 3 + \lambda (x^2 - 25)$$. Take the respective derivatives:

$$\frac{\partial L}{\partial x} = 5 + \lambda (2x)$$

$$\frac{\partial L}{\partial \lambda} = x^2 - 25$$

Set the second to zero - we get $$x = \pm 5$$. Substituting $$x = 5$$ into the first gives $$5 + 10\lambda = 0$$, so $$\lambda = -\frac{1}{2}$$; substituting $$x = -5$$ gives $$5 - 10\lambda = 0$$, so $$\lambda = \frac{1}{2}$$. Since $$f(-5) = -22$$ and $$f(5) = 28$$, the minimum (which is what we're looking for) is the pair $$\{x, \lambda\} = \{-5, \frac{1}{2}\}$$ and the maximum is the pair $$\{x, \lambda\} = \{5, -\frac{1}{2}\}$$.
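The two candidates can be checked directly in a few lines, recovering the multiplier from $$5 + 2\lambda x = 0$$:

```python
# Evaluate f(x) = 5x + 3 at the two points allowed by the constraint
# x^2 = 25, and recover the matching multiplier from 5 + 2*lambda*x = 0.
f = lambda x: 5 * x + 3
candidates = [-5.0, 5.0]

results = {}
for x in candidates:
    lam = -5 / (2 * x)        # stationarity condition solved for lambda
    results[x] = (f(x), lam)

print(results)
# f(-5) = -22 with lambda = 1/2 is the minimum;
# f(5) = 28 with lambda = -1/2 is the maximum.
```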

It is important to realise that the Lagrangian does not guarantee that a particular solution is a minimum - we need to test each solution ourselves, as one of the solutions above was actually the maximum.

This is actually a pretty weak example, as you may have noticed - it would have been perfectly appropriate in this case to simply evaluate the objective at the only two values allowed by the constraint! The method gets more useful when we have multiple variables and constraints to consider.

Two variables, one constraint
Consider the optimisation problem $$\min{f(x, y)}$$ given a constraint $$g(x, y) = h$$.

The Lagrangian system is almost identical to the single-variable case discussed above, except that we have a system of three partial derivatives to consider (2 variables + 1 constraint): $$L(x, y, \lambda) = f(x, y) + \lambda (g(x, y) - h)$$. Now take the respective partial derivatives:

$$\frac{\partial L}{\partial x} = f'_x(x, y) + \lambda g'_x(x, y)$$ (the first variable x)

$$\frac{\partial L}{\partial y} = f'_y(x, y) + \lambda g'_y(x, y)$$ (the second variable y)

$$\frac{\partial L}{\partial \lambda} = g(x, y) - h$$ (the constraint)

Set them to 0 - the optimal triplet $$\{x, y, \lambda\}$$ is the solution to the resulting system of three equations.
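The three conditions can be verified numerically. A minimal sketch, assuming the hypothetical problem $$\min{(x^2 + y^2)}$$ subject to $$x + y = 2$$, solved by hand first:

```python
# Hypothetical problem: minimise f(x, y) = x^2 + y^2 subject to x + y = 2.
# L(x, y, lambda) = x^2 + y^2 + lambda * (x + y - 2), so the system is:
#   dL/dx = 2x + lambda = 0
#   dL/dy = 2y + lambda = 0
#   dL/dlambda = x + y - 2 = 0
# Solving by hand gives x = y = 1, lambda = -2.
x, y, lam = 1.0, 1.0, -2.0

conditions = (
    2 * x + lam,   # dL/dx
    2 * y + lam,   # dL/dy
    x + y - 2,     # the constraint
)
print(conditions)  # all three vanish at the optimum
```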

A bivariate example
Consider the optimisation problem $$\max{(5x + 3y)}$$ subject to the constraint $$x^2 + y^2 = 25$$. The Lagrangian is $$L(x, y, \lambda) = 5x + 3y + \lambda (x^2 + y^2 - 25)$$, giving the system

$$\frac{\partial L}{\partial x} = 5 + 2\lambda x = 0$$

$$\frac{\partial L}{\partial y} = 3 + 2\lambda y = 0$$

$$\frac{\partial L}{\partial \lambda} = x^2 + y^2 - 25 = 0$$

From the first two equations, $$x = -\frac{5}{2\lambda}$$ and $$y = -\frac{3}{2\lambda}$$. Substituting into the constraint gives $$\frac{25 + 9}{4\lambda^2} = 25$$, so $$\lambda = \pm\frac{\sqrt{34}}{10}$$. The negative root gives the maximiser $$x = \frac{25}{\sqrt{34}}, y = \frac{15}{\sqrt{34}}$$ with maximum value $$5\sqrt{34}$$; the positive root gives the corresponding minimiser.

Did we have to use the Lagrangian? Actually, no. The problem could have been reduced to a univariate form by writing one variable of the constraint in terms of the other, $$y = \pm\sqrt{25 - x^2}$$, and substituting it into the optimisation problem:

$$\max{(5x + 3y)} = \max{\left(5x \pm 3\sqrt{25 - x^2}\right)}$$

and use single-variable calculus techniques to solve the problem! But could you do this with three variables instead? Not really - substituting out a single constraint would still leave you with two variables, so you'd need multivariate techniques anyway.
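As a sanity check, the substitution approach can be compared numerically against the value obtained from the Lagrangian conditions, which for this problem (solved by hand) give a maximum of $$5\sqrt{34}$$. A stdlib-only sketch:

```python
import math

# Maximise 5x + 3y subject to x^2 + y^2 = 25 via substitution: take the
# positive branch y = +sqrt(25 - x^2) (the y-coefficient 3 is positive,
# so the maximum lies there) and scan x densely over [-5, 5].
best = max(5 * x + 3 * math.sqrt(25 - x * x)
           for x in (i / 100000 for i in range(-500000, 500001)))

# Value from solving the Lagrangian conditions by hand: 5 * sqrt(34).
lagrangian_value = 5 * math.sqrt(34)
print(best, lagrangian_value)
```

Both approaches agree to within the grid resolution, which is the point: the Lagrangian gives the same answer without needing to eliminate a variable first.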

The general form
In this section, consider a vector x of size n, $$\textbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_n \end{pmatrix}$$, together with m constraints $$g_i(\textbf{x}) = h_i$$ for $$i = 1, \dots, m$$, whose multipliers are collected in a vector $$\boldsymbol{\lambda} = \begin{pmatrix} \lambda_1 \\ \vdots \\ \lambda_m \end{pmatrix}$$. The Lagrangian is then

$$L(\textbf{x}, \boldsymbol{\lambda}) = f(\textbf{x}) + \sum_{i=1}^{m} \lambda_i (g_i(\textbf{x}) - h_i)$$

Then take the partial derivatives with respect to the vectors $$\textbf{x}$$ and $$\boldsymbol{\lambda}$$: find $$\frac{\partial L}{\partial \textbf{x}}$$ and $$\frac{\partial L}{\partial \boldsymbol{\lambda}}$$.

Notice that this system has m + n variables, and you'll need to take m + n partial derivatives as well. This can get quite messy. A solution is to use matrix calculus.
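When the objective is quadratic and the constraints are linear, the m + n first-order equations are themselves linear, so the whole system can be assembled and solved as a single matrix equation. A minimal stdlib sketch for the hypothetical problem $$\min{(x^2 + 2y^2 + 3z^2)}$$ subject to $$x + y + z = 6$$ (the problem and all names are illustrative):

```python
# Stationarity: 2x + l = 0, 4y + l = 0, 6z + l = 0; constraint: x + y + z = 6.
# Assemble all four equations into one linear system A * (x, y, z, l) = b.
A = [[2.0, 0.0, 0.0, 1.0],
     [0.0, 4.0, 0.0, 1.0],
     [0.0, 0.0, 6.0, 1.0],
     [1.0, 1.0, 1.0, 0.0]]
b = [0.0, 0.0, 0.0, 6.0]

def solve(A, b):
    """Gaussian elimination with partial pivoting (no external libraries)."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    sol = [0.0] * n
    for r in range(n - 1, -1, -1):
        sol[r] = (M[r][n] - sum(M[r][c] * sol[c]
                                for c in range(r + 1, n))) / M[r][r]
    return sol

x, y, z, lam = solve(A, b)
print(x, y, z, lam)   # x = 36/11, y = 18/11, z = 12/11, lambda = -72/11
```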

This may look intimidating, but don't be too concerned: the average Calculus 3 course will only consider two to three variables.

The regularity condition
Knowledge of linear algebra is expected for this section; this is unlikely to be covered in an average Calculus 3 course as a result.

The regularity condition applies when considering the Lagrange FONC (first order necessary condition). Remember that it is a necessary condition. This means that

 * just because a point satisfies the Lagrange FONC does not mean that it is a minimiser or a maximiser;
 * a point that does not satisfy the Lagrange FONC cannot be a minimiser or maximiser.

The regularity condition requires that the gradients of the constraints, $$\nabla g_1(\textbf{x}), \dots, \nabla g_m(\textbf{x})$$, be linearly independent at the candidate point. If this condition is not satisfied, the Lagrange FONC does not apply at that point. The reason this rarely matters with one constraint is that a single nonzero vector is linearly independent by definition.
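A standard textbook illustration (not from this section) of regularity failing: minimise $$f(x, y) = x$$ subject to $$x^3 - y^2 = 0$$. The feasible set forces $$x \ge 0$$, so the minimiser is the origin - exactly where the constraint gradient vanishes and the FONC cannot hold:

```python
# Regularity failure: minimise f(x, y) = x subject to g(x, y) = x^3 - y^2 = 0.
# Since y^2 >= 0 on the feasible set, x^3 >= 0, so x >= 0 and the
# minimiser is (0, 0).
def grad_f(x, y):
    return (1.0, 0.0)        # gradient of f(x, y) = x

def grad_g(x, y):
    return (3 * x * x, -2 * y)   # gradient of g(x, y) = x^3 - y^2

x, y = 0.0, 0.0
gf, gg = grad_f(x, y), grad_g(x, y)
print(gf, gg)
# grad g is the zero vector at the minimiser, so it is NOT linearly
# independent, and grad f + lambda * grad g = (1, 0) can never vanish:
# no multiplier exists, even though (0, 0) is the true minimiser.
```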