Theorem 4.61. Lagrange Multiplier Rule.
Suppose that \(f\) and \(g\) are functions from \(\mathbb R^2\) into \(\mathbb R\) having continuous partial derivatives, and that \(f\) attains a maximum (or minimum) at \(\vect x_0=(x_{01},x_{02})\) subject to the condition \(g(\vect x)=0\text{.}\) If \(\grad g(\vect x_0)\neq\vect 0\text{,}\) then there exists \(\lambda\in\mathbb R\text{,}\) called a Lagrange multiplier, such that
\begin{equation}
\grad f(\vect x_0)
=\lambda\grad g(\vect x_0)\text{.}\tag{4.9}
\end{equation}
Written in components, we have
\begin{equation}
\begin{aligned}
\frac{\partial}{\partial x_1}f(x_{01},x_{02})
\amp =\lambda\frac{\partial}{\partial x_1}g(x_{01},x_{02}), \\
\frac{\partial}{\partial x_2}f(x_{01},x_{02})
\amp =\lambda\frac{\partial}{\partial x_2}g(x_{01},x_{02}).
\end{aligned}\tag{4.10}
\end{equation}
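As a quick illustration of (4.9) and (4.10), consider maximizing \(f(x_1,x_2)=x_1x_2\) subject to \(g(\vect x)=x_1+x_2-2=0\text{.}\) Here \(\grad f(\vect x)=(x_2,x_1)\) and \(\grad g(\vect x)=(1,1)\neq\vect 0\text{,}\) so (4.10) becomes \(x_2=\lambda\) and \(x_1=\lambda\text{.}\) Combined with the constraint \(x_1+x_2=2\text{,}\) this forces \(x_1=x_2=\lambda=1\text{,}\) and indeed the maximum is attained at \(\vect x_0=(1,1)\) with \(f(\vect x_0)=1\text{.}\)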