Elementary symmetric polynomials and optimization

The mth elementary symmetric polynomial in n variables

e_m(x_1, x_2, \ldots, x_n)

is the sum of all products of m distinct variables. So, for example,

\begin{align*} e_1(w, x, y, z) &= w + x + y + z \\ e_2(w, x, y, z) &= wx + wy + wz + xy + xz + yz \\ e_3(w, x, y, z) &= xyz + wyz + wxz + wxy \\ e_4(w, x, y, z) &= wxyz \end{align*}
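To make the definition concrete, here is a minimal Python sketch that computes e_m by brute force, directly from the definition. (The function name esp is my own, not from any library.)

from itertools import combinations
from math import prod

def esp(m, xs):
    # Sum over all ways to choose m of the variables,
    # multiplying each chosen set of variables together.
    return sum(prod(c) for c in combinations(xs, m))

# Evaluating the examples above at w, x, y, z = 1, 2, 3, 4:
# esp(1, [1, 2, 3, 4]) returns 10
# esp(2, [1, 2, 3, 4]) returns 35
# esp(3, [1, 2, 3, 4]) returns 50
# esp(4, [1, 2, 3, 4]) returns 24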
These polynomials came up in the previous post. The problem of choosing weights to minimize the variance of a weighted sum of random variables can be solved using elementary symmetric polynomials.

To state the optimization problem more generally, suppose you want to minimize

t_1^2 x_1 + t_2^2 x_2 + \cdots + t_n^2 x_n

where the t_i and x_i are positive and the t_i sum to 1. You can use Lagrange multipliers to show that the solution is

t_i = \frac{e_n(x_1, x_2, \ldots, x_n)}{x_i \,e_{n-1}(x_1, x_2, \ldots, x_n)}

This works because the Lagrange multiplier condition forces each t_i to be proportional to 1/x_i, and the normalizing constant \sum_j 1/x_j equals the ratio e_{n-1}(x_1, \ldots, x_n) / e_n(x_1, \ldots, x_n).
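As a sanity check, here is a short sketch (the names weights, xs, and ts are my own) that computes the weights from the formula above and verifies that they sum to 1 and agree with the equivalent normalized-reciprocal form t_i = (1/x_i) / \sum_j (1/x_j).

from itertools import combinations
from math import prod, isclose

def esp(m, xs):
    # mth elementary symmetric polynomial, as defined above
    return sum(prod(c) for c in combinations(xs, m))

def weights(xs):
    # t_i = e_n(x) / (x_i * e_{n-1}(x))
    n = len(xs)
    en, en1 = esp(n, xs), esp(n - 1, xs)
    return [en / (x * en1) for x in xs]

xs = [2.0, 3.0, 5.0]
ts = weights(xs)

# The weights satisfy the constraint ...
assert isclose(sum(ts), 1.0)

# ... and match the normalized reciprocals of the x's.
s = sum(1/x for x in xs)
assert all(isclose(t, (1/x)/s) for t, x in zip(ts, xs))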
