Consider a polynomial in one variable with real coefficients \begin{equation} p(x) = a_0 + a_1 x + a_2 x^2 + \ldots + a_n x^n. \end{equation} By the fundamental theorem of algebra, we know that over the field of complex numbers $\mathbb{C}$, counting with multiplicity, $p(x)$ has exactly $n$ roots. A common question to ask is

When are all roots of $p(x)$ real?

One instance where such a question crops up in the theory of partial differential equations concerns the *hyperbolic polynomials*: a linear partial differential equation has a well-posed initial value problem in the sense of Hadamard if and only if the corresponding symbol is a hyperbolic polynomial.^{1} The definition of hyperbolic polynomials is a bit tangential to the scope of this post; suffice it to say that it involves checking whether a certain univariate polynomial $p(x)$ has only real (but possibly repeated) roots.

I just learned through MathOverflow an interesting algorithm for checking whether all the roots of a univariate polynomial are real.

## Method

First form the polynomial $q(x)$ by dividing $p(x)$ by the greatest common divisor of $p(x)$ and its derivative $p'(x)$. This guarantees that $q(x)$ has no repeated roots. Let $m$ be the degree of $q(x)$.
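As a concrete sketch of this step (using SymPy, with a made-up example polynomial that has a repeated root):

```python
import sympy as sp

x = sp.symbols('x')

# Hypothetical example with a repeated root: p(x) = (x - 1)^2 (x + 2).
p = sp.Poly((x - 1)**2 * (x + 2), x)

# Divide p by gcd(p, p') to strip repeated roots; the quotient q is square-free.
q = sp.quo(p, sp.gcd(p, p.diff(x)))
m = q.degree()   # here q(x) = (x - 1)(x + 2), so m = 2
```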

Next consider the bivariate expression \begin{equation} Q(x,y) = \frac{ q(x) q'(y) - q(y) q'(x)}{x - y}. \end{equation} Notice that the numerator of $Q(x,y)$ is a bivariate polynomial that is degree $m$ in both $x$ and $y$. Notice further that the numerator vanishes identically when $x = y$. This means that the numerator contains $(x-y)$ as a factor, so $Q(x,y)$ is in fact a bivariate polynomial that is degree $m-1$ in both $x$ and $y$. In particular, there exists an $m\times m$ matrix $A = (a_{i,j})$ such that \begin{equation}\label{eq:QA} Q(x,y) = \sum_{i,j = 1}^m a_{i,j}\ x^{i-1} y^{j-1}. \end{equation}
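Continuing the SymPy sketch, here is one way to form $Q(x,y)$ and read off the matrix $A$, for the made-up example $q(x) = x^3 - x$ (simple roots $-1, 0, 1$):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Example square-free polynomial: q(x) = x^3 - x, with simple roots -1, 0, 1.
q = x**3 - x
m = 3

# The numerator vanishes on x = y, so cancel() performs the exact division by (x - y).
num = q * sp.diff(q.subs(x, y), y) - q.subs(x, y) * sp.diff(q, x)
Q = sp.Poly(sp.cancel(num / (x - y)), x, y)

# a_{i,j} is the coefficient of x^{i-1} y^{j-1} (indices are 0-based below).
A = sp.Matrix(m, m, lambda i, j: Q.coeff_monomial(x**i * y**j))
```

For this $q$ the resulting matrix is symmetric and positive definite, consistent with all three roots being real.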

**Claim**. The polynomial $q(x)$ (and hence by extension $p(x)$) has only real roots if and only if the matrix $A$ is positive definite.
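Putting the pieces together, a minimal SymPy implementation of the whole test might look like the following; the function name `all_roots_real` is my own, and the positive-definiteness check leans on SymPy's built-in `is_positive_definite`:

```python
import sympy as sp

def all_roots_real(p_expr, x):
    """True iff every root of the polynomial p_expr (in the variable x) is real."""
    y = sp.Dummy('y')
    p = sp.Poly(p_expr, x)
    q = sp.quo(p, sp.gcd(p, p.diff(x)))        # square-free part of p
    m = q.degree()
    qx = q.as_expr()
    qy = qx.subs(x, y)
    num = qx * sp.diff(qy, y) - qy * sp.diff(qx, x)
    Q = sp.Poly(sp.cancel(num / (x - y)), x, y)
    A = sp.Matrix(m, m, lambda i, j: Q.coeff_monomial(x**i * y**j))
    return bool(A.is_positive_definite)

x = sp.symbols('x')
# all_roots_real((x - 1) * (x - 2)**2, x)  -> True  (roots 1, 2, 2 are all real)
# all_roots_real(x**2 + 1, x)              -> False (roots are +-i)
```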

## Proof

First, note that since $p(x)$ (and hence $q(x)$) has only real coefficients, any non-real roots come in conjugate pairs: if $c$ is a root, then so is $\bar{c}$.

Next, list the $m$ *distinct* roots of $q(x)$ as $r_1, \ldots, r_k, c_1, \bar{c}_1, \ldots, c_l, \bar{c}_l$; here the $r_i$ are the real roots, the $c_i, \bar{c}_i$ are the conjugate pairs of non-real roots, and $m = k + 2l$.

Consider the vectors $R_i$ given by
\begin{equation}
R_i = \begin{pmatrix} 1 \newline r_i \newline r_i^2 \newline \vdots \newline r_i^{m-1} \end{pmatrix}.
\end{equation}
Similarly, define the vectors $X_i$ and $\bar{X}_i$ by replacing $r_i$ with $c_i$ and $\bar{c}_i$ respectively.
Since the $r_i, c_i, \bar{c}_i$ are distinct, the Vandermonde determinant tells us that these $m$ vectors are linearly independent in $\mathbb{C}^m$.

Next define \begin{equation} C_i = \frac12 (X_i + \bar{X}_i), \qquad S_i = \frac{1}{2i} (X_i - \bar{X}_i). \end{equation} By construction we see that $C_i, S_i \in \mathbb{R}^m$. The linear independence of the $R_i, X_i, \bar{X}_i$ then implies that the set $\{R_i, C_i, S_i\}$ forms a basis of $\mathbb{R}^m$.

The matrix $A$ turns out to be quite easy to analyze in this basis. This is due to the fact that when $\rho$ is a (possibly complex) root of $q(x)$, we see that \begin{equation} Q(x,\rho) = \frac{q(x) q'(\rho) - q'(x) q(\rho)}{x - \rho} = q'(\rho) \frac{q(x)}{x-\rho}, \end{equation} where $q(x) / (x-\rho)$ is a polynomial (with possibly complex coefficients). Taking $x \to \rho$, and noting that $q(x)/(x-\rho) \to q'(\rho)$ in this limit, this implies that \begin{equation} Q(\rho,\rho) = [ q'(\rho) ]^2. \end{equation} Additionally, if $\sigma$ is a root different from $\rho$, then \begin{equation} Q(\sigma,\rho) = 0. \end{equation}
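These two identities are easy to sanity-check in SymPy, again using $q(x) = x^3 - x$ (roots $-1, 0, 1$) as a made-up example:

```python
import sympy as sp

x, y = sp.symbols('x y')

q = x**3 - x                   # simple roots: -1, 0, 1
dq = sp.diff(q, x)             # q'(x) = 3x^2 - 1
Q = sp.cancel((q * dq.subs(x, y) - q.subs(x, y) * dq) / (x - y))

# Q(rho, rho) = q'(rho)^2 at each root rho, and Q(sigma, rho) = 0 for distinct roots.
```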

The above computation shows immediately that $R_i^T A R_j = Q(r_i, r_j)$ vanishes when $i \neq j$ and equals $[q'(r_i)]^2 > 0$ when $i = j$ (the positivity holds because the roots of $q$ are simple). In particular, if $p(x)$ only has real roots, then $k = m$ and the $R_i$ alone form a basis of $\mathbb{R}^m$, so $A$ is positive definite.

For the reverse implication, we can consider \begin{equation} C_i^T A C_i = \frac14 [ Q(c_i, c_i) + Q(c_i, \bar{c}_i) + Q(\bar{c}_i,c_i) + Q(\bar{c}_i, \bar{c}_i) ]. \end{equation} The middle two terms vanish, and we have that this is equal to \begin{equation}\label{eq:cs} C_i^T A C_i = \frac14 [ (q'(c_i))^2 + (q'(\bar{c}_i))^2 ] . \end{equation} Similarly, \begin{equation}\label{eq:ss} S_i^T A S_i = - \frac14 [ (q'(c_i))^2 + (q'(\bar{c}_i))^2 ] . \end{equation} Comparing \eqref{eq:cs} and \eqref{eq:ss}, we see that both values are real (since $A$, $C_i$, and $S_i$ are all real) and are negatives of each other, hence at least one of the two is $\leq 0$. This shows that if $p(x)$ has a non-real root, then $A$ cannot be positive definite, and the claim is proved.
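As a concrete illustration of this sign computation in SymPy, take $q(x) = x^2 + 1$ with the single conjugate pair of roots $c = i$, $\bar{c} = -i$: one finds $C^T A C = -2$ and $S^T A S = 2$, matching \eqref{eq:cs} and \eqref{eq:ss}, so $A$ is indeed not positive definite.

```python
import sympy as sp

x, y = sp.symbols('x y')

# q(x) = x^2 + 1 has the conjugate pair of roots c = i, c-bar = -i.
q = x**2 + 1
dq = sp.diff(q, x)
Q = sp.Poly(sp.cancel((q * dq.subs(x, y) - q.subs(x, y) * dq) / (x - y)), x, y)
A = sp.Matrix(2, 2, lambda i, j: Q.coeff_monomial(x**i * y**j))

c = sp.I
X = sp.Matrix([1, c])
Xbar = sp.Matrix([1, sp.conjugate(c)])
C = (X + Xbar) / 2           # real vector (1, 0)^T
S = (X - Xbar) / (2 * sp.I)  # real vector (0, 1)^T
```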

- I am glossing over some details here. The statement is only strictly true when we study constant coefficient partial differential equations; in that case a proof can be found in section XII.3 in volume 2 of Hörmander's *The Analysis of Linear Partial Differential Operators* (ALPDO). The case of variable coefficients is a bit more delicate; a discussion is found in sections XXIII.3 and XXIII.4 of volume 3 of ALPDO.