
positive values of polynomials

asked 2019-06-06 11:29:44 +0200 by vdelecroix

Let $p$ be a quadratic multivariate polynomial in Sage, for example

sage: R.<x,y,z> = QQ[]
sage: p = -x^2 - y^2 - z^2 + x*y + x*z + y*z
sage: q = -x^2 - y^2 - z^2 + 3/2 * (x*y + x*z + y*z)

I want to test whether there exists a vector $(a,b,c)$ with positive entries such that $p(a,b,c) > 0$. With the above examples, the answer is no for $p$ and yes for $q$, since $q(1,1,1) = 3/2$. Is there a way to achieve this in SageMath? It is a very particular instance of non-linear optimization, but I did not find anything relevant.
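
As a quick sanity check, the claimed values can be verified by direct evaluation, using the polynomials defined above:

sage: p(1, 1, 1), q(1, 1, 1)
(0, 3/2)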


Comments

What about QuadraticForm?

FrédéricC (2019-06-06 14:58:24 +0200)

Sadly, the restriction to positive vectors $(a,b,c)$ is not the kind of thing handled by QuadraticForm...

vdelecroix (2019-06-06 18:32:06 +0200)

Is something like this helpful in the given situation?

sage: b = lambda k: lambda c: c[k]   # constraint function: the k-th coordinate must stay >= 0
sage: q = lambda c: -( -c[0]^2 - c[1]^2 - c[2]^2 + 3/2 * (c[0]*c[1] + c[1]*c[2] + c[2]*c[0]) )   # minus the form q
sage: p = lambda c: -( -c[0]^2 - c[1]^2 - c[2]^2 +   1 * (c[0]*c[1] + c[1]*c[2] + c[2]*c[0]) )   # minus the form p
sage: mq = minimize_constrained( q, [b(0), b(1), b(2)], [2019, 2019, 2019] )
sage: mp = minimize_constrained( p, [b(0), b(1), b(2)], [2019, 2019, 2019] )
sage: q(mq), p(mp)
(-10049449.023793787, 1.4901161193847656e-08)

(There is no maximize_constrained for whatever reason, so I had to take the opposite of each objective function and then call minimize_constrained; the values printed above are therefore the negated ones. The starting point was taken far away, instead of the point (0, 0, 0), for the obvious reason.)
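
If one prefers to state this as a maximization, a thin wrapper around minimize_constrained can be written by hand (just a sketch; maximize_constrained is not a built-in Sage function, the name is chosen here for convenience):

sage: def maximize_constrained(func, cons, x0):
....:     # maximize func subject to cons by minimizing the negated objective
....:     return minimize_constrained(lambda c: -func(c), cons, x0)
sage: q0 = lambda c: -c[0]^2 - c[1]^2 - c[2]^2 + 3/2 * (c[0]*c[1] + c[1]*c[2] + c[2]*c[0])
sage: mq = maximize_constrained(q0, [b(0), b(1), b(2)], [2019, 2019, 2019])

With such a wrapper the original (non-negated) forms can be passed directly, and the sign of q0(mq) can be read off without flipping it by hand.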

dan_fulea (2019-06-07 19:30:46 +0200)

1 Answer


answered 2019-06-07 22:47:46 +0200 by Juanjo, updated 2019-06-08 11:54:44 +0200

From the examples in the O.P., I assume that we are dealing with homogeneous quadratic polynomials, that is, with quadratic forms.

Let us state the problem: given a quadratic form \(p\) in \(\mathbb{R}^n\), determine the truth value of the proposition

(P) there exists \(\mathbf{x}\in\mathbb{R}_+^n\) such that \(p(\mathbf{x})>0\).

Here, \(\mathbb{R}_+\) stands for the set of positive real numbers. One can apply, at least, one of the following tests:

Test 1. Let \(A\) be the symmetric matrix associated to \(p\), i.e., \(p(\mathbf{x})=\mathbf{x}^TA\mathbf{x}\). Compute the eigenvalues of \(A\). If all the eigenvalues are less than or equal to \(0\), then (P) is false. If there exists a positive eigenvalue \(\lambda\) with an associated eigenvector \(\mathbf{v}\in\mathbb{R}_+^n\), then (P) is true.

Note that this test does not cover all the possibilities, since it only checks sufficient conditions: it is inconclusive when \(A\) has a positive eigenvalue but no associated eigenvector with all positive entries is found.

Test 2. Compute the maximum of \(p\) on the set \(D=[0,1]^n\). Then (P) is true if and only if such a maximum is positive.

Test 1 can be implemented through the eigenvalues() and eigenvectors_right() methods for matrices. Test 2 can use the minimize_constrained function, already presented in @dan_fulea's comment. Please note that maximizing \(p\) is equivalent to minimizing \(-p\).
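
For instance, both tests can be wrapped in small helper functions (a rough sketch; the function names are chosen here for illustration, and test1 only inspects the basis eigenvectors that Sage returns):

sage: def test1(A):
....:     # Sufficient conditions only: returns True/False when conclusive, None otherwise.
....:     # Comparisons with 0 make sense because A is symmetric, hence has real eigenvalues.
....:     if all(ev <= 0 for ev in A.eigenvalues()):
....:         return False
....:     for ev, vecs, mult in A.eigenvectors_right():
....:         if ev > 0 and any(all(c > 0 for c in v) or all(c < 0 for c in v) for v in vecs):
....:             return True
....:     return None
sage: def test2(p, n, start):
....:     # Maximize the symbolic expression p on [0,1]^n by minimizing -p, as in Test 2.
....:     sol = minimize_constrained(-p, [[0, 1]]*n, start)
....:     return p(*sol) > 0
sage: test1(matrix(QQ, [[-1, 1/2, 1/2], [1/2, -1, 1/2], [1/2, 1/2, -1]]))
False
sage: var("x,y,z");
sage: test2(-x^2 - y^2 - z^2 + 3/2*(x*y + x*z + y*z), 3, [0.1, 0.9, 0.5])
True

Checking both an eigenvector and its negation accounts for the fact that eigenvectors are only determined up to sign.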

Let us apply them to the given polynomials.

Example 1. Let \(p=-x^2 - y^2 - z^2 + xy + xz + yz\). We apply Test 1:

sage: var("x,y,z");
sage: A = matrix([[-1,1/2,1/2],[1/2,-1,1/2],[1/2,1/2,-1]])
sage: A.eigenvalues()
[0, -3/2, -3/2]

By Test 1, (P) is false, since all the eigenvalues are less than or equal to \(0\). We could also apply Test 2:

sage: var("x,y,z");
sage: p = -x^2 - y^2 - z^2 + x*y + x*z + y*z
sage: sol = minimize_constrained(-p, [[0,1]]*3, [0.1,0.9,0.5])
sage: print "Maximum is", p(*sol), "attained at", sol
Maximum is 0.0 attained at (0.4999999998931121, 0.5000000001068878, 0.5)

Since the maximum is not greater than \(0\), by Test 2, (P) is false. The second argument of minimize_constrained is a list of the intervals where \(x\), \(y\) and \(z\) should lie, that is, \([0,1]\) for each variable. The last argument is a starting point for the iterative minimization process. In this example, if we take a different starting point, the maximum is also \(0\), but it can be reached at a different point (\(p\) is \(0\) on the line \(x=y=z\)).
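
For instance, with a different (arbitrarily chosen) starting point:

sage: sol = minimize_constrained(-p, [[0,1]]*3, [0.8, 0.2, 0.3])
sage: p(*sol)   # again 0 up to rounding error, attained at some other point with x ≈ y ≈ z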

Example 2. Let \(q= -x^2 - y^2 - z^2 + (3/2)(xy + xz + yz)\). We first use Test 1:

sage: var("x,y,z");
sage: A = matrix([[-1,3/4,3/4],[3/4,-1,3/4],[3/4,3/4,-1]])
sage: A.eigenvalues()
[1/2, -7/4, -7/4]
sage: A.eigenvectors_right()
[(1/2, [(1, 1, 1)], 1), (-7/4, [(1, 0, -1), (0, 1, -1)], 2)]

The matrix \(A\) has one positive eigenvalue, \(\lambda=1/2\), with an associated eigenvector \(\mathbf{v}=(1,1,1)\) belonging to \(\mathbb{R}_+^n\). Hence, (P) is true. Let us now apply Test 2:

sage: var("x,y,z");
sage: q = -x^2 - y^2 - z^2 + (3/2)*(x*y + x*z + y*z)
sage: sol = minimize_constrained(-q,[[0,1]]*3, [0.1,0.9,0.5])
sage: print "Maximum is", q(*sol), "attained at", sol
Maximum is 1.5 attained at (1.0, 1.0, 1.0)

Since the maximum is positive, by Test 2, (P) is true.

Rationale for Test 1. If \(A\) does not have a positive eigenvalue, then \(A\) is negative semidefinite, and so \(p(\mathbf{x})=\mathbf{x}^TA\mathbf{x}\leq 0\) for all \(\mathbf{x}\in\mathbb{R}^n\). Consequently, (P) is false. Likewise, if there exists a positive eigenvalue \(\lambda\) with an associated eigenvector \(\mathbf{v}\in\mathbb{R}_+^n\), then \(p(\mathbf{v})=\mathbf{v}^TA\mathbf{v}=\lambda\mathbf{v}^T\mathbf{v}=\lambda\lVert\mathbf{v}\rVert^2>0\). Hence (P) is true.
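
For instance, in Example 2, where \(\lambda=1/2\) and \(\mathbf{v}=(1,1,1)\), this identity can be checked directly (reusing the symbolic q defined above):

sage: v = vector([1, 1, 1])
sage: q.subs(x=1, y=1, z=1), 1/2 * (v*v)
(3/2, 3/2)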

Rationale for Test 2. Since \(p\) is continuous and \(D\) is compact, there always exists at least one point \(\mathbf{x}_0\in D\) where \(p\) attains a maximum value in \(D\), i.e. \(p(\mathbf{x}_0)\geq p(\mathbf{x})\) for all \(\mathbf{x}\in D\). Now, if (P) is true, there exists \(\mathbf{x}_1\in \mathbb{R}_+^n\) such that \(p(\mathbf{x}_1)>0\). Let \(\mathbf{x}_2=\mathbf{x}_1/\lVert\mathbf{x}_1\rVert\), which obviously belongs to \(D\). Since \(p\) is homogeneous of degree \(2\), we deduce that \[p(\mathbf{x}_0) \geq p(\mathbf{x}_2)=\frac{p(\mathbf{x}_1)}{\lVert\mathbf{x}_1\rVert^2}>0.\] Conversely, assume that \(p(\mathbf{x}_0)>0\). If \(\mathbf{x}_0\) belongs to the interior of \(D\), then all its coordinates are positive, so (P) holds with \(\mathbf{x}=\mathbf{x}_0\). If \(\mathbf{x}_0\) lies on the boundary of \(D\) (so possibly with a null coordinate), by continuity of \(p\), there exists a ball \(B\) centered at \(\mathbf{x}_0\) where the sign of \(p\) is that of \(p(\mathbf{x}_0)\), that is, \(p\) is positive on \(B\). Since \(B\) contains at least one point \(\mathbf{x}_3\) in the interior of \(D\), (P) holds with \(\mathbf{x}=\mathbf{x}_3\).
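
The scaling step relies on \(p\) being homogeneous of degree \(2\), which can be checked symbolically on the example form (t is an auxiliary variable introduced only for this check):

sage: var("t");
sage: bool(q.subs({x: t*x, y: t*y, z: t*z}) == t^2 * q)
True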


Comments

Thanks. If I am not mistaken, Test 2 via minimize_constrained uses double-precision floating-point arithmetic and does not provide any guarantee (i.e., a proof). What can of course be done is to evaluate the provided solution with ball arithmetic, but there might be situations where the numerical approximation is not good enough.
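
For instance, reusing q and sol from Example 2, one way to certify the sign at the computed point (a sketch only; the precision 100 is arbitrary) is to convert the form to an exact polynomial and evaluate it at the returned point with RealBallField:

sage: Rpol = PolynomialRing(QQ, "x,y,z")
sage: qpol = Rpol(q)                         # exact polynomial version of the symbolic q
sage: RBF = RealBallField(100)
sage: qpol(*[RBF(float(c)) for c in sol]) > 0    # rigorous enclosure of q at the numerical maximizer
True

This certifies the sign of q at that particular floating-point point, not the behaviour of the optimizer itself.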

vdelecroix (2019-06-08 13:20:07 +0200)
