From the examples in the O.P., I assume that we are dealing with homogeneous quadratic polynomials, that is, with quadratic forms.
Let us state the problem: given a quadratic form p on R^n, determine the truth value of the proposition
(P) there exists x ∈ R_+^n such that p(x) > 0.
Here, R_+ stands for the set of positive real numbers, so R_+^n is the open positive orthant. One can apply at least one of the following tests:
Test 1. Let A be the symmetric matrix associated to p, i.e., p(x) = x^T A x. Compute the eigenvalues of A. If all the eigenvalues are less than or equal to 0, then (P) is false. If there exists a positive eigenvalue λ with an associated eigenvector v ∈ R_+^n, then (P) is true.
Note that this test does not cover all cases: it is inconclusive when A has a positive eigenvalue but no associated eigenvector lying in R_+^n.
Test 2. Compute the maximum of p on the set D = [0,1]^n. Then (P) is true if and only if this maximum is positive.
Test 1 can be implemented through the eigenvalues() and eigenvectors_right() methods for matrices. Test 2 can use the minimize_constrained function, already presented in @dan_fulea's answer. Please note that maximizing p is equivalent to minimizing −p.
Let us apply them to the given polynomials.
Example 1. Let p = −x^2 − y^2 − z^2 + xy + xz + yz. We apply Test 1:
sage: var("x,y,z");
sage: A = matrix([[-1,1/2,1/2],[1/2,-1,1/2],[1/2,1/2,-1]])
sage: A.eigenvalues()
[0, -3/2, -3/2]
By Test 1, (P) is false, since all the eigenvalues are less than or equal to 0. We could also apply Test 2:
sage: var("x,y,z");
sage: p = -x^2 - y^2 - z^2 + x*y + x*z + y*z
sage: sol = minimize_constrained(-p, [[0,1]]*3, [0.1,0.9,0.5])
sage: print("Maximum is", p(*sol), "attained at", sol)
Maximum is 0.0 attained at (0.4999999998931121, 0.5000000001068878, 0.5)
Since the maximum is not greater than 0, by Test 2, (P) is false.
The second argument of minimize_constrained
is a list of the intervals where x, y and z should be, that is, [0,1] for each variable. The last argument is a starting point of the iterative minimization process. In this example, if we take a different starting point, the maximum is also 0, but it can be reached at a different point (p is 0 on the line x=y=z).
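The dependence on the starting point can be reproduced outside Sage as well. In the SciPy sketch below (an illustration, not the answer's original code), several starting points all yield a maximum that is numerically 0, each attained at a different point of the line x = y = z:

```python
import numpy as np
from scipy.optimize import minimize

# Matrix of Example 1: p = -x^2 - y^2 - z^2 + xy + xz + yz
A = np.array([[-1, 0.5, 0.5], [0.5, -1, 0.5], [0.5, 0.5, -1]])
p = lambda x: x @ A @ x

for x0 in ([0.1, 0.9, 0.5], [0.2, 0.2, 0.8], [0.7, 0.1, 0.3]):
    res = minimize(lambda x: -p(x), x0, bounds=[(0, 1)] * 3)
    # maximum is (numerically) 0, at a point with x = y = z
    print(round(-res.fun, 6), np.round(res.x, 3))
```

Here p = −((x−y)^2 + (x−z)^2 + (y−z)^2)/2 is concave, so every run reaches the global maximum 0, but the maximizer depends on where the iteration starts.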
Example 2. Let q = −x^2 − y^2 − z^2 + (3/2)(xy + xz + yz). We first use Test 1:
sage: var("x,y,z");
sage: A = matrix([[-1,3/4,3/4],[3/4,-1,3/4],[3/4,3/4,-1]])
sage: A.eigenvalues()
[1/2, -7/4, -7/4]
sage: A.eigenvectors_right()
[(1/2, [(1, 1, 1)], 1), (-7/4, [(1, 0, -1), (0, 1, -1)], 2)]
The matrix A has one positive eigenvalue, λ = 1/2, with an associated eigenvector v = (1,1,1) belonging to R_+^n. Hence, (P) is true. Let us now apply Test 2:
sage: var("x,y,z");
sage: q = -x^2 - y^2 - z^2 + (3/2)*(x*y + x*z + y*z)
sage: sol = minimize_constrained(-q,[[0,1]]*3, [0.1,0.9,0.5])
sage: print("Maximum is", q(*sol), "attained at", sol)
Maximum is 1.5 attained at (1.0, 1.0, 1.0)
Since the maximum is positive, by Test 2, (P) is true.
Rationale for Test 1. If A does not have a positive eigenvalue, then A is negative semidefinite, so p(x) = x^T A x ≤ 0 for all x ∈ R^n. Consequently, (P) is false. Likewise, if there exists a positive eigenvalue λ with an associated eigenvector v ∈ R_+^n, then p(v) = v^T A v = λ v^T v = λ∥v∥^2 > 0. Hence (P) is true.
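For Example 2, the identity p(v) = λ v^T v = λ∥v∥^2 can be confirmed numerically (a NumPy check, using the eigenpair λ = 1/2, v = (1,1,1) found earlier):

```python
import numpy as np

# Matrix of Example 2 and its positive eigenpair
A = np.array([[-1, 0.75, 0.75], [0.75, -1, 0.75], [0.75, 0.75, -1]])
lam, v = 0.5, np.array([1.0, 1.0, 1.0])

p_v = v @ A @ v                       # p(v) = v^T A v
assert np.isclose(p_v, lam * (v @ v)) # equals lambda * ||v||^2 = 3/2
print(p_v)                            # 1.5 > 0, so (P) is true
```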
Rationale for Test 2. Since p is continuous and D is compact, there always exists at least one point x0 ∈ D where p attains its maximum over D, i.e., p(x0) ≥ p(x) for all x ∈ D. Now, if (P) is true, there exists x1 ∈ R_+^n such that p(x1) > 0. Let x2 = x1/∥x1∥, which belongs to D because each coordinate of x1 is positive and at most ∥x1∥. Since p is homogeneous of degree 2, we deduce that
p(x0) ≥ p(x2) = p(x1)/∥x1∥^2 > 0.
Conversely, assume that p(x0) > 0. If x0 belongs to the interior of D, then (P) holds with x = x0, since interior points of D have all coordinates positive. If x0 lies on the boundary of D (so possibly with a null coordinate), then, by continuity of p, there exists a ball B centered at x0 on which p has the sign of p(x0), that is, p is positive on B. Since B contains at least one point x3 in the interior of D, (P) holds with x = x3.
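The step p(x2) = p(x1)/∥x1∥^2 relies only on the degree-2 homogeneity of quadratic forms, p(tx) = t^2 p(x). A quick NumPy check with the matrix of Example 2 and an arbitrarily chosen positive point illustrates it:

```python
import numpy as np

A = np.array([[-1, 0.75, 0.75], [0.75, -1, 0.75], [0.75, 0.75, -1]])
p = lambda x: x @ A @ x

x1 = np.array([2.0, 3.0, 5.0])        # any point of R_+^3
x2 = x1 / np.linalg.norm(x1)          # in D, since 0 < x1_i <= ||x1||
# homogeneity: p(x1/||x1||) = p(x1)/||x1||^2
assert np.isclose(p(x2), p(x1) / np.linalg.norm(x1)**2)
print(p(x1), p(x2))
```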
And what about QuadraticForm?
Sadly, the restriction to positive vectors (a,b,c) is not the kind of thing handled by QuadraticForm...
Is something like this helpful in the given situation?
(There is no maximize_constrained, for whatever reason, so I had to take the opposite of the objective function and call minimize_constrained instead. The starting point was taken far away from (0, 0, 0), for the obvious reason.)