
Solve fails to identify the maximum of the entropy function

asked 2022-08-04 12:25:34 +0100 by luis lastras

The entropy function has its maximum at x = 1/2.

Yet when one uses solve:

x = var('x')
assume(x > 0)
solve(derivative(-x*log(x) - (1-x)*log(1-x), x) == 0, x)

Sage returns something that is correct but curiously non-specific:

[log(x) == log(-x + 1)]

It almost seems as if Sage cannot figure out that x = 1/2 solves the above, even with the assumption on x.
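
A purely numerical check does locate the maximum at x = 1/2; here find_local_maximum returns a pair (maximum value, location of the maximum) on the given interval:

# Numerical sanity check of the claimed maximum
f = -x*log(x) - (1-x)*log(1-x)
find_local_maximum(f, 0, 1)   # roughly (0.6931..., 0.5), i.e. (log(2), 1/2)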


1 Answer


answered 2022-08-04 15:10:15 +0100 by cav_rt

updated 2022-08-04 15:11:04 +0100

Instead, exponentiating both sides of the equation,

solve(e^derivative(-x*log(x) - (1-x)*log(1-x), x) == 1, x)

gives

[x == (1/2)]
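
As an extra check, substituting x = 1/2 back into the first and second derivatives confirms a critical point that is indeed a maximum:

f = -x*log(x) - (1-x)*log(1-x)
f.diff(x).subs(x=1/2)      # first derivative vanishes at x = 1/2
f.diff(x, 2).subs(x=1/2)   # second derivative is -4 < 0, so a maximum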

Comments

Alternatives:

sage: Ex=-(x*log(x)+(1-x)*log(1-x))
sage: solve(Ex.diff(x),x)
[log(x) == log(-x + 1)]

Fails indeed, but:

sage: solve(Ex.diff(x),x, algorithm="sympy")
[x == (1/2)]
sage: solve(Ex.diff(x),x, algorithm="giac")
Warning, argument is not an equation, solving -ln(sageVARx)+ln(-sageVARx+1)=0
[1/2]
sage: solve(Ex.diff(x),x, algorithm="fricas")
[log(x) == log(-x + 1)]

Not directly translatable as of 9.7.beta6:

sage: mathematica.Solve(Ex.diff(x)==0,x)
{{x -> 1/2}}

Somewhat trichodynamic:

sage: foo=solve(Ex.diff(x),x)[0]
sage: foo.operator()(*map(exp, foo.operands())).solve(x)
[x == (1/2)]
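
(In effect, this rebuilds the equation returned by solve with exp applied to each side, via operator() and operands(), so log(x) == log(-x + 1) becomes x == -x + 1, which Maxima then solves without trouble.)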

HTH,

Emmanuel Charpentier (2022-08-04 18:32:40 +0100)
