2019-04-09 00:34:58 -0500 | received badge | ● Popular Question (source) |

2019-04-03 18:03:34 -0500 | received badge | ● Famous Question (source) |

2016-12-12 13:01:25 -0500 | received badge | ● Notable Question (source) |

2016-12-12 13:01:25 -0500 | received badge | ● Popular Question (source) |

2015-12-07 12:41:26 -0500 | received badge | ● Famous Question (source) |

2015-08-19 05:19:22 -0500 | received badge | ● Notable Question (source) |

2015-03-17 02:59:58 -0500 | received badge | ● Good Question (source) |

2015-02-23 01:47:09 -0500 | received badge | ● Notable Question (source) |

2014-11-10 09:44:51 -0500 | received badge | ● Famous Question (source) |

2014-06-29 11:01:01 -0500 | received badge | ● Famous Question (source) |

2014-06-29 11:01:01 -0500 | received badge | ● Popular Question (source) |

2014-06-29 11:01:01 -0500 | received badge | ● Notable Question (source) |

2014-06-28 20:15:03 -0500 | marked best answer | Where is walk.py located after I attach walk.sage? The Sage tutorial Loading and Attaching Sage files states that if I create a file such as walk.sage containing: and then load or attach it: I should get a walk.py file containing Python code in the same directory as walk.sage. I can find no such file (I've even searched the rest of my computer). Does anyone know where the file is? |

2014-06-28 20:15:03 -0500 | marked best answer | How do I get an ordered list of a symbolic function's arguments? How can I get a list/tuple of the variables in a symbolic function with the same ordering as when the function was defined? E.g. for the function below I would want (z, t), not the alphabetically ordered (t, z) I get with .variables() or .arguments(). The ordering has to be stored/used somewhere in Sage, because I can differentiate with respect to z and get D[0](z,t) as an answer, where the '0' corresponds to 'z'. |
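In plain Python the declaration order of a function's parameters can be recovered with the standard inspect module; this is only an analogue of what the question asks for in Sage's symbolic functions (the function f here is a made-up stand-in), but it illustrates that declaration order is recoverable without sorting:

```python
import inspect

# Plain-Python stand-in for a symbolic function defined as f(z, t) = ...
def f(z, t):
    return z**2 + t

# inspect.signature preserves declaration order, unlike an alphabetical sort.
ordered_args = tuple(inspect.signature(f).parameters)
print(ordered_args)  # ('z', 't'), not the alphabetical ('t', 'z')
```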

2014-06-28 20:13:09 -0500 | marked best answer | How do I perform a change of variables for a pde How can I transform, step by step, the partial differential equation (pde) u.diff(z,z), which gives me the following output: |

2014-01-20 15:15:27 -0500 | received badge | ● Popular Question (source) |

2013-11-19 09:38:11 -0500 | received badge | ● Popular Question (source) |

2013-03-27 13:47:49 -0500 | commented answer | sympy codegen with indices To get your answer's code to work I need: 'from sympy.utilities.codegen import codegen' and 'from sympy import Eq' |
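The comment above names the two imports needed for sympy's codegen with indexed expressions. A minimal self-contained sketch (the loop equation y[i] = 2*x[i] is invented for illustration; the original answer's code is not shown in this log):

```python
from sympy import Eq, symbols, IndexedBase, Idx
from sympy.utilities.codegen import codegen

n = symbols('n', integer=True)
i = Idx('i', n)          # loop index with range 0..n-1
x, y = IndexedBase('x'), IndexedBase('y')

# Generate C source for the indexed assignment y[i] = 2*x[i];
# codegen returns (filename, contents) pairs for the .c and .h files.
[(c_name, c_code), (h_name, h_code)] = codegen(
    ('double_it', Eq(y[i], 2 * x[i])),
    language='C', prefix='double_it', header=False, empty=False)
print(c_code)
```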

2013-01-03 20:51:31 -0500 | received badge | ● Notable Question (source) |

2012-07-02 13:14:49 -0500 | received badge | ● Popular Question (source) |

2012-06-12 06:25:27 -0500 | received badge | ● Famous Question (source) |

2011-12-09 06:55:01 -0500 | answered a question | Defining constraint equations for minimize_constrained Thanks @DSM. I've tried to generalize your solution with the following: running Two further questions:
1. Is there a way to determine the number of values returned by a function without actually evaluating the function? If there were I could further generalise |
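On question 1: Python cannot, in general, know how many values a function returns without running it. A common workaround (a sketch, with a made-up fn) is a single probe evaluation at some sample point, after which the wrappers can be generated:

```python
def fn(x):
    # Hypothetical overarching function returning several values.
    return (x, x + 1, x + 2)

# Probe once at a sample point to learn the arity, then build one
# getter per output (i=i freezes the index at definition time).
n_outputs = len(fn(0))
getters = [lambda x, i=i: fn(x)[i] for i in range(n_outputs)]
print(n_outputs, [g(10) for g in getters])  # 3 [10, 11, 12]
```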

2011-12-08 08:07:44 -0500 | commented answer | Defining constraint equations for minimize_constrained On a different note, within get2 I did try to use a list construction to specify my constraints: [lambda x: fn(x)[i] for i in range(1,5)]. This gave the answer (739.437607818, -608.579643471) for the non-cached version and still gave errors for the cached version; not sure how/why that approach stuffed up. |
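The list-construction failure described above is very likely Python's late-binding closure behaviour: every lambda in [lambda x: fn(x)[i] for i in range(1,5)] closes over the same variable i, which already holds its final value by the time any of the lambdas is called. A minimal demonstration, with a toy fn standing in for the real one:

```python
def fn(x):
    # Toy stand-in returning five values.
    return [x, x + 1, x + 2, x + 3, x + 4]

# Late binding: all four lambdas see the final value of i (== 4),
# so they all index the same element.
bad = [lambda x: fn(x)[i] for i in range(1, 5)]
print([c(0) for c in bad])   # [4, 4, 4, 4]

# Fix: freeze i at definition time with a default argument.
good = [lambda x, i=i: fn(x)[i] for i in range(1, 5)]
print([c(0) for c in good])  # [1, 2, 3, 4]
```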

2011-12-08 08:02:16 -0500 | answered a question | Defining constraint equations for minimize_constrained @DSM I can successfully run your example, but if I adapt your solution to the example in the minimize_constrained documentation then I get an error for the cached version. For the non-cached version I get the correct result (45.0, 6.25). If I run |

2011-12-07 11:50:50 -0500 | asked a question | Defining constraint equations for minimize_constrained I'm trying to think of a way to do constrained optimization where both the objective function that I want to minimise and my constraint function are calculated within the same overarching function. Let's say I have a function like the following: Is there a way to minimize f by changing x,y,z subject to g>=0? Looking at the documentation for But that would mean On my internet wanderings I found the recently (June 6, 2011) released pyOpt 1.0 (journal article) which at first glance looks well suited to the problem. I see OpenOpt is an experimental package for Sage. I'm not sure if OpenOpt is suitable; the pyOpt documentation is, at first glance, clearer. Any chance pyOpt could be made an optional package for Sage? It's published under the GNU Lesser General Public License. |
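The pattern asked about, sharing one expensive evaluation between the objective and the constraint, can be sketched in plain Python with functools.lru_cache (the model function here is invented for illustration; the resulting wrappers are what one would hand to minimize_constrained or any other optimizer):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def model(params):
    # One overarching computation returning (objective, constraint).
    # Cached, so calling both wrappers at the same point costs one evaluation.
    x, y, z = params
    f = (x - 1) ** 2 + (y - 2) ** 2 + z ** 2   # to be minimised
    g = x + y + z - 1                          # feasible when g >= 0
    return f, g

# Wrappers share the cached evaluation; params must be hashable (a tuple).
objective = lambda p: model(tuple(p))[0]
constraint = lambda p: model(tuple(p))[1]

print(objective((1, 2, 0)), constraint((1, 2, 0)))  # 0 2
```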

2011-12-02 18:33:31 -0500 | marked best answer | How do I perform a change of variables for a pde With some insights from the following posts: substitute expression instead of formal function symbol How do I get an ordered list of a symbolic function's arguments? Call a function with argument list in python (non-sage link) I've come up with a function, transform_func_x(expr,f,rels), to change the dependent variables of an expression. Below is the function itself and, below that, how I've used it (all copied and pasted from a Sage notebook). Any suggested improvements are welcome. |

2011-12-02 18:33:28 -0500 | received badge | ● Nice Question (source) |

2011-11-28 09:54:48 -0500 | received badge | ● Commentator |

2011-11-28 09:54:48 -0500 | commented question | solve gives 1, 2, or 3 answers depending if one value in my equation is a real, rational, or integer In addition to my first comment to @kcrisman: for the case where g = 1/4, if I run solve() on the incomplete answer that solve() originally gave me, then I get the correct answer: solve(1/2*sqrt(4*x + 1) == h, x) gives x == h^2 - 1/4. Why doesn't solve() give me that answer in the first place? |

2011-11-28 09:49:11 -0500 | commented question | solve gives 1, 2, or 3 answers depending if one value in my equation is a real, rational, or integer Also, in my original question, if I replace sol2 = solve(eqs[1].subs(sol1[0]),x_0) with sol2 = solve(eqs[1].subs(sol1[0]).expand(),x_0), i.e. expand before solving, I get x_0 = some function of x_0, which is wrong. |

2011-11-28 09:27:14 -0500 | commented question | solve gives 1, 2, or 3 answers depending if one value in my equation is a real, rational, or integer @kcrisman here's a smaller example that starts misbehaving for reals:
reset()
forget()
var('h,x,g')
assume(h > 0)
print(solve(sqrt(x+g)==h,x)[0])
print(solve(sqrt(x+1)==h,x)[0])
print(solve(sqrt(x+2.0)==h,x)[0])
print(solve(sqrt(x+1/4)==h,x)[0])
print(solve(sqrt(x+2.5)==h,x)[0])
myrr.<rr> = PolynomialRing(RR)
print(solve(sqrt(x+rr)==h,x)[0])
which gives:
x == h^2 - g
x == h^2 - 1
x == h^2 - 2
1/2*sqrt(4*x + 1) == h
1/2*sqrt(2*x + 5)*sqrt(2) == h
and a traceback error on the last example with rr (not enough space in the comment to show). Other polynomial rings (CC, ZZ, QQ) gave the correct answer. |

2011-11-25 13:10:09 -0500 | asked a question | solve gives 1, 2, or 3 answers depending if one value in my equation is a real, rational, or integer Any idea why solve() is giving me different answers depending on the value of 'g' in my equations (see code below)? If g is a variable, an integer, or 1.0 then I get the two correct solutions (i.e. ± a sqrt). If g is 1/4 or 1.1 I get an incomplete answer, from which I can recover the two correct solutions by running solve again on the returned equation. If g is a real whole number like 2.0 or 3.0 then I get 3 solutions: the two correct ones and a spurious one. I only discovered this behaviour when I tried running this. I get: |
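Sage's solve() delegates to Maxima, whose handling of floating-point and rational coefficients can differ from exact integers. As a hedged cross-check (sympy is a different engine, so this only shows what the consistent closed form should look like), sympy returns the same shape of answer whether the constant is an integer, a rational, or a float:

```python
from sympy import symbols, sqrt, solve, Rational

x = symbols('x')
h = symbols('h', positive=True)

# Integer, rational, and float versions of g in sqrt(x + g) == h.
sol_int = solve(sqrt(x + 1) - h, x)
sol_rat = solve(sqrt(x + Rational(1, 4)) - h, x)
sol_flt = solve(sqrt(x + 1.0) - h, x)
print(sol_int, sol_rat, sol_flt)
```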

2011-11-17 14:03:32 -0500 | marked best answer | Where is walk.py located after I attach walk.sage? (Edit: I originally thought that there wasn't a file walk.py, but I think that's wrong.) Look in If you want to produce a more permanent file |

Copyright Sage, 2010. Some rights reserved under creative commons license. Content on this site is licensed under a Creative Commons Attribution Share Alike 3.0 license.