How to create a constrained symbolic vector
I am trying to emulate the following Mathematica code, which returns the desired result: https://www.dropbox.com/s/awk0dj5po9k... The part I can't figure out is how to constrain a symbolic vector so that its entries sum to 1 while still retaining its vector form. Here's what I have so far, but it doesn't work:
p1,p2,p3=var('p1,p2,p3')
A1=matrix(QQ,[[1,-1],[-1,1]])
sum1=p1+p2
sum1=1
P1=matrix(SR,[p1,p2])
result1=P1*A1*P1.transpose()
result1.simplify_full()
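(Editorial note: the code above runs, but the line sum1=1 only rebinds the Python name sum1; it does not impose p1 + p2 = 1 on the symbolic expression. A minimal Sage sketch of one possible way to impose the constraint, by substituting p2 == 1 - p1 into the quadratic form before simplifying, is shown below; this is an illustration, not the code from the linked Mathematica notebook.)

p1, p2 = var('p1, p2')
A1 = matrix(QQ, [[1, -1], [-1, 1]])
P1 = matrix(SR, [[p1, p2]])                  # 1x2 symbolic row vector
Q = (P1 * A1 * P1.transpose())[0, 0]         # scalar quadratic form (p1 - p2)^2
Q_constrained = Q.substitute(p2 == 1 - p1)   # apply the constraint p1 + p2 = 1
print(Q_constrained.simplify_full())         # 4*p1^2 - 4*p1 + 1, i.e. (2*p1 - 1)^2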
Hi @Bsamuels
I tried to understand what you wanted to do, but what I wrote below is probably not it; you should give more detail. Tell us what is wrong with my (probably unfortunate) attempt below at figuring out what you wanted.
There is a 101 010 button (in the header when you write a message) to mark up the code part; please use it.
Oops! The first time I clicked on your link I did not see the Mathematica code. Disregard my comment above. Sorry.
I must add that, upon review, my code does what is expected, but the line of code you added to normalize is extremely helpful. Thank you.
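(Editorial note: the normalization line referred to above is not reproduced in this thread excerpt. A hypothetical equivalent in Sage would divide the symbolic vector by the sum of its entries so that they sum to 1; the names below are illustrative only.)

p1, p2 = var('p1, p2')
P1 = matrix(SR, [[p1, p2]])
P1_normalized = P1 / sum(P1.list())   # entries become p1/(p1+p2), p2/(p1+p2)
print(P1_normalized.simplify_full())  # entries now sum to 1 symbolically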