# How do I enter an imaginary number into a matrix?

In a very simple example, I am creating a 2 by 2 matrix like this:

```
var('i')
ran40 = matrix(QQ,2,2,[[2*i,-2],[3,4]])
show(ran40)
```

I eventually need to call `show(ran40.rref())` so that I can see the reduced row echelon form of the above matrix, but I get syntax errors when creating the matrix.

How do I simply put the imaginary unit *i* into a matrix so that it can actually be computed on? That is, so that the imaginary part is actually used in the RREF computation.
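For reference, here is the kind of thing I am trying to get working (a sketch of what I believe should fix it: Sage's built-in imaginary unit is the capital `I`, while `var('i')` only creates a symbolic variable that the rationals `QQ` cannot hold, so the base ring also needs to be one that contains `I`, such as the Gaussian rationals `QQ[I]`):

```
# Use Sage's built-in imaginary unit I (capital), not a symbolic var('i'),
# and a base ring that actually contains it: the Gaussian rationals QQ[I]
# instead of QQ.
ran40 = matrix(QQ[I], 2, 2, [[2*I, -2], [3, 4]])
show(ran40)
show(ran40.rref())
```

Since `QQ[I]` is a field, `rref()` should work over it the same way it does over `QQ`.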

See also the question (and answer) at http://stackoverflow.com/questions/35...