My problem is this: I'm attempting a spectral decomposition of a random process via a (truncated) Karhunen-Loeve transform, but my covariance matrix is actually a 1-parameter family of matrices, and I need a way to estimate / visualize how my random process depends on this parameter. To do this, I need a way to track the eigenvectors produced by numpy.linalg.eigh().
To give you an idea of my issue, here's a sample toy problem: Suppose I have a set of points {xs}, and a random process R with covariance C(x,y) = 1/(1+a*(x-y)^2) which depends on the parameter a. For a random sample of grid points in the range [0,1] and a given choice of a (say, a=1), I can populate my covariance matrix and implement the Karhunen-Loeve transform using:
num_x = 1000
xs = numpy.random.uniform(size=num_x)
a0 = 1
def cov(x, y, a=a0):
    # ** for exponentiation (^ is bitwise XOR in Python)
    return 1/(1 + a*(x - y)**2)
# list comprehension instead of map(), which is a lazy iterator in Python 3
cov_m = numpy.array([[cov(x, y) for x in xs] for y in xs])
w, v = numpy.linalg.eigh(cov_m)
z = numpy.random.standard_normal(num_x)
R = numpy.dot(v, z*w**0.5)
This gives me a realization of R with values defined at each of my random grid points xs. What I need to be able to do, however, is to track, for a specific realization (that is, a specific choice of my grid xs and my random coefficients z), how R varies with respect to the parameter a in my covariance function.
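To make this concrete, here is a minimal self-contained sketch of the naive approach: recompute the decomposition for each value of a while reusing the same grid xs and coefficients z. (The helper name realization and the smaller grid size are mine, for illustration only.)

```python
import numpy

numpy.random.seed(0)           # reproducible toy example
num_x = 200                    # smaller grid than above, for speed
xs = numpy.random.uniform(size=num_x)
z = numpy.random.standard_normal(num_x)

def realization(a, xs, z):
    """Recompute the KL sum R = sum_i z_i * sqrt(w_i) * v_i for one value of a."""
    d = xs[:, None] - xs[None, :]
    w, v = numpy.linalg.eigh(1/(1 + a*d**2))
    # Clip tiny negative eigenvalues produced by round-off before the sqrt.
    return v @ (z * numpy.sqrt(numpy.clip(w, 0, None)))

R1 = realization(1.0, xs, z)
R2 = realization(1.1, xs, z)
```

Because eigh sorts by eigenvalue, R1 and R2 can differ far more than the small change in a would suggest: whenever eigenvalues cross, the coefficients z[i] end up attached to different modes.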
This would be easy to do if I could compute the covariance matrix symbolically and specify a after the fact; for large matrices, though, that is not a plausible option. The alternative is to find a way to keep track of each eigenvector returned by numpy.linalg.eigh(). Unfortunately, numpy.linalg.eigh() always returns the eigenvalues in ascending order, so when I vary a and eigenvalues cross, the eigenvectors change position in the output, and the dot product numpy.dot(v, z*w**0.5) no longer assigns the same coefficient z[i] to the same eigenvector.
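One way to diagnose the reordering (a sketch of my own, not part of the code above): compare the eigenvector bases at two nearby values of a via their overlap matrix. If the ordering were stable, the overlaps would concentrate on the diagonal; rows whose maximum sits off the diagonal are modes that eigh has moved.

```python
import numpy

numpy.random.seed(0)
num_x = 200
xs = numpy.random.uniform(size=num_x)

def cov_matrix(a):
    d = xs[:, None] - xs[None, :]
    return 1/(1 + a*d**2)

w1, v1 = numpy.linalg.eigh(cov_matrix(1.0))
w2, v2 = numpy.linalg.eigh(cov_matrix(1.1))

# overlap[i, j] = |<v1[:, i], v2[:, j]>|: for each a=1.0 eigenvector,
# how much it projects onto each a=1.1 eigenvector.
overlap = numpy.abs(v1.T @ v2)
perm = overlap.argmax(axis=1)   # index of the a=1.1 mode closest to each a=1.0 mode
```

Re-indexing the second basis as v2[:, perm] (and w2[perm], with a sign fix from the sign of the matched inner product) would keep each z[i] attached to the same mode, though perm can fail to be a true permutation when eigenvalues are nearly degenerate, as the small eigenvalues of this kernel are.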
Is there a way around this?