Gaussian distribution fit

Hello! I have a problem fitting my data set with a Gaussian (normal) distribution.

data = [[90.00, 2.0], [97.40, 5.0], [104.8, 14.0], [112.2, 12.0], [119.6, 11.0], \
        [127.0, 6.0], [134.4, 3.0], [141.8, 1.0], [149.2, 2.0], [156.6, 1.0]]

var('sigma mu max x')
model(x) = max*(1/sqrt(2*pi*sigma**2))*exp(-(x-mu)**2/(2*sigma**2))

find_fit(data, model)

This gives the result:

[sigma == 1.0, mu == 1.0, max == 1.0]

I also tried using a Python function instead of the symbolic expression:

var('sigma mu max x')
def model(x,sigma,mu,max):
    return max*(1/(sigma*sqrt(2*pi)))*exp(-(x-mu)**2/(2*sigma**2))

find_fit(data, model, parameters=[sigma, mu, max], variables=[x])

and obtained the very same result:

[sigma == 1.0, mu == 1.0, max == 1.0]

Why does it give this (obviously incorrect) result, and what is the right way to do the fit?
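
My suspicion is that find_fit starts every parameter at 1.0 by default, which is far from the actual values (the peak is near x = 105), so the optimizer never moves away from the starting point. Would passing an explicit starting point via the initial_guess argument be the right fix? Below is a sketch of what I mean; the starting values are just rough reads from the data, and I have renamed max to amp to avoid shadowing Python's built-in max (not sure whether that matters):

var('sigma mu amp x')
model(x) = amp*(1/sqrt(2*pi*sigma**2))*exp(-(x - mu)**2/(2*sigma**2))

# rough starting values read off the data: the histogram peaks near
# x = 105, has a spread of roughly 15, and holds about 57 counts in
# bins of width 7.4, so the area amp is on the order of a few hundred
find_fit(data, model, initial_guess=[15.0, 105.0, 400.0],
         parameters=[sigma, mu, amp], variables=[x])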

Thanks.