I've been trying to crack this problem and genuinely haven't found an example, or anything else, that my brain can use to move forward.
The goal is to fit a model Gaussian curve by minimizing the total chi-squared between the real data and the model produced from unknown parameters that need sensible estimates (the Gaussian has unknown position, amplitude, and width). scipy.optimize.fmin has come up, but I've never used it before and I'm still fairly new to Python...
Ultimately I'd like to plot the original data together with the model. I've used pyplot before; it's just that generating the model and using fmin has me completely lost. Basically, this is all I have:
def gaussian(a, b, c, x):
    return a*np.exp(-(x-b)**2/(2*c**2))
I've seen multiple ways of generating a model, which confuses me, hence I have no code! I imported my data file via np.loadtxt.
Thanks to anyone who can suggest a framework or otherwise help.
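(For context, np.loadtxt mentioned above reads whitespace-separated numeric columns from a text file. A minimal sketch, assuming a plain two-column file of x and y values; the filename and data here are made up for illustration:)

```python
import numpy as np

# write a tiny two-column example file so this snippet is self-contained
with open('example_data.txt', 'w') as f:
    f.write("0.0 1.2\n1.0 1.9\n2.0 3.1\n")

# unpack=True returns each column as its own 1-D array
x, y = np.loadtxt('example_data.txt', unpack=True)
```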
Answer 0 (score: 1)
A model-fitting problem like this basically involves four main steps: defining the model, defining an objective (loss) function to minimize, choosing an initial guess for the parameters, and running the optimizer. (A fifth, optional step is plotting the fit against the data.)
Here's a practical example to get you started:
import numpy as np
from scipy.optimize import minimize
from matplotlib import pyplot as pp

# function that defines the model we're fitting
def gaussian(P, x):
    a, b, c = P
    return a*np.exp(-(x - b)**2 / (2*c**2))

# objective function to minimize: total sum of squared residuals
def loss(P, x, y):
    yhat = gaussian(P, x)
    return ((y - yhat)**2).sum()

# generate a Gaussian with known parameters
amp = 1.3543
pos = 64.546
wid = 12.234
P_real = np.array([amp, pos, wid])

# we use the vector of real parameters to generate our fake data
x = np.arange(100)
y = gaussian(P_real, x)

# add some Gaussian noise to make things harder
y_noisy = y + np.random.randn(y.size)*0.5

# minimize needs an initial guess at the model parameters
P_guess = np.array([1, 50, 25])

# minimize provides a unified interface to all of scipy's solvers. you
# can also access them individually in scipy.optimize, but the
# standalone versions have annoying differences in their syntax. for now
# we'll use the Nelder-Mead solver, which doesn't use the Jacobian*. we
# also need to hand it x and y_noisy as additional args to loss()
res = minimize(loss, P_guess, method='Nelder-Mead', args=(x, y_noisy))

# res is an OptimizeResult object holding the results of the
# optimization. in particular we want the optimized model parameters:
P_fit = res.x

# we can pass these to gaussian() to evaluate our fitted model
y_fit = gaussian(P_fit, x)

# now let's plot the results. separate plot calls draw on the same axes
# by default, so ax.hold() is unnecessary (and has been removed from
# recent matplotlib versions):
fig, ax = pp.subplots(1, 1)
ax.plot(x, y, '-r', lw=2, label='Real')
ax.plot(x, y_noisy, '-k', alpha=0.5, label='Noisy')
ax.plot(x, y_fit, '--b', lw=5, label='Fit')
ax.legend(loc=0, fancybox=True)
pp.show()
*Some solvers, such as conjugate gradient, take the Jacobian as an additional argument, and in general these solvers are faster and more robust. However, if you're feeling lazy and performance doesn't matter that much, you can usually get away without providing a Jacobian, in which case the solver will estimate the gradient using finite differences.
You can read more about the different solvers here.
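To illustrate the footnote, here is a sketch of supplying an analytic Jacobian to the conjugate-gradient solver via minimize's jac argument. The model and loss are redefined so the snippet stands alone, and the gradient expressions are derived by hand for this particular Gaussian model:

```python
import numpy as np
from scipy.optimize import minimize

# same model and loss as in the answer above
def gaussian(P, x):
    a, b, c = P
    return a * np.exp(-(x - b)**2 / (2 * c**2))

def loss(P, x, y):
    return ((y - gaussian(P, x))**2).sum()

def loss_grad(P, x, y):
    # analytic gradient of the sum-of-squares loss w.r.t. (a, b, c)
    a, b, c = P
    e = np.exp(-(x - b)**2 / (2 * c**2))
    r = y - a * e                      # residuals
    da = -2 * (r * e).sum()
    db = -2 * (r * a * e * (x - b) / c**2).sum()
    dc = -2 * (r * a * e * (x - b)**2 / c**3).sum()
    return np.array([da, db, dc])

# noiseless data from known parameters, as in the answer
P_real = np.array([1.3543, 64.546, 12.234])
x = np.arange(100)
y = gaussian(P_real, x)

P_guess = np.array([1.0, 50.0, 25.0])
res = minimize(loss, P_guess, method='CG', jac=loss_grad, args=(x, y))
print(res.x)   # should land close to P_real
```

With an exact gradient, CG typically needs far fewer function evaluations than Nelder-Mead; the trade-off is the effort of deriving (and debugging) the gradient by hand.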