Fitting a Gaussian mixture model to spectral data with scipy.optimize.least_squares

Asked: 2018-05-30 20:54:20

Tags: python-3.x scipy gaussian minimization mixture-model

I am trying to fit a sum of 3 Gaussians to experimental data using scipy's minimize function.

I need help passing the objective function correctly, since that is where the error seems to point. I am admittedly new to Python.

The scipy.minimize documentation has no clear example for this case, and earlier questions about GMMs are poorly formulated. Please advise...

import numpy
from matplotlib import pyplot
from scipy.optimize import minimize, rosen_der, rosen_hess

x = numpy.arange(0, 55)
y = [-0.00058032, -0.00063992, -0.00057869, -0.00058395, -0.00053528,  -0.0002694, -0.0003716, -0.000284,    0.00104651,  0.00209935,  0.00360213,  0.00502779,  0.00625538,  0.00715873,   0.00753231,  0.00712235,  0.00689089,  0.0061677,  0.00544124,  0.00478251,  0.00487787,  0.00415067,  0.00368579,  0.00370327,  0.00323007,  0.0029862,   0.00250529,  0.00219493,  0.00212242,  0.00209026,  0.0020827,  0.00204044,  0.00218628,  0.00236552,  0.00245056,  0.00282404,  0.0031072,   0.00332862,  0.00351655,  0.00367349,  0.00387923,  0.00395812,  0.00388796,  0.00379902,  0.00369458,  0.00350222,  0.00337815,  0.0032241,  0.00306897,  0.00294152,  0.00276761,  0.00257586,  0.00231613,  0.00211727,  0.00190347]

# experimental data: y
# objective function that is to be minimized: G1 + G2 + G3 - y
def sumGauss(x, y, *args):
    m1, m2, m3, s1, s2, s3, k1, k2, k3 = args
    ret = -y
    ret += k1 * numpy.exp(-(x - m1)**2 / (2 * s1**2))
    ret += k2 * numpy.exp(-(x - m2)**2 / (2 * s2**2))
    ret += k3 * numpy.exp(-(x - m3)**2 / (2 * s3**2))
    return ret

initial_values = [15, 29, 43, 1, 1, 1, 1, 1, 1]

res = minimize(sumGauss(x, y), initial_values, method='trust-exact',
                jac=rosen_der, hess=rosen_hess,
                options={'gtol': 1e-8, 'disp': True})

Here is the error message:

Traceback (most recent call last):
  File "fit_gaussian.py", line 64, in <module>
    res = minimize(sumGauss(x, y), params, method='trust-exact',
  File "fit_gaussian.py", line 41, in sumGauss
    m1, m2, m3, s1, s2, s3, k1, k2, k3 = args
ValueError: not enough values to unpack (expected 9, got 0)

1 Answer:

Answer 0 (score: 2):

Only a few things need to change. If you want to fit data, it is better to use least_squares:

from scipy.optimize import least_squares

Make y a numpy array so you can perform arithmetic on it:

y = numpy.array(y)

Do not unpack args; it should stay a single input. Optimization functions generally operate on the first input argument, so move it to the front:

def sumGauss(args, x, y):
    m1, m2, m3, s1, s2, s3, k1, k2, k3 = args
    ret = -y
    ret += k1 * numpy.exp(-(x - m1)**2 / (2 * s1**2))
    ret += k2 * numpy.exp(-(x - m2)**2 / (2 * s2**2))
    ret += k3 * numpy.exp(-(x - m3)**2 / (2 * s3**2))
    return ret

Then run the optimization. Those Jacobians (rosen_der, rosen_hess) belong to a different function (the Rosenbrock function), so do not use them. Pass the extra arguments your function needs as a tuple through the keyword args:

res = least_squares(sumGauss, initial_values, method='trf', args=(x, y))

For plotting: res.fun is the residual, and res.x holds your optimized parameters.
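Putting the pieces of the answer together, a minimal runnable version of the corrected script might look like this (the reconstruction of the fitted curve from the residual is an addition for illustration, not part of the original answer):

```python
import numpy
from scipy.optimize import least_squares

x = numpy.arange(0, 55)
y = numpy.array([
    -0.00058032, -0.00063992, -0.00057869, -0.00058395, -0.00053528,
    -0.0002694, -0.0003716, -0.000284, 0.00104651, 0.00209935,
    0.00360213, 0.00502779, 0.00625538, 0.00715873, 0.00753231,
    0.00712235, 0.00689089, 0.0061677, 0.00544124, 0.00478251,
    0.00487787, 0.00415067, 0.00368579, 0.00370327, 0.00323007,
    0.0029862, 0.00250529, 0.00219493, 0.00212242, 0.00209026,
    0.0020827, 0.00204044, 0.00218628, 0.00236552, 0.00245056,
    0.00282404, 0.0031072, 0.00332862, 0.00351655, 0.00367349,
    0.00387923, 0.00395812, 0.00388796, 0.00379902, 0.00369458,
    0.00350222, 0.00337815, 0.0032241, 0.00306897, 0.00294152,
    0.00276761, 0.00257586, 0.00231613, 0.00211727, 0.00190347])

def sumGauss(args, x, y):
    # Residual of a sum of three Gaussians against the data y.
    # The parameter vector comes first so least_squares can vary it.
    m1, m2, m3, s1, s2, s3, k1, k2, k3 = args
    ret = -y
    ret += k1 * numpy.exp(-(x - m1)**2 / (2 * s1**2))
    ret += k2 * numpy.exp(-(x - m2)**2 / (2 * s2**2))
    ret += k3 * numpy.exp(-(x - m3)**2 / (2 * s3**2))
    return ret

# Rough guesses for the three peak positions, widths, and amplitudes.
initial_values = [15, 29, 43, 1, 1, 1, 1, 1, 1]

# Extra arguments (x, y) are forwarded to sumGauss via the args keyword.
res = least_squares(sumGauss, initial_values, method='trf', args=(x, y))

# res.fun is the residual at the solution, so residual + y is the model curve.
fit = res.fun + y
print(res.x)
```

The fitted curve could then be compared against the data with, e.g., `pyplot.plot(x, y, 'o'); pyplot.plot(x, fit)`.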