Retrieving the minimum error from SciPy's minimize function

Date: 2018-07-17 15:24:21

Tags: python scipy minimize

After scipy.optimize.minimize converges, is there a way to retrieve the minimized error directly, or does that have to be coded into the cost function itself?

I am only able to retrieve the coefficients that the optimizer converged to.

# HoltWinters is assumed to be the user-defined Holt-Winters model class from the
# original script (with a triple_exponential_smoothing() method and a result list).
# mean_squared_log_error is assumed to be sklearn.metrics.mean_squared_log_error.
import numpy as np
import scipy.optimize
from sklearn.metrics import mean_squared_log_error

def errorFunction(params, series, loss_function, slen=12):
    # Mean hold-out error of a Holt-Winters model for the given smoothing parameters.
    alpha, beta, gamma = params
    breakUps = int(len(series) / slen)
    end = breakUps * slen
    test = series[end:]
    errors = []

    for i in range(2, breakUps + 1):
        # Fit on the first i seasons, then score the forecast against the hold-out set.
        model = HoltWinters(series=series[:i * 12], slen=slen,
                            alpha=alpha, beta=beta, gamma=gamma, n_preds=len(test))
        model.triple_exponential_smoothing()
        predictions = model.result[-len(test):]
        actual = test
        error = loss_function(predictions, actual)
        errors.append(error)
    return np.mean(np.array(errors))

# x (the initial guess for alpha, beta, gamma) and train (the training series)
# are defined elsewhere in the original script.
opt = scipy.optimize.minimize(errorFunction, x0=x,
                              args=(train, mean_squared_log_error),
                              method="L-BFGS-B",
                              bounds=((0, 1), (0, 1), (0, 1)))
# gets the converged values
optimal_values = opt.x
# I would like to know the error errorFunction gives at opt.x without having to run the script again manually.
# Is the minimum error stored somewhere in the returned object opt?

1 Answer:

Answer 0 (score: 0)

From what I know of the function scipy.optimize.minimize, the result is returned as an OptimizeResult object.

According to the documentation for that class (here), it has an attribute fun, which is the "value of the objective function".

So opt.fun should give you the result you want. (You can retrieve more values as well, such as the Jacobian opt.jac and the Hessian opt.hess, depending on the solver, as described in the documentation.)
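
As a minimal, self-contained sketch (using a toy quadratic objective in place of the question's Holt-Winters cost, since HoltWinters is not shown here), this is where the converged objective value and related quantities live on the returned OptimizeResult:

import numpy as np
import scipy.optimize

def objective(params):
    # Toy stand-in for errorFunction: minimum value 0 at (0.3, 0.5, 0.7).
    return np.sum((params - np.array([0.3, 0.5, 0.7])) ** 2)

opt = scipy.optimize.minimize(objective, x0=np.zeros(3),
                              method="L-BFGS-B",
                              bounds=((0, 1), (0, 1), (0, 1)))

print(opt.x)         # converged parameters (analogous to opt.x in the question)
print(opt.fun)       # objective value at opt.x -- the "minimum error" being asked about
print(opt.jac)       # gradient estimate at the solution
print(opt.hess_inv)  # for L-BFGS-B this is an inverse-Hessian approximation, not opt.hess

In other words, for the code in the question, opt.fun is the value of errorFunction(opt.x, train, mean_squared_log_error) at the converged parameters, with no need to call it again by hand.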