scipy.optimize.minimize does not update the weight parameters

Asked: 2017-12-05 16:17:23

Tags: python numpy optimization scipy

Below is the code I use to update the beta vector, which is randomly initialized. However, after running the optimizer the beta vector never changes, and the cost does not decrease either. Am I missing something?

# Imports needed to run this snippet (features, spread, target, and
# training_mask are defined elsewhere in my code):
from itertools import compress
from random import random

import numpy as np
from scipy import optimize

def objective_func(beta, features, spread, target):
    res = 0
    for i, feats in enumerate(features):
        W = feats.dot(beta)  # Series
        weighted_spread = W * spread[i]  # Series
        normalization = weighted_spread.sum()
        res += (target[i] - (weighted_spread / normalization).sum()) ** 2
    print(res)  # It always prints the same value
    return res

initial_beta = np.array([random() for _ in range(features[0].shape[1])])
print(initial_beta)
res = optimize.minimize(objective_func, x0=initial_beta,
                        args=(list(compress(features, training_mask)),
                              list(compress(spread, training_mask)),
                              list(compress(target, training_mask))),
                        method='L-BFGS-B',
                        callback=True)
print(initial_beta)
print(res.x)
print(res.success)
print(res.status)
print(res.message)
print(res.nit)

Console output:

[ 0.03935521  0.45679144  0.45673816  0.56107001]  # initial_beta before minimizing
228.625  # value printed by objective_func on every call
228.625
228.625
228.625
228.625
[ 0.03935521  0.45679144  0.45673816  0.56107001]  # initial_beta after minimizing
[ 0.03935521  0.45679144  0.45673816  0.56107001]  # res.x
True  # res.success
0  # res.status
b'CONVERGENCE: NORM_OF_PROJECTED_GRADIENT_<=_PGTOL'  # res.message
0  # res.nit
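The output pattern above (nit == 0, res.x identical to x0, and a PGTOL convergence message) is exactly what L-BFGS-B reports when the objective does not depend on the parameters at all: the finite-difference gradient is zero everywhere, so the projected-gradient test passes before the first iteration. A minimal sketch with an assumed constant objective reproduces the symptom:

```python
import numpy as np
from scipy import optimize

def constant_objective(x):
    # Independent of x, like the buggy objective in the question.
    return 228.625

x0 = np.array([0.1, 0.2, 0.3, 0.4])
res = optimize.minimize(constant_objective, x0=x0, method='L-BFGS-B')

print(res.nit)      # zero iterations: gradient is identically zero
print(res.success)  # reported as converged
print(res.x)        # unchanged from x0
```

So the "convergence" in the question is real from the optimizer's point of view; the problem is that the cost function itself is flat.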

1 answer:

Answer 0 (score: 0)

I just figured out that normalization = weighted_spread.sum() should be normalization = W.sum(). It was a math problem: dividing weighted_spread by its own sum means (weighted_spread / normalization).sum() is always exactly 1, regardless of beta. The objective is therefore constant, its gradient is zero, and L-BFGS-B declares convergence without taking a single step.
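This can be checked directly: with the buggy normalization the per-sample term is 1 for any beta, while normalizing by W.sum() actually varies with beta. The toy data below is assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
feats = rng.random((5, 4))     # stand-in for one element of `features`
spread_i = rng.random(5)       # stand-in for the matching spread[i]

beta_a = rng.random(4)
beta_b = rng.random(4)

for beta in (beta_a, beta_b):
    W = feats.dot(beta)
    weighted_spread = W * spread_i
    # Buggy line: normalizing by the vector's own sum.
    buggy = (weighted_spread / weighted_spread.sum()).sum()
    assert np.isclose(buggy, 1.0)  # identical for every beta

# Corrected normalization: the result now depends on beta.
def normalized_sum(beta):
    W = feats.dot(beta)
    return (W * spread_i / W.sum()).sum()

print(normalized_sum(beta_a))
print(normalized_sum(beta_b))
```

With the corrected line, the objective has a nonzero gradient, so the optimizer can actually move beta.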