I implemented unconstrained minimization of a 2D function. The code is quite simple (included mainly for reference, in case someone finds it helpful in the future): my function optimize(f, df, hess_f, method) looks like this:
import numpy as np
import matplotlib.pyplot as pt

def optimize(f, df, hess_f, method):
    # prepare contour plot
    fig = pt.figure()
    xmesh, ymesh = np.mgrid[-5:5:100j, -5:5:100j]
    fmesh = f(np.array([xmesh, ymesh]))
    pt.axis("equal")
    pt.contour(xmesh, ymesh, fmesh, 250)

    # initial guess + first update of search direction
    guesses = [np.array([5, 0.1])]
    x = guesses[-1]
    s = search_direction(df, hess_f, x, method)

    not_done = True
    while not_done:
        # calculate step size using backtracking line search
        alpha_opt = backtracking_alpha(f, df, 0.5, 0.5, x, s)
        # take the step
        next_guess = x + alpha_opt * s
        guesses.append(next_guess)

        # plot the current step on the updating contour graph
        it_array = np.array(guesses)
        pt.plot(it_array.T[-2], it_array.T[-1], "-")

        # check stopping condition
        if np.linalg.norm(guesses[-2] - guesses[-1]) < 0.0001:
            not_done = False

        # prepare the next guess according to the search direction method
        x = guesses[-1]
        s = search_direction(df, hess_f, x, method)

    pt.show()
    print("method {2} converged to: {0}, in {1} iterations".format(x, len(guesses), method))
We get:
method 0 converged to: [ 1.37484167e-04 -8.24905001e-06], in 22 iterations
method 1 converged to: [ 0. 0.], in 3 iterations
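(search_direction and backtracking_alpha are omitted above; a minimal sketch consistent with the calls in optimize — assuming method 0 means steepest descent, method 1 means a Newton step, and the two 0.5 arguments are the Armijo constant and the backtracking shrink factor — would be:)

def search_direction(df, hess_f, x, method):
    # assumed convention: method 0 = steepest descent, method 1 = Newton step
    if method == 0:
        return -df(x)
    return -np.linalg.solve(hess_f(x), df(x))

def backtracking_alpha(f, df, c, rho, x, s):
    # Armijo backtracking: shrink alpha until the sufficient-decrease condition holds
    alpha = 1.0
    while f(x + alpha * s) > f(x) + c * alpha * np.dot(df(x), s):
        alpha *= rho
    return alpha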
The only thing that really bothers me is that my optimize function explicitly requires not only the function to minimize (which of course makes sense) but also its gradient and Hessian. So for now I am basically "hard-coding" these functions, like so:
def f2(x):
    return 100*(x[0] - 3)**2 + (x[1] - 1)**2

def df2(x):
    return np.array([200*(x[0] - 3), 2*(x[1] - 1)])

def hess_f2(x):
    return np.array([[200, 0], [0, 2]])
So my question is: what is a "Pythonic" way to generate functions that compute the gradient and Hessian of an input function, in a way that fits my implementation above? I am guessing it is fairly simple, but my main experience with Python is writing scripts, so I have not done anything like this before. Thanks!
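For illustration, one simple route that fits the optimize signature is to approximate the derivatives numerically with central finite differences; the helper names num_grad and num_hess below are made up for this sketch:

import numpy as np
from functools import partial

def num_grad(f, x, h=1e-6):
    # central-difference approximation of the gradient of f at x
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def num_hess(f, x, h=1e-4):
    # Hessian approximated by differencing the numerical gradient
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (num_grad(f, x + e) - num_grad(f, x - e)) / (2 * h)
    return 0.5 * (H + H.T)  # symmetrize against round-off

# usage: plug into optimize without hand-coding df2/hess_f2
# optimize(f2, partial(num_grad, f2), partial(num_hess, f2), 1)

For exact derivatives, sympy.diff combined with sympy.lambdify is another common route.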