I'm trying to use scipy.optimize.minimize() for a constrained optimization. If we call the solution of the minimization problem s, the constraints I'm trying to implement set s[0] = 0 and s[-1] = 1. Here's a minimal script to reproduce the error I get:
import numpy as np
from scipy.optimize import minimize

def potential(x, y):
    # Two-dimensional model potential evaluated at the point (x, y).
    x2 = x * x
    x4 = x2 * x2
    y2 = y * y
    y5 = (y - 5./3.) * (y - 5./3.)
    y3 = (y - 1./3.) * (y - 1./3.)
    y34 = y3 * y3
    x_1 = (x - 1.) * (x - 1.)
    x1 = (x + 1.) * (x + 1.)
    pot = (5. * np.exp(-x2 - y2) - 3. * np.exp(-x2 - y5)
           - 5. * np.exp(-x_1 - y2) - 5. * np.exp(-x1 - y2)
           + 0.2 * x4 + 0.2 * y34)
    return pot

def functional(q, x_new, y_new):
    # Discretized functional to be minimized over q.
    I = 0
    beta = 1./6.67
    dl = 0.2
    dl2 = dl**2
    for k in range(1, len(q) - 1):
        ngradU = (potential(x_new[k+1], y_new[k+1]) - potential(x_new[k], y_new[k])) / dl
        I += (beta/dl2 * (q[k+1] + q[k-1] - 2. * q[k]) + beta/2. * (q[k+1] - q[k]) * ngradU)**2
    return I

def delta(i, k):
    # Kronecker delta.
    if i == k:
        return 1.
    else:
        return 0.

def derivative(q, x_new, y_new):
    # Analytical gradient of the functional with respect to each q[j].
    beta = 1./6.67
    dl = 0.2
    dl2 = dl**2
    der = np.zeros_like(q)
    for j in range(len(q)):
        for k in range(1, len(q) - 1):
            ngradU = (potential(x_new[k+1], y_new[k+1]) - potential(x_new[k], y_new[k])) / dl
            deltas = (beta/dl2 * (delta(k+1, j) + delta(k-1, j) - 2. * delta(k, j))
                      - beta/dl * ngradU * (delta(k+1, j) - delta(k, j)))
            der[j] += (beta/dl2 * (q[k+1] + q[k-1] - 2. * q[k]) + beta/2. * (q[k+1] - q[k]) * ngradU) * deltas
    return der

### MAIN
x_new = [-0.75956243, -0.63528643, -0.49338262, -0.35764855, -0.23954228, -0.12408018, 0.02134918, 0.20379693, 0.32454737, 0.43119557, 0.54225072, 0.66211977, 0.78198882]
y_new = [0.55944712, 0.70475988, 0.83125047, 0.96420533, 1.11378898, 1.26572396, 1.38819039, 1.33494757, 1.18824942, 1.03070726, 0.87501396, 0.7275383, 0.57900219]
s = [0.07453653, 0.14188005, 0.21413291, 0.28632847, 0.35839168, 0.43770331, 0.51499522, 0.58833728, 0.66737105, 0.73637078, 0.80085735, 0.86481271, 0.92374577]

# Equality constraints meant to pin the first and last components of the solution.
cons = ({'type': 'eq',
         'fun': lambda x: x[0],
         'jac': lambda x: x[0]},
        {'type': 'eq',
         'fun': lambda x: x[-1],
         'jac': lambda x: x[-1]})

res = minimize(functional, s, method='SLSQP', jac=derivative,
               constraints=cons, options={'disp': True}, args=(x_new, y_new))
The error is the following:
ValueError: all the input array dimensions except for the concatenation axis must match exactly
The lengths of x_new, y_new and s match perfectly, so my guess is that I'm doing something wrong with the constraints. Indeed, when I use method='CG' with the exact same parameters - and without the constraints - the calculation runs flawlessly.

Thanks for the help!
Answer 0 (score: 2):
The error comes from the jac entries in cons:
'jac' : lambda x: x[0]
Your x is an array, and for an array-valued x the constraint jac should return an array (the gradient of the constraint function, with one entry per component of x). You can fix this either by removing the jac entries or by giving them the correct dimensions, for example as in the sketch below.
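A minimal sketch of constraints whose jac returns a full gradient array. It assumes, following the question, that the intended constraints are s[0] = 0 and s[-1] = 1, so the second constraint function here subtracts 1 (which the original code does not do); the helper unit_vector is just an illustrative name:

import numpy as np

def unit_vector(n, i):
    # Standard basis vector e_i of length n.
    e = np.zeros(n)
    e[i] = 1.0
    return e

cons = ({'type': 'eq',
         'fun': lambda x: x[0],
         # gradient of x[0] with respect to x: (1, 0, ..., 0)
         'jac': lambda x: unit_vector(len(x), 0)},
        {'type': 'eq',
         'fun': lambda x: x[-1] - 1.0,  # assumes the intended constraint s[-1] = 1
         # gradient of x[-1] - 1 with respect to x: (0, ..., 0, 1)
         'jac': lambda x: unit_vector(len(x), -1)})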
The 'CG' method cannot handle constraints, so your cons is simply ignored; that is why you did not run into this error with 'CG'.
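For the other fix, here is a sketch that omits the constraint Jacobians entirely and lets SLSQP approximate them numerically. It reuses functional, derivative, s, x_new and y_new from the question's script and, as above, assumes the intended second constraint is s[-1] = 1:

# Constraints without explicit Jacobians; SLSQP falls back to finite differences.
cons_no_jac = ({'type': 'eq', 'fun': lambda x: x[0]},
               {'type': 'eq', 'fun': lambda x: x[-1] - 1.0})

res = minimize(functional, s, method='SLSQP', jac=derivative,
               constraints=cons_no_jac, options={'disp': True},
               args=(x_new, y_new))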