In relation to my previous question, Scaled paraboloid and derivatives checking, I see that you fixed the issue related to running the problem only once. I wanted to try it, but I still have a problem with the derivative checking and the finite differences shown in the following code:
""" Unconstrained optimization of the scaled paraboloid component."""
from __future__ import print_function
import sys
import numpy as np
from openmdao.api import IndepVarComp, Component, Problem, Group, ScipyOptimizer
class Paraboloid(Component):
def __init__(self):
super(Paraboloid, self).__init__()
self.add_param('X', val=np.array([0.0, 0.0]))
self.add_output('f_xy', val=0.0)
def solve_nonlinear(self, params, unknowns, resids):
X = params['X']
x = X[0]
y = X[1]
unknowns['f_xy'] = (1000.*x-3.)**2 + (1000.*x)*(0.01*y) + (0.01*y+4.)**2 - 3.
def linearize(self, params, unknowns, resids):
""" Jacobian for our paraboloid."""
X = params['X']
J = {}
x = X[0]
y = X[1]
J['f_xy', 'X'] = np.array([[ 2000000.0*x - 6000.0 + 10.0*y,
0.0002*y + 0.08 + 10.0*x]])
return J
if __name__ == "__main__":
top = Problem()
root = top.root = Group()
#root.fd_options['force_fd'] = True # Error if uncommented
root.add('p1', IndepVarComp('X', np.array([3.0, -4.0])))
root.add('p', Paraboloid())
root.connect('p1.X', 'p.X')
top.driver = ScipyOptimizer()
top.driver.options['optimizer'] = 'SLSQP'
top.driver.add_desvar('p1.X',
lower=np.array([-1000.0, -1000.0]),
upper=np.array([1000.0, 1000.0]),
scaler=np.array([1000., 0.001]))
top.driver.add_objective('p.f_xy')
top.setup()
top.check_partial_derivatives()
top.run()
top.check_partial_derivatives()
print('\n')
print('Minimum of %f found at (%s)' % (top['p.f_xy'], top['p.X']))
The first check works fine, but the second check_partial_derivatives gives weird results for the FD:
[...]
Partial Derivatives Check

----------------
Component: 'p'
----------------
p: 'f_xy' wrt 'X'

Forward Magnitude : 1.771706e-04
Reverse Magnitude : 1.771706e-04
Fd Magnitude      : 9.998228e-01

Absolute Error (Jfor - Jfd) : 1.000000e+00
Absolute Error (Jrev - Jfd) : 1.000000e+00
Absolute Error (Jfor - Jrev): 0.000000e+00

Relative Error (Jfor - Jfd) : 1.000177e+00
Relative Error (Jrev - Jfd) : 1.000177e+00
Relative Error (Jfor - Jrev): 0.000000e+00

Raw Forward Derivative (Jfor)
[[ -1.77170624e-04  -8.89040341e-10]]

Raw Reverse Derivative (Jrev)
[[ -1.77170624e-04  -8.89040341e-10]]

Raw FD Derivative (Jfd)
[[ 0.99982282  0.        ]]

Minimum of -27.333333 found at ([  6.66666658e-03  -7.33333333e+02])
And (maybe not related), when I try to set root.fd_options['force_fd'] = True (just to see), I get an error during the first check:
Partial Derivatives Check

----------------
Component: 'p'
----------------
Traceback (most recent call last):
  File "C:\Program Files (x86)\Wing IDE 101 5.0\src\debug\tserver\_sandbox.py", line 59, in <module>
  File "d:\rlafage\OpenMDAO\OpenMDAO\openmdao\core\problem.py", line 1827, in check_partial_derivatives
    u_size = np.size(dunknowns[u_name])
  File "d:\rlafage\OpenMDAO\OpenMDAO\openmdao\core\vec_wrapper.py", line 398, in __getitem__
    return self._dat[name].get()
  File "d:\rlafage\OpenMDAO\OpenMDAO\openmdao\core\vec_wrapper.py", line 223, in _get_scalar
    return self.val[0]
IndexError: index 0 is out of bounds for axis 0 with size 0
I work with OpenMDAO HEAD (d1e12d4).
Answer 0 (score: 2)
This is just a step-size problem for that finite difference. The second FD check happens at a different point (the optimum), and the function is much more sensitive to the FD step there.
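To make that concrete, here is a quick stand-alone check of the two FD forms at the optimum, outside of OpenMDAO. The 1e-6 step size and the rounded optimum values are my assumptions taken from the printed output, so treat this as a sketch rather than a reproduction of what the framework does internally:

# Stand-alone illustration of the forward-difference truncation error at the
# optimum (assumed step size 1e-6, optimum values copied from the output above).
def f(x, y):
    # Same scaled paraboloid as in the component above.
    return (1000.*x - 3.)**2 + (1000.*x)*(0.01*y) + (0.01*y + 4.)**2 - 3.

x_opt, y_opt = 6.66666658e-03, -7.33333333e+02   # point of the second check
h = 1.0e-6                                       # assumed FD step size

fwd = (f(x_opt + h, y_opt) - f(x_opt, y_opt)) / h            # forward difference
ctr = (f(x_opt + h, y_opt) - f(x_opt - h, y_opt)) / (2.*h)   # central difference
ana = 2000000.0*x_opt - 6000.0 + 10.0*y_opt                  # analytic df/dx

# Forward-difference truncation error is about (h/2)*d2f/dx2 = 0.5*1e-6*2e6 = 1.0,
# which swamps the true derivative (~ -1.7e-4) at this point.
print(fwd)   # ~ 1.0,    matching the "weird" Jfd value
print(ctr)   # ~ -1.7e-4, same order as the analytic -1.77e-04 reported above
print(ana)   # ~ -1.7e-4

Since the objective is quadratic, the central difference has no truncation error at all (the third derivative is zero), so changing the form removes the problem entirely, whereas shrinking the step would only trade truncation error for round-off noise.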
I tried it with central differencing:

top.root.p.fd_options['form'] = 'central'

and got much better results:
----------------
Component: 'p'
----------------
p: 'f_xy' wrt 'X'
Forward Magnitude : 1.771706e-04
Reverse Magnitude : 1.771706e-04
Fd Magnitude : 1.771738e-04
The exception when you set 'force_fd' is a real bug, related to the scaler on the desvar being an array. Thanks for the report on that; we'll get a story up to fix it.