Using rllab

Time: 2018-12-27 05:40:02

Tags: python theano

When I use rllab, I sometimes get a memory error. It happens at roughly iteration 300 out of 500, and it seems to depend on the game: Solaris and Venture trigger it, for example, while some games such as Freeway rarely show the problem. The full traceback is below.
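
To narrow down where the memory goes, one option is to log the worker process's resident memory once per iteration. A minimal sketch, assuming the third-party psutil package is installed; log_memory is a hypothetical helper and the loop shown in the comment is only illustrative:

import os
import psutil  # third-party: pip install psutil

def log_memory(tag=""):
    # Resident set size (RSS) of the current process, printed in MiB.
    rss = psutil.Process(os.getpid()).memory_info().rss
    print("[mem] %s %.1f MiB" % (tag, rss / (1024.0 * 1024.0)))

# Illustrative placement, once per training iteration:
# for itr in range(n_itr):
#     ...
#     log_memory("itr %d" % itr)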

Traceback (most recent call last):
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/site-packages/theano/compile/function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
MemoryError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/max/rllab/sandbox/haoran/parallel_trpo/batch_polopt_parallel.py", line 374, in _train
    self.baseline.fit(paths)
  File "/home/max/rllab/sandbox/adam/parallel/gaussian_conv_baseline.py", line 79, in fit
    self._regressor.fit(observations, returns.reshape((-1, 1)))
  File "/home/max/rllab/sandbox/adam/parallel/gaussian_conv_regressor.py", line 167, in fit
    self._optimizer.optimize(inputs)
  File "/home/max/rllab/sandbox/haoran/parallel_trpo/conjugate_gradient_optimizer.py", line 407, in optimize
    self._optimize(inputs, extra_inputs, subsample_inputs)
  File "/home/max/rllab/sandbox/haoran/parallel_trpo/conjugate_gradient_optimizer.py", line 486, in _optimize
    self._flat_g(inputs, extra_inputs)  # (parallel, writes shareds.flat_g)
  File "/home/max/rllab/sandbox/haoran/parallel_trpo/conjugate_gradient_optimizer.py", line 358, in _flat_g
    sliced_fun(self._opt_fun["f_grad"], self._num_slices)(inputs, extra_inputs)
  File "/home/max/rllab/rllab/misc/ext.py", line 352, in sliced_f
    slice_ret_vals = f(*(inputs_slice + non_sliced_inputs))
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/site-packages/theano/compile/function_module.py", line 917, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/site-packages/theano/gof/link.py", line 325, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/site-packages/six.py", line 685, in reraise
    raise value.with_traceback(tb)
  File "/home/max/anaconda3/envs/rllab3/lib/python3.5/site-packages/theano/compile/function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
MemoryError:
Apply node that caused the error: Elemwise{Composite{(i0 * (i1 + Abs(i1)))}}(TensorConstant{(1, 1, 1, 1) of 0.5}, Elemwise{Add}[(0, 0)].0)
Toposort index: 43
Inputs types: [TensorType(float64, (True, True, True, True)), TensorType(float64, 4D)]
Inputs shapes: [(1, 1, 1, 1), (8750, 16, 9, 9)]
Inputs strides: [(8, 8, 8, 8), (10368, 648, 72, 8)]
Inputs values: [array([[[[ 0.5]]]]), 'not shown']
Outputs clients: [[CorrMM{valid, (2, 2), (1, 1), 1 False}(Elemwise{Composite{(i0 * (i1 + Abs(i1)))}}.0, Subtensor{::, ::, ::int64, ::int64}.0), CorrMM_gradWeights{valid, (2, 2), (1, 1), 1 False}(Elemwise{Composite{(i0 * (i1 + Abs(i1)))}}.0, Elemwise{Composite{((i0 * i1) + (i0 * i1 * sgn(i2)))}}[(0, 1)].0, Subtensor{int64}.0, Subtensor{int64}.0)]]
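
For scale, the second input alone, a float64 tensor of shape (8750, 16, 9, 9), is already about 86 MiB, and the Elemwise output has the same shape, so several buffers of this size are alive at once before the CorrMM clients run. A quick back-of-the-envelope check:

import numpy as np

# float64 tensor reported in "Inputs shapes" above
shape = (8750, 16, 9, 9)
n_elements = int(np.prod(shape))            # 11,340,000 elements
size_mib = n_elements * 8 / 1024.0 ** 2     # 8 bytes per float64
print("%.1f MiB for shape %s" % (size_mib, shape))  # ~86.5 MiB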

HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
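
Both hints refer to Theano configuration flags. They can be set through the THEANO_FLAGS environment variable before Theano is imported anywhere in the process; a minimal sketch:

import os

# Must run before the first import of theano in the process.
os.environ["THEANO_FLAGS"] = "optimizer=fast_compile,exception_verbosity=high"

import theano  # imported after the flags on purpose
print(theano.config.optimizer)            # fast_compile
print(theano.config.exception_verbosity)  # high

To disable optimizations entirely, as the first hint suggests, use optimizer=None instead of optimizer=fast_compile.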

0 Answers:

No answers