I am trying to instantiate two objects of the ESN class in a notebook, like so:
esn_1 = ESN(ESN_arch, activation, leak_rate, weights_variance, sparsity, sparseness)
esn_2 = ESN(ESN_arch, activation, leak_rate, weights_variance, sparsity, sparseness)
When I run the session, the following error pops up at the line sess.run(tf.global_variables_initializer()):
FailedPreconditionError: Attempting to use uninitialized value initializers/ReservoirWeights
[[node initializers/ReservoirWeights/read (defined at /home/tah/Documents/Computation_EOC/esn-neuroevolution/ESN_Cell.py:38) ]]
I think the error is essentially due to my use of a variable scope inside the class, but I can't figure out what exactly is wrong. I have already checked this answer: https://stackoverflow.com/a/36016117.
Also, I worry that removing the variable_scope would bring back an older problem: the original reason I used variable_scope with tf.AUTO_REUSE (which worked at first) was to make the weight-matrix variables of every instantiated ESN object have the same values. Note my use of tf.set_random_seed(1234) before opening the variable_scope.
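To be explicit about the behavior I want from the two instances: both should end up with identical weight matrices. A TF-free sketch of that goal, using a per-instance seeded RNG (the helper name and shapes here are made up, not part of my ESN class):

```python
import random

def make_weights(seed, in_units, res_units, std):
    # One RNG per instance, seeded identically, yields identical
    # weight matrices without any variable sharing. (Hypothetical
    # helper, only to illustrate the behavior I'm after.)
    rng = random.Random(seed)
    weights_in = [[rng.gauss(0.0, std) for _ in range(res_units)]
                  for _ in range(in_units)]
    weights_res = [[rng.gauss(0.0, std) for _ in range(res_units)]
                   for _ in range(res_units)]
    return weights_in, weights_res

w1 = make_weights(1234, 2, 4, 0.1)
w2 = make_weights(1234, 2, 4, 0.1)
print(w1 == w2)  # prints True: both "instances" get identical weights
```

This is what variable_scope with tf.AUTO_REUSE plus tf.set_random_seed(1234) was giving me in the TF graph.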
ESN_Cell.py:
class ESN(rnn_cell_impl.RNNCell):

    def __init__(...):
        self.in_units = ESN_arch[0]
        self.res_units = ESN_arch[1]
        self.activation = activation
        self.alpha = tf.cast(leak_rate, dtype=tf.float64)
        self.weights_std = tf.cast(weights_std, dtype=tf.float64)
        self.sparsity = tf.cast(sparsity, dtype=tf.float64)
        self.sparseness = sparseness

        tf.set_random_seed(1234)

        with tf.variable_scope('initializers', reuse=tf.AUTO_REUSE):
            self.weights_in = tf.get_variable("InputWeights",
                                              initializer=self.init_weights_in(self.weights_std),
                                              trainable=False, dtype=tf.float64)
            # 'weights_in' is: [in_units x res_units]

            self.weights_res = self.normalize_weights_res(
                tf.get_variable("ReservoirWeights",
                                initializer=self.init_weights_res(self.weights_std),
                                trainable=False, dtype=tf.float64))
            # 'weights_res' is: [res_units x res_units]

            self.bias = tf.get_variable("Bias",
                                        initializer=self.init_bias(self.weights_std),
                                        trainable=False, dtype=tf.float64)
            # 'bias' is: [1, res_units]

            self.spectral_radius = tf.get_variable("SpectralRadius",
                                                   initializer=self.get_spectral_radius(self.weights_res),
                                                   trainable=False, dtype=tf.float64)

            if self.sparseness:
                self.sparse_mask = tf.get_variable("SparseMatrix",
                                                   initializer=self.init_sparse_matrix(self.weights_res),
                                                   trainable=False, dtype=tf.float64)
                self.weights_res = tf.multiply(self.weights_res, self.sparse_mask)
.
.
.
My session is as follows:
with tf.Session() as sess:
    # res_units is an int with value 100.
    for p_neuron in range(res_units):
        sess.run(tf.global_variables_initializer())
        init_esn_state = np.zeros([1, res_units], dtype="float64")
        print(type(p_neuron))
        dist, initial, init_esn_2 = sess.run([dist_esn_1_2, initial, init_esn_2],
                                             feed_dict={leak_rate: alpha,
                                                        inputs: esn_input,
                                                        init_state: init_esn_state,
                                                        pert_neuron: p_neuron})