Training multiple models: ValueError: Variable hidden1/kernel already exists, disallowed

Time: 2019-02-04 18:52:34

Tags: python tensorflow

I am running a hyperparameter search:

from itertools import product

for layer1_filters, layer1_kernels \
    in product(layer1_filters_list, layer1_kernels_list):
  cm = CifarModel()
  cm.build_train_validate_model(layer1_filters, layer1_kernels)

where build_train_validate_model defines the layers:

self.hidden1 = tf.layers.conv2d(self.input_layer, layer1_filters,
        layer1_kernels, activation=activation, name='hidden1')

On the second iteration of the hyperparameter search loop, when the second candidate model is defined, I get this error:

ValueError: Variable hidden1/kernel already exists, disallowed.
Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?

This confuses me, because each iteration of the hyperparameter search loop creates its own model object in the line cm = CifarModel(), so there should be no name collision between the calls to self.hidden1 = tf.layers.conv2d made in different iterations of the loop. Hence the question: why does TF claim a name conflict between members of different objects of the same class?
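
For illustration, here is a stripped-down sketch (the TinyModel class is hypothetical, not the original CifarModel) that should reproduce the same error in TF 1.x:

import tensorflow as tf

# Hypothetical minimal model: one conv layer with a fixed name,
# built by a method on a freshly constructed object each time.
class TinyModel:
  def build(self, filters, kernel_size):
    inputs = tf.placeholder(tf.float32, [None, 32, 32, 3])
    self.hidden1 = tf.layers.conv2d(inputs, filters, kernel_size,
            activation=tf.nn.relu, name='hidden1')

for filters in (16, 32):
  m = TinyModel()      # a fresh Python object on every iteration ...
  m.build(filters, 3)  # ... yet the second call raises the same ValueError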

Moreover, even after I added

tf.AUTO_REUSE = True

before the hyperparameter search loop, the problem persists.
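
If I understand the TF 1.x API correctly, tf.AUTO_REUSE is a sentinel constant meant to be passed as the reuse argument of tf.variable_scope rather than assigned to, so the assignment above is presumably a no-op. The intended usage would look roughly like the sketch below (reusing the names from my snippet); although as far as I can tell, AUTO_REUSE would then make the second candidate share the first candidate's weights whenever the shapes match, which is not what I want for a hyperparameter search either.

# Rough sketch of how tf.AUTO_REUSE is normally passed to a variable scope
with tf.variable_scope('cifar_model', reuse=tf.AUTO_REUSE):
  self.hidden1 = tf.layers.conv2d(self.input_layer, layer1_filters,
          layer1_kernels, activation=activation, name='hidden1')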


Here is the graph-definition code:

# Build the model
self.flat_features, self.flat_label = self.iterator.get_next()
self.input_layer = self.flat_features
self.hidden1 = tf.layers.conv2d(self.input_layer, layer1_filters,
        layer1_kernels, activation=activation)
self.normal1 = tf.layers.batch_normalization(self.hidden1)
self.hidden2 = tf.layers.conv2d(self.normal1, layer2_filters,
        layer2_kernels, activation=activation)
self.normal2 = tf.layers.batch_normalization(self.hidden2)
self.maxpool1 = tf.layers.max_pooling2d(self.normal2,
        (maxpool1_stride,maxpool1_stride), (maxpool1_stride,maxpool1_stride))
self.hidden3 = tf.layers.conv2d(self.maxpool1, layer3_filters,
        layer3_kernels, activation=activation)
self.maxpool2 = tf.layers.max_pooling2d(self.hidden3,
        (self.drop3.shape[1],self.drop3.shape[1]), (self.drop3.shape[1],self.drop3.shape[1]) )
self.flat = tf.layers.flatten(self.maxpool2)
self.logits = tf.layers.dense(self.flat, len(self.label_names))
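
For reference, here is a sketch of a possible workaround (a guess on my part, not something the code above does): building each candidate inside its own tf.Graph(), so that variable names cannot collide across iterations. Calling tf.reset_default_graph() at the top of each iteration should have a similar effect.

# Hypothetical workaround sketch: isolate every candidate model in its own graph
for layer1_filters, layer1_kernels \
    in product(layer1_filters_list, layer1_kernels_list):
  graph = tf.Graph()
  with graph.as_default():
    cm = CifarModel()
    cm.build_train_validate_model(layer1_filters, layer1_kernels)
    # any tf.Session used for training/validation would also need to be
    # created against this graph, e.g. tf.Session(graph=graph)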

0 Answers:

No answers