I am trying to re-implement some research-paper code in tf.keras. In the init block it is written as:
with slim.arg_scope([slim.conv2d, separable_conv], activation_fn=tf.nn.relu6, normalizer_fn=slim.batch_norm):
    with slim.arg_scope([slim.batch_norm], is_training=is_training, activation_fn=None):
        with tf.variable_scope(name):
            net = slim.conv2d(inputs, num_outputs=depth, kernel_size=3, stride=2, scope="conv")  # padding is SAME
I cannot find an equivalent of normalizer_fn=slim.batch_norm among the tf.keras.layers.Conv2D arguments. How do I achieve this in Keras?
I have tried:

model.add(Conv2D("some arguments"))
model.add(BatchNormalization())
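For concreteness, here is the fuller block I am considering. My assumption (from reading the slim source, I may be wrong) is that normalizer_fn=slim.batch_norm makes slim.conv2d drop its own bias and activation, apply batch norm to the raw convolution output, and only then apply activation_fn (ReLU6 here). The function name and layer names below are my own placeholders, not from the paper:

import tensorflow as tf
from tensorflow.keras import layers

def conv_bn_relu6(inputs, depth, is_training, name="conv"):
    # My guess at the Keras equivalent of the slim block:
    # conv (no bias, no activation) -> batch norm -> ReLU6
    x = layers.Conv2D(filters=depth, kernel_size=3, strides=2,
                      padding="same", use_bias=False, name=name)(inputs)
    # slim applies normalizer_fn before activation_fn, so batch norm
    # comes before the ReLU6; `training` stands in for slim's is_training
    x = layers.BatchNormalization(name=name + "_bn")(x, training=is_training)
    x = layers.ReLU(max_value=6.0, name=name + "_relu6")(x)
    return x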
Is either of these equivalent to the tf.contrib.slim code above? The tf.contrib.slim documentation is limited, so I am quite confused.