I have some TensorFlow code:
.
.
leaky_relu(tf.matmul(tf.transpose(out2, perm=[0, 2, 1, 3]), tf.transpose(out2, perm=[0, 2, 3, 1])))
.
.
Here, the output of tf.matmul has shape (4, 9, 9). That output is then fed into the leaky_relu layer.
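For reference, here is a minimal shape check of the original computation. The input shape is my assumption: I'm guessing out2 has shape (batch, 9, 4, d), and batch=2, d=5 are arbitrary values picked for illustration:

import numpy as np
import tensorflow as tf

# Hypothetical input: (batch, 9, 4, d) with batch=2, d=5
out2 = tf.constant(np.random.rand(2, 9, 4, 5), dtype=tf.float32)
a = tf.transpose(out2, perm=[0, 2, 1, 3])  # -> (2, 4, 9, 5)
b = tf.transpose(out2, perm=[0, 2, 3, 1])  # -> (2, 4, 5, 9)
# tf.matmul batches over the leading axes, giving (2, 4, 9, 9),
# i.e. (4, 9, 9) per sample as described above
print(tf.matmul(a, b).shape)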
Now my question is: if I rewrite this code in Keras, is the following "translation" correct?
.
.
out2 = Lambda(lambda x: K.dot(K.permute_dimensions(x, (0, 2, 1, 3)), K.permute_dimensions(x, (0, 2, 3, 1))), output_shape=(4,9,9))(out2)
out2 = Flatten()(out2)
out2 = Dense(324, kernel_initializer='glorot_normal', activation='linear')(out2)
out2 = LeakyReLU(alpha=.2)(out2)
out2 = Reshape((4, 9, 9))(out2)
.
.
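One way I thought of to sanity-check the Lambda is to run its body directly against the original TensorFlow computation on random input (same hypothetical (batch, 9, 4, d) shape as above); this sketch only compares output shapes, not values:

import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

x = np.random.rand(2, 9, 4, 5).astype('float32')

# Original TensorFlow computation
tf_out = tf.matmul(tf.transpose(x, perm=[0, 2, 1, 3]),
                   tf.transpose(x, perm=[0, 2, 3, 1]))

# Body of the Lambda from the attempted translation
k_in = K.constant(x)
k_out = K.dot(K.permute_dimensions(k_in, (0, 2, 1, 3)),
              K.permute_dimensions(k_in, (0, 2, 3, 1)))

# If the translation is faithful, these shapes should agree
print(tf_out.shape, K.int_shape(k_out))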