I'm trying out a simple test with Keras. Here is the code:
from keras.layers import Input, Reshape, RepeatVector, Permute, Concatenate
from keras.models import Model
import numpy as np

seq_length, latent_dim = 3, 2
inputs = Input(shape=(seq_length, latent_dim))

# Flatten each sample, repeat it seq_length times, then restore the (seq_length, latent_dim) blocks
reshaped_d_inputs = Reshape((inputs.get_shape()[1] * inputs.get_shape()[2], ))(inputs)
print(reshaped_d_inputs.get_shape())  # (?, 6)
repeat_d_repeat = RepeatVector(seq_length)(reshaped_d_inputs)
repeat_d = Reshape((seq_length, inputs.get_shape()[1], inputs.get_shape()[2]))(repeat_d_repeat)
print(repeat_d.get_shape())  # (?, 3, 3, 2)

# Same idea, but starting from the transposed input
permuted_e = Permute((2, 1))(inputs)
reshaped_e_inputs = Reshape((inputs.get_shape()[1] * inputs.get_shape()[2], ))(permuted_e)
permuted_e_repeat = RepeatVector(seq_length)(reshaped_e_inputs)
repeat_e = Reshape((seq_length, inputs.get_shape()[2], inputs.get_shape()[1]))(permuted_e_repeat)
repeat_e = Permute((1, 3, 2))(repeat_e)
print(repeat_e.get_shape())  # (?, 3, 3, 2)

outputs = Concatenate(-1)([repeat_d, repeat_e])
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='rmsprop', loss='mse')
input_array = np.random.random((5, seq_length, latent_dim))  # dummy batch standing in for my real data
output_array = model.predict(input_array)
But the Concatenate from keras.layers raises an error:
TypeError                                 Traceback (most recent call last)
     14 print(repeat_e.get_shape())
     15
---> 16 outputs = Concatenate(-1)([repeat_d, repeat_e])
     17 model = Model(inputs=inputs, outputs=outputs)
     18 model.compile(optimizer='rmsprop', loss='mse')

d:\igs_projects\nlp_nlu\venv\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    429                             'You can build it manually via: '
    430                             '`layer.build(batch_input_shape)`')
--> 431             self.build(unpack_singleton(input_shapes))
    432             self.built = True
    433

d:\igs_projects\nlp_nlu\venv\lib\site-packages\keras\layers\merge.py in build(self, input_shape)
    355         for i in range(len(reduced_inputs_shapes)):
    356             del reduced_inputs_shapes[i][self.axis]
--> 357             shape_set.add(tuple(reduced_inputs_shapes[i]))
    358         if len(shape_set) > 1:
    359             raise ValueError('A `Concatenate` layer requires '

TypeError: unhashable type: 'Dimension'
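The TypeError seems to come from the shapes themselves: on the TensorFlow 1.x backend (which I assume is in use here, given the Dimension in the message), get_shape() returns Dimension objects rather than plain ints, and those end up inside the shape tuples that Concatenate tries to hash. A quick check:

print(type(inputs.get_shape()[1]))                          # Dimension, not int
print(type(inputs.get_shape()[1] * inputs.get_shape()[2]))  # the product is also a Dimension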
Any idea how to merge these two tensor layers?
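For reference, a minimal sketch that appears to avoid the error by building the Reshape targets from plain Python ints via K.int_shape (assuming Keras 2.x on the TensorFlow backend; the variable names t and d below are just illustrative). Is this the intended way to do it?

from keras.layers import Input, Reshape, RepeatVector, Permute, Concatenate
from keras.models import Model
from keras import backend as K
import numpy as np

seq_length, latent_dim = 3, 2
inputs = Input(shape=(seq_length, latent_dim))
_, t, d = K.int_shape(inputs)                      # plain ints: t=3, d=2

flat_d = Reshape((t * d,))(inputs)                                        # (?, 6)
repeat_d = Reshape((seq_length, t, d))(RepeatVector(seq_length)(flat_d))  # (?, 3, 3, 2)

flat_e = Reshape((t * d,))(Permute((2, 1))(inputs))                       # transpose, then flatten
repeat_e = Reshape((seq_length, d, t))(RepeatVector(seq_length)(flat_e))
repeat_e = Permute((1, 3, 2))(repeat_e)                                   # (?, 3, 3, 2)

outputs = Concatenate(axis=-1)([repeat_d, repeat_e])                      # (?, 3, 3, 4)
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='rmsprop', loss='mse')
print(model.predict(np.random.random((5, seq_length, latent_dim))).shape) # (5, 3, 3, 4)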