Shape incompatibility in Keras

Asked: 2019-04-14 03:58:28

Tags: python tensorflow keras

I have the following code. When I call merge([user_latent, item_latent], mode = 'concat'), I get the error Shape (2, ?, ?) and () are incompatible. Searching suggests this is a TensorFlow API issue, but since I am not calling tf.concat() directly, I am not sure what to change.

def get_model(num_users, num_items, layers = [20,10], reg_layers=[0,0]):
    assert len(layers) == len(reg_layers)
    num_layer = len(layers)  # Number of layers in the MLP
    # Input variables
    user_input = Input(shape=(1,), dtype='int32', name = 'user_input')
    item_input = Input(shape=(1,), dtype='int32', name = 'item_input')

    MLP_Embedding_User = Embedding(input_dim = num_users, output_dim = layers[0]/2, name = 'user_embedding',
                                   init = init_normal, W_regularizer = l2(reg_layers[0]), input_length=1)
    MLP_Embedding_Item = Embedding(input_dim = num_items, output_dim = layers[0]/2, name = 'item_embedding',
                                   init = init_normal, W_regularizer = l2(reg_layers[0]), input_length=1)

    # Crucial to flatten an embedding vector!
    user_latent = Flatten()(MLP_Embedding_User(user_input))
    item_latent = Flatten()(MLP_Embedding_Item(item_input))

    # The 0-th layer is the concatenation of embedding layers
    vector = merge([user_latent, item_latent], mode = 'concat')

    # MLP layers
    for idx in xrange(1, num_layer):
        layer = Dense(layers[idx], W_regularizer = l2(reg_layers[idx]), activation='relu', name = 'layer%d' % idx)
        vector = layer(vector)

    # Final prediction layer
    prediction = Dense(1, activation='sigmoid', init='lecun_uniform', name = 'prediction')(vector)

    model = Model(input=[user_input, item_input],
                  output=prediction)

    return model
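While searching, I also found that merge(..., mode='concat') was removed in Keras 2, where concatenation is done with the Concatenate layer instead. A minimal sketch of just the embedding-and-concatenate part with the Keras 2 functional API (assuming tf.keras, with placeholder sizes, and dropping the old init/W_regularizer arguments) would look like this, though I have not verified it against my full model:

```python
# Sketch: concatenate two flattened embeddings with the Keras 2 API.
# Sizes (100 users/items, embedding dim 10) are placeholders.
import numpy as np
from tensorflow.keras.layers import Input, Embedding, Flatten, Concatenate
from tensorflow.keras.models import Model

user_input = Input(shape=(1,), dtype='int32', name='user_input')
item_input = Input(shape=(1,), dtype='int32', name='item_input')

# Note: output_dim must be an int; layers[0]/2 is a float in Python 3,
# so layers[0] // 2 would be needed in the original code.
user_latent = Flatten()(Embedding(input_dim=100, output_dim=10)(user_input))
item_latent = Flatten()(Embedding(input_dim=100, output_dim=10)(item_input))

# Concatenate replaces merge(..., mode='concat')
vector = Concatenate()([user_latent, item_latent])

model = Model(inputs=[user_input, item_input], outputs=vector)
out = model.predict([np.array([[1]]), np.array([[2]])])
print(out.shape)  # (1, 20): two flattened dim-10 embeddings concatenated
```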

0 Answers:

No answers yet