How to resolve the "indices[1,0] = 66 is not in [0, 25)" error when concatenating two embedding layers in Keras?

Asked: 2019-04-17 12:58:44

Tags: tensorflow keras concatenation embedding

I am building a neural network with two categorical features. Each one passes independently through its own embedding layer, but concatenating the two embeddings fails.

import numpy as np
from keras.models import Model
from keras.layers import Input, Embedding, Reshape, Concatenate, Dense

x1 = np.random.randint(24, size = (20,1))
x2 = np.random.randint(100, size = (20,1))
X_list = [x1,x2]
label_array = np.random.randint(2,size = (20,1))

input1 = Input(shape=(1,))
output1 = Embedding(input_dim = 25, output_dim = 10)(input1)
output1 = Reshape(target_shape=(10,))(output1)

input2 = Input(shape=(1,))
output2 = Embedding(input_dim = 101, output_dim = 10)(input2)
output2 = Reshape(target_shape=(10,))(output2)

inputs = [input1, input2]
output_embeddings = [ output1, output2]

output_model = Concatenate()(output_embeddings)
output_model = Dense(500, activation='relu')(output_model)
output_model = Dense(1, activation='sigmoid')(output_model)

model = Model(inputs = inputs, outputs = output_model)
model.compile(optimizer = 'adam', loss='binary_crossentropy')

model.predict(X_list)

When the input_dim values (vocabulary sizes) are equal, model.predict(X_list) returns a vector rather than an error. But when the embedding layers have different input_dim sizes, as in the example above, I get this error:

  InvalidArgumentError: indices[1,0] = 66 is not in [0, 25)
       [[{{node embedding_23/embedding_lookup}} = GatherV2[Taxis=DT_INT32, Tindices=DT_INT32, Tparams=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](embedding_23/embeddings/read, embedding_23/Cast, embedding_23/embedding_lookup/axis)]]

My guess is that the vocabulary sizes need to be the same, but what should I do when the vocabularies are not equal?
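The lookup can be sketched in plain NumPy (the table sizes and index values below are hypothetical stand-ins chosen to mirror the question): an Embedding layer is just a row lookup into an (input_dim, output_dim) matrix, so an index of 66 can only fail when it reaches the 25-row table. That suggests the inputs are being routed to the wrong embedding, not that the two vocabularies need the same size.

import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two embedding tables in the question:
# 25 rows for the first feature, 101 rows for the second.
table1 = rng.normal(size=(25, 10))    # like Embedding(input_dim=25, output_dim=10)
table2 = rng.normal(size=(101, 10))   # like Embedding(input_dim=101, output_dim=10)

x1 = np.array([0, 5, 24, 12])         # valid for table1: all indices < 25
x2 = np.array([66, 3, 99, 0])         # valid for table2: all indices < 101

# An Embedding layer is a row lookup; Concatenate joins along the last axis.
merged = np.concatenate([table1[x1], table2[x2]], axis=-1)
print(merged.shape)                   # (4, 20)

# Routing x2 into the 25-row table reproduces the failure mode:
# index 66 has no row in a table covering [0, 25).
try:
    table1[x2]
except IndexError as e:
    print("lookup failed:", e)

If the vocabularies genuinely differ, nothing about the model needs to change; each Embedding only has to be large enough for its own feature, and the inputs just have to be passed to Model and predict in the same order as the corresponding embedding layers.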

0 Answers:

There are no answers yet.