I am very new to deep learning and neural networks.
I have a dataset with both text and numeric features, and I am trying to solve this problem using the approach given here.
I split the dataset into two datasets, one with the text features (X_text) and one with the numeric features (X_num). I merged all the columns of X_text into a single column and dropped the others. I then ran TfidfVectorizer on that column and converted it into an array of shape (1905, 20859). X_num has shape (1905, 34).
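Roughly, the preprocessing looks like the sketch below (df, text_cols and num_cols are just placeholders, not my real column names):
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

# Split the dataframe into text and numeric features
# (text_cols / num_cols are placeholder column lists).
X_text = df[text_cols].copy()
X_num = df[num_cols].values                                # shape (1905, 34)

# Merge all text columns into a single column and drop the rest.
X_text['combined'] = X_text.apply(lambda row: ' '.join(row.astype(str)), axis=1)

# Run TF-IDF on the combined column and convert it to a dense array.
vectorizer = TfidfVectorizer()
X_text_tfidf = vectorizer.fit_transform(X_text['combined']).toarray()   # shape (1905, 20859)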
This is the model code I used next:
from keras.models import Sequential
from keras.layers import Dense, Embedding, Flatten, LSTM, Input, Bidirectional, Concatenate
from keras.optimizers import adam
from keras import regularizers
from keras.backend import concatenate
from keras import Model
nlp_input = Input(shape=(20860,))
meta_input = Input(shape=(35,))
emb = Embedding(output_dim=32, input_dim=20859)(nlp_input)
nlp_output = Bidirectional(LSTM(128, dropout=0.3, recurrent_dropout=0.3, kernel_regularizer=regularizers.l2(0.01)))(emb)
x = concatenate([nlp_output, meta_input])
layer1 = Dense(32, activation='relu')(x)
layer2 = Dense(1, activation='sigmoid')(layer1)
model = Model(inputs=[nlp_input , meta_input], outputs=layer2)
optimizer=adam(lr=0.00001)
model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics = ['binary_accuracy'])
The error I get is:
AttributeError                            Traceback (most recent call last)
<ipython-input-51-d98028f8916d> in <module>
13 layer1 = Dense(32, activation='relu')(x)
14 layer2 = Dense(1, activation='sigmoid')(layer1)
---> 15 model = Model(inputs=[nlp_input , meta_input], outputs=layer2)
/anaconda3/lib/python3.6/site-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
89 warnings.warn('Update your `' + object_name + '` call to the ' +
90 'Keras 2 API: ' + signature, stacklevel=2)
---> 91 return func(*args, **kwargs)
92 wrapper._original_function = func
93 return wrapper
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in __init__(self, *args, **kwargs)
91 'inputs' in kwargs and 'outputs' in kwargs):
92 # Graph network
---> 93 self._init_graph_network(*args, **kwargs)
94 else:
95 # Subclassed network
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in _init_graph_network(self, inputs, outputs, name)
229 # Keep track of the network's nodes and layers.
230 nodes, nodes_by_depth, layers, layers_by_depth = _map_graph_network(
--> 231 self.inputs, self.outputs)
232 self._network_nodes = nodes
233 self._nodes_by_depth = nodes_by_depth
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in _map_graph_network(inputs, outputs)
1364 layer=layer,
1365 node_index=node_index,
-> 1366 tensor_index=tensor_index)
1367
1368 for node in reversed(nodes_in_decreasing_depth):
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in build_map(tensor, finished_nodes, nodes_in_progress, layer, node_index, tensor_index)
1351 tensor_index = node.tensor_indices[i]
1352 build_map(x, finished_nodes, nodes_in_progress, layer,
-> 1353 node_index, tensor_index)
1354
1355 finished_nodes.add(node)
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in build_map(tensor, finished_nodes, nodes_in_progress, layer, node_index, tensor_index)
1351 tensor_index = node.tensor_indices[i]
1352 build_map(x, finished_nodes, nodes_in_progress, layer,
-> 1353 node_index, tensor_index)
1354
1355 finished_nodes.add(node)
/anaconda3/lib/python3.6/site-packages/keras/engine/network.py in build_map(tensor, finished_nodes, nodes_in_progress, layer, node_index, tensor_index)
1323 ValueError: if a cycle is detected.
1324 """
-> 1325 node = layer._inbound_nodes[node_index]
1326
1327 # Prevent cycles.
AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
I read elsewhere that you can use a Lambda layer, which lets you use a function as a layer in Keras, and that this might be the root of the problem. But as far as I can tell, I don't have a function to call. Any idea how to fix this?
Answer 0 (Score: 0)
You are using keras.backend.concatenate to concatenate nlp_output with meta_input; you should be using keras.layers.Concatenate instead. The following code should work:
nlp_input = Input(shape=(20860,))
meta_input = Input(shape=(35,))
emb = Embedding(output_dim=32, input_dim=20859)(nlp_input)
nlp_output = Bidirectional(LSTM(128, dropout=0.3, recurrent_dropout=0.3, kernel_regularizer=regularizers.l2(0.01)))(emb)
x = Concatenate()([nlp_output, meta_input])
layer1 = Dense(32, activation='relu')(x)
layer2 = Dense(1, activation='sigmoid')(layer1)
model = Model(inputs=[nlp_input , meta_input], outputs=layer2)
optimizer=adam(lr=0.00001)
model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics = ['binary_accuracy'])
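Once the model compiles, note that a model built with two Input layers expects its training data as a list of two arrays, in the same order as the inputs. A minimal sketch (variable names follow the question, the fit arguments are just examples):
# X_text_tfidf: TF-IDF array, X_num: numeric features, y: labels (placeholder names).
# Each Input's shape must match the feature dimension of the corresponding array.
model.fit([X_text_tfidf, X_num], y,
          epochs=10,
          batch_size=32,
          validation_split=0.2)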
Note: functions from keras.backend can be wrapped in a Lambda layer, but there is little point in doing that when a keras.layers layer already provides what you need. In your case, if you did want to use keras.backend.concatenate inside a Lambda layer, you could do the following:
concatenated = keras.layers.Lambda(lambda x: keras.backend.concatenate(x))([input1, input2])