AttributeError: 'NoneType' object has no attribute '_inbound_nodes' when trying to add multiple Keras Dense layers

Asked: 2018-09-21 17:38:29

Tags: python tensorflow keras

The input is 3 separate channels of 1000 features each. I am trying to pass each channel through an independent NN path and then merge them into a flat layer, and then apply an FCN on the flattened layer for binary classification. I am trying to add the multiple Dense paths together like this:

import keras
from keras.layers import Input, Dense, Dropout
from keras.models import Model
from keras.optimizers import Adam

def tst_1():

    inputs = Input((3, 1000, 1))

    dense10 = Dense(224, activation='relu')(inputs[0,:,1])
    dense11 = Dense(112, activation='relu')(dense10)
    dense12 = Dense(56, activation='relu')(dense11)

    dense20 = Dense(224, activation='relu')(inputs[1,:,1])
    dense21 = Dense(112, activation='relu')(dense20)
    dense22 = Dense(56, activation='relu')(dense21)

    dense30 = Dense(224, activation='relu')(inputs[2,:,1])
    dense31 = Dense(112, activation='relu')(dense30)
    dense32 = Dense(56, activation='relu')(dense31)

    flat = keras.layers.Add()([dense12, dense22, dense32])

    dense1 = Dense(224, activation='relu')(flat)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    model.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

    return model

model = tst_1()

model.summary()

But I get this error:

    /usr/local/lib/python2.7/dist-packages/keras/engine/network.pyc in build_map(tensor, finished_nodes, nodes_in_progress, layer, node_index, tensor_index)
       1310         ValueError: if a cycle is detected.
       1311     """
    -> 1312     node = layer._inbound_nodes[node_index]
       1313
       1314     # Prevent cycles.

    AttributeError: 'NoneType' object has no attribute '_inbound_nodes'

2 Answers:

Answer 0 (score: 3)

The problem is that slicing the input data with inputs[0,:,1] is not done by a Keras layer, so it is not part of the model's layer graph.

You need to create a Lambda layer to do the slicing.
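For reference, the key change in isolation looks like this (a minimal sketch; the x[:, 0, :, 0] slice is one plausible reading of the intent, since the original inputs[0, :, 1] actually indexes the batch axis):

    from keras.layers import Input, Lambda, Dense

    inp = Input(shape=(3, 1000, 1))                    # (batch, channel, feature, 1)

    # Plain tensor indexing returns a raw backend tensor that Keras cannot
    # trace back through the layer graph; wrapping the slice in a Lambda
    # registers it as a proper layer instead:
    channel_0 = Lambda(lambda x: x[:, 0, :, 0])(inp)   # -> (batch, 1000)
    dense_0 = Dense(224, activation='relu')(channel_0)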

The following code builds the network with Lambda slicing layers:

from keras import layers
from keras.layers import Input, Add, Dense,Dropout, Lambda, Concatenate
from keras.layers import Flatten
from keras.optimizers import Adam
from keras.models import Model
import keras.backend as K


def tst_1(): 

    num_channels = 3
    inputs = Input(shape=(num_channels, 1000, 1))

    branch_outputs = []
    for i in range(num_channels):
        # Slicing the ith channel:
        out = Lambda(lambda x: x[:, i, :, :], name = "Lambda_" + str(i))(inputs)

        # Setting up your per-channel layers (replace with actual sub-models):
        out = Dense(224, activation='relu', name = "Dense_224_" + str(i))(out)
        out = Dense(112, activation='relu', name = "Dense_112_" + str(i))(out)
        out = Dense(56, activation='relu', name = "Dense_56_" + str(i))(out)
        branch_outputs.append(out)

    # Concatenating together the per-channel results:
    out = Concatenate()(branch_outputs)


    dense1 = Dense(224, activation='relu')(out)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    return model

Net = tst_1()
Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

Net.summary()

This correctly creates the required network.
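As a quick sanity check, you can push a synthetic batch through it (the random array below is only placeholder data; note that in Keras 2.x a Dense layer applied to a 3-D tensor maps over the last axis, so the 1000-step axis is carried through to the output here):

    import numpy as np

    # Placeholder batch: 2 samples, 3 channels, 1000 features, trailing singleton axis.
    dummy = np.random.rand(2, 3, 1000, 1).astype('float32')

    preds = Net.predict(dummy)
    print(preds.shape)   # (2, 1000, 1) -- Dense acts on the last axis only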

Answer 1 (score: 0)

Thanks @CAta.RAy.

I solved it this way:

import numpy as np
from keras import layers
from keras.layers import Input, Add, Dense,Dropout, Lambda
from keras.layers import Flatten
from keras.optimizers import Adam
from keras.models import Model
import keras.backend as K




def tst_1(): 
    inputs = Input((3, 1000))

    # Slice each channel out with a Lambda layer so Keras can track it (shape: (batch, 1000)).
    x1 = Lambda(lambda x:x[:,0])(inputs)
    dense10 = Dense(224, activation='relu')(x1)
    dense11 = Dense(112, activation='relu')(dense10)
    dense12 = Dense(56, activation='relu')(dense11)

    x2 = Lambda(lambda x:x[:,1])(inputs)
    dense20 = Dense(224, activation='relu')(x2)
    dense21 = Dense(112, activation='relu')(dense20)
    dense22 = Dense(56, activation='relu')(dense21)

    x3 = Lambda(lambda x:x[:,2])(inputs)
    dense30 = Dense(224, activation='relu')(x3)
    dense31 = Dense(112, activation='relu')(dense30)
    dense32 = Dense(56, activation='relu')(dense31)

    flat = Add()([dense12, dense22, dense32])

    dense1 = Dense(224, activation='relu')(flat)
    drop1 = Dropout(0.5)(dense1)
    dense2 = Dense(112, activation='relu')(drop1)
    drop2 = Dropout(0.5)(dense2)
    dense3 = Dense(32, activation='relu')(drop2)
    densef = Dense(1, activation='sigmoid')(dense3)

    model = Model(inputs = inputs, outputs = densef)

    return model

Net = tst_1()
Net.compile(optimizer=Adam(), loss='binary_crossentropy', metrics=['accuracy'])

Net.summary()
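A minimal smoke test with synthetic data (the arrays below are placeholders, only meant to confirm that training runs and that the output is one sigmoid probability per sample):

    import numpy as np

    # Placeholder data: 16 samples, 3 channels of 1000 features each, binary labels.
    X = np.random.rand(16, 3, 1000).astype('float32')
    y = np.random.randint(0, 2, size=(16, 1))

    Net.fit(X, y, epochs=1, batch_size=4)
    print(Net.predict(X[:2]).shape)   # (2, 1)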