I am trying to add the input to parallel-path CNNs to form a residual architecture, but I am running into a dimension mismatch.
from keras import layers, Model
input_shape = (128,128,3) # Change this accordingly
my_input = layers.Input(shape=input_shape) # one input
def parallel_layers(my_input, parallel_id=1):
    x = layers.SeparableConv2D(32, (9, 9), activation='relu', name='conv_1_'+str(parallel_id))(my_input)
    x = layers.MaxPooling2D(2, 2)(x)
    x = layers.SeparableConv2D(64, (9, 9), activation='relu', name='conv_2_'+str(parallel_id))(x)
    x = layers.MaxPooling2D(2, 2)(x)
    x = layers.SeparableConv2D(128, (9, 9), activation='relu', name='conv_3_'+str(parallel_id))(x)
    x = layers.MaxPooling2D(2, 2)(x)
    x = layers.Flatten()(x)
    x = layers.Dropout(0.5)(x)
    x = layers.Dense(512, activation='relu')(x)
    return x
parallel1 = parallel_layers(my_input, 1)
parallel2 = parallel_layers(my_input, 2)
concat = layers.Concatenate()([parallel1, parallel2])
concat=layers.Add()(concat,my_input)
x = layers.Dense(128, activation='relu')(concat)
x = layers.Dense(7, activation='softmax')(x)
final_model = Model(inputs=my_input, outputs=x)
final_model.fit_generator(train_generator, steps_per_epoch=nb_train_samples // batch_size,
                          epochs=epochs, validation_data=validation_generator,
                          validation_steps=nb_validation_samples // batch_size)
I get the following error:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-48-163442df0d4c> in <module>()
1 concat = layers.Concatenate()([parallel1, parallel2])
----> 2 concat=layers.Add()(concat,my_input)
3 x = layers.Dense(128, activation='relu')(parallel2)
4 x = Dense(7, activation='softmax')(x)
5
TypeError: __call__() takes 2 positional arguments but 3 were given
I am using Keras version 2.1.6. Please help me fix this so that final_model.summary() works.
Answer 0 (score: 0)
Define your Add layer this way:
concat=layers.Add()([concat,my_input])
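For context, Keras merge layers such as Add and Concatenate are called with a single list of tensors, not with several positional arguments, and the tensors passed to Add must also have matching shapes. A minimal sketch of the calling convention, using two assumed 512-length inputs:

from keras import layers
# Both inputs have the same shape, so Add can sum them element-wise.
a = layers.Input(shape=(512,))
b = layers.Input(shape=(512,))
s = layers.Add()([a, b])    # correct: one list argument
# s = layers.Add()(a, b)    # raises: __call__() takes 2 positional arguments but 3 were given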
Answer 1 (score: 0)
You have to remove the following line:
concat=layers.Add()(concat,my_input)
It does not make any sense there. You have one function that takes the input and feeds it into two parallel models. The outputs of both (parallel1 and parallel2) are vectors of length 512. You can then either Concatenate them, which gives a vector of length 1024, or Add them, which gives a vector of length 512 again. concat then goes through further Dense layers.
So, in short, remove the following line:
concat=layers.Add()(concat,my_input)
If you want to concatenate them and get a vector of length 1024, keep the rest of the code as it is. Otherwise, if you want to add them and get a vector of length 512, replace this line:
concat = layers.Concatenate()([parallel1, parallel2])
with this:
concat = layers.Add()([parallel1, parallel2])
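Putting that together, here is a minimal sketch of the Add variant, assuming the 128x128x3 input, the parallel_layers() helper, and the 7 output classes from the question:

from keras import layers, Model

input_shape = (128, 128, 3)
my_input = layers.Input(shape=input_shape)

# Two parallel branches, each ending in a Dense(512) output.
parallel1 = parallel_layers(my_input, 1)
parallel2 = parallel_layers(my_input, 2)

# Element-wise sum of two (None, 512) tensors, still (None, 512).
merged = layers.Add()([parallel1, parallel2])
x = layers.Dense(128, activation='relu')(merged)
x = layers.Dense(7, activation='softmax')(x)

final_model = Model(inputs=my_input, outputs=x)
final_model.summary()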