How to fix "operands could not be broadcast together with shapes (56,56,256) (56,56,64)" in ResNet50

Time: 2019-05-11 12:24:50

Tags: python, resnet

I am trying to build my own ResNet50 network with pre-activation residual modules (as described in the He et al. 2015 & 2016 publications). My first stage is: input => BN => CONV => BN => ACT => POOL(max), which reduces the image tensor from (1, 224, 224, 1) to (1, 56, 56, 64). In the next stage I am trying to build the first residual module out of three blocks. At the end of the module an addition has to be performed between the shortcut and the main path, but it seems the two are not compatible.

# imports (I am using standalone Keras here; swap in tensorflow.keras if that is what you use)
from keras.layers import Input, BatchNormalization, Conv2D, Activation, MaxPooling2D, Add
from keras.models import Model
from keras import backend as K

K.clear_session()

# the input tensor
X_input = Input(shape=(224,224,1))

# stage 1: input => BN => CONV => BN => ACT => POOL(max)
bn1_1   = BatchNormalization(epsilon=2e-5,momentum=0.9)(X_input)
conv1_1 = Conv2D(filters=64, kernel_size=(7,7), strides=(2,2), padding='same')(bn1_1)
bn1_2   = BatchNormalization(epsilon=2e-5,momentum=0.9)(conv1_1)
act1_1  = Activation('relu')(bn1_2)   
pool1_1 = MaxPooling2D(pool_size=(3,3), strides=(2,2), padding='same')(act1_1)

# tensor shape after stage 1 is (1, 56, 56, 64)
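# (224 -> 112 from the stride-2 conv, 112 -> 56 from the stride-2 max-pool,
#  and the channel count goes from 1 to 64 through the 64 conv filters)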

# first residual module

# copy the input
X_shortcut = pool1_1

# get the filters size
f1,f2,f3 = [64,64,256]

bnEps = 2e-5
bnMom = 0.9

# first block
bn1   = BatchNormalization(epsilon=bnEps, momentum=bnMom)(pool1_1)
act1  = Activation('relu')(bn1)
conv1 = Conv2D(filters=f1, kernel_size=(1,1), strides=(1,1), padding='valid')(act1)

# second block
bn2   = BatchNormalization(epsilon=bnEps, momentum=bnMom)(conv1)
act2  = Activation('relu')(bn2)
conv2 = Conv2D(filters=f2, kernel_size=(3,3), strides=(1,1), padding='same')(act2)

# third block
bn3   = BatchNormalization(epsilon=bnEps, momentum=bnMom)(conv2)
act3  = Activation('relu')(bn3)
conv3 = Conv2D(filters=f3, kernel_size=(1,1), strides=(1,1), padding='valid')(act3)

# addition
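# note: at this point the two tensors differ in the channel dimension:
#   K.int_shape(conv3)   -> (None, 56, 56, 256)
#   K.int_shape(pool1_1) -> (None, 56, 56, 64)
# the Add() call below is the line that raises the ValueError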
add = Add()([conv3,pool1_1])

# create a model
model = Model(inputs=X_input, outputs=conv3)
model.compile(optimizer='adam' ,loss='mse')

The error I get:

ValueError: operands could not be broadcast together with shapes (56,56,256) (56,56,64)

I don't know how to match the shapes between these stages :/
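From re-reading the He et al. 2016 pre-activation paper, my guess is that when the channel count changes, the shortcut itself has to be projected with a 1x1 convolution rather than taken straight from the input. Below is a minimal sketch of what I think that would look like for this module (the variable names follow my code above, and I apply the projection to act1 as in the pre-activation variant; I am not sure this is right):

# project the shortcut to 256 channels so it matches conv3 before the addition
shortcut = Conv2D(filters=f3, kernel_size=(1,1), strides=(1,1), padding='valid')(act1)
add = Add()([conv3, shortcut])

model = Model(inputs=X_input, outputs=add)
model.compile(optimizer='adam', loss='mse')

Is this the correct way to match the shapes, or should the 1x1 convolution be applied to pool1_1 directly?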

0 Answers:

There are no answers yet.