How to remove the FC layers from a fine-tuned Keras model

Asked: 2019-07-08 03:11:35

Tags: python keras artificial-intelligence

So, I have fine-tuned a ResNet50 model with the following architecture:

from keras import models, layers
from keras.layers import Conv2D, MaxPooling2D, Flatten

model = models.Sequential()
model.add(resnet)  # pretrained ResNet50 convolutional base
model.add(Conv2D(512, (3, 3), activation='relu'))
model.add(Conv2D(512, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))
model.add(Flatten())
model.add(layers.Dense(2048, activation='relu'))
model.add(layers.Dropout(0.5))
model.add(layers.Dense(4096, activation='relu'))
model.add(layers.Dropout(0.5))
model.add(layers.Dense(736, activation='softmax'))  # Output layer
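
Here, resnet is the pretrained convolutional base. Roughly, it was created like this (the exact input shape and layer freezing below are only illustrative, not taken from my actual code):

from keras.applications.resnet50 import ResNet50

# Assumed setup: the ResNet50 convolutional base without its classifier head.
# The input shape is just an example; use whatever the images were resized to.
resnet = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Optionally freeze the pretrained layers while fine-tuning the new head.
for layer in resnet.layers:
    layer.trainable = False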

So now I have a saved model (.h5) that I want to use as the input to another model, but without its last layer. With the plain ResNet50 model I would normally do it like this:

from keras import backend as K
from keras.applications import resnet50
from keras.layers import GlobalAveragePooling2D, Dense, Dropout, Lambda
from keras.models import Model

def base_model():
    resnet = resnet50.ResNet50(weights="imagenet", include_top=False)
    x = resnet.output
    x = GlobalAveragePooling2D()(x)
    x = Dense(4096, activation='relu')(x)
    x = Dropout(0.6)(x)
    x = Dense(4096, activation='relu')(x)
    x = Dropout(0.6)(x)
    x = Lambda(lambda x_: K.l2_normalize(x_, axis=1))(x)
    return Model(inputs=resnet.input, outputs=x)

But that approach doesn't work with my fine-tuned model; it gives me an error. I am now trying the following, but it still doesn't work:

def base_model():
    resnet = load_model("../Models/fine_tuned_model/fine_tuned_resnet50.h5")
    x = resnet.layers.pop()
    #resnet = resnet50.ResNet50(weights="imagenet", include_top=False)
    #x = resnet.output
    #x = GlobalAveragePooling2D()(x)
    x = Dense(4096, activation='relu')(x)
    x = Dropout(0.6)(x)
    x = Dense(4096, activation='relu')(x)
    x = Dropout(0.6)(x)
    x = Lambda(lambda  x_: K.l2_normalize(x,axis=1))(x)
    return Model(inputs=resnet.input, outputs=x)
enhanced_resent = base_model()

This is the error it gives me:

Layer dense_3 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.core.Dense'>. Full input: [<keras.layers.core.Dense object at 0x000001C61E68E2E8>]. All inputs to the layer should be tensors.

I would appreciate any guidance on this; I honestly don't know whether it's even possible. Thanks a lot.

1 Answer:

Answer 0 (score: 0):

I fiddled with this for an hour and finally figured it out. Here is how you do it:

from keras import backend as K
from keras.layers import Dense, Dropout, Lambda
from keras.models import Model, load_model

def base_model():
    resnet = load_model("../Models/fine_tuned_model/42-0.85.h5")
    x = resnet.layers[-2].output  # output tensor of the second-to-last layer
    x = Dense(4096, activation='relu', name="FC1")(x)
    x = Dropout(0.6, name="FCDrop1")(x)
    x = Dense(4096, activation='relu', name="FC2")(x)
    x = Dropout(0.6, name="FCDrop2")(x)
    x = Lambda(lambda x_: K.l2_normalize(x_, axis=1))(x)
    return Model(inputs=resnet.input, outputs=x)
enhanced_resent = base_model()
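
And just to show how I then use it for feature extraction (the input shape and preprocessing here are only illustrative):

import numpy as np
from keras.applications.resnet50 import preprocess_input

# Illustrative dummy batch; use whatever input shape the fine-tuned model expects.
batch = np.random.rand(1, 224, 224, 3).astype("float32")
batch = preprocess_input(batch)

features = enhanced_resent.predict(batch)
print(features.shape)  # (1, 4096) L2-normalised feature vectors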

This works perfectly. I hope it helps someone else, because I had never seen it in any tutorial before.

x = resnet.layers[-2].output

This gets you the layer you want, but you need to know which index that layer sits at. -2 is the second-to-last layer, the FC layer I want for feature extraction, rather than the final classification layer. The index can be found with

model.summary()
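
If you'd rather look the index up programmatically instead of counting rows in the summary, something like this works (just a convenience sketch; adjust the path to your saved model):

from keras.models import load_model

model = load_model("../Models/fine_tuned_model/42-0.85.h5")

# Print each layer's index, name and output shape to pick the one you need.
for i, layer in enumerate(model.layers):
    print(i, layer.name, layer.output_shape)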