Prepending a downsampling layer to a pretrained ResNet50 model

Time: 2016-11-23 04:12:36

Tags: keras

I am using Keras 1.1.1 with the TensorFlow backend on Windows 7.

I am trying to prepend an image downsampler to a pretrained ResNet50 model.

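The sketch below is an assumption of the kind of wiring that produces this error, not the exact original code: the new Model is built from the pretrained network's own input tensor instead of the freshly created one, which leaves the new 400x400 Input unreachable.

from keras.applications.resnet50 import ResNet50
from keras.layers import Input, AveragePooling2D, Flatten, RepeatVector, Reshape, ZeroPadding2D
from keras.models import Model

pretrained = ResNet50(input_shape=(224, 224, 3), weights='imagenet', include_top=False)

# new 400x400 grayscale input, downsampled and padded to the 224x224x3 the network expects
inp = Input(shape=(400, 400, 1))             # becomes "input_2" (ResNet50 already owns "input_1")
x = AveragePooling2D(pool_size=(2, 2))(inp)  # (200, 200, 1)
x = Flatten()(x)
x = RepeatVector(3)(x)
x = Reshape((200, 200, 3))(x)
x = ZeroPadding2D(padding=(12, 12))(x)       # (224, 224, 3)
out = pretrained(x)

# building the model from the pretrained network's own input tensor leaves "input_2"
# unreachable, so Keras raises "Graph disconnected" for it
model = Model(pretrained.input, out)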

But I get an error that I am not sure how to resolve:


builtins.Exception: Graph disconnected: cannot obtain value for tensor Output("input_2:0", shape=(?, 400, 400, 1), dtype=float32) at layer "input_2". The following previous layers were accessed without issue: []

1 Answer:

Answer 0: (score: 0)

You can solve this with either the Functional API or the Sequential approach. Working examples of both are below:

from keras.applications.resnet50 import ResNet50
from keras.models import Sequential, Model
from keras.layers import AveragePooling2D, Flatten, RepeatVector, Reshape, ZeroPadding2D, Input, Dense

pretrained = ResNet50(input_shape=(224, 224, 3), weights='imagenet', include_top=False)

# Sequential method
model_1 = Sequential()
model_1.add(AveragePooling2D(pool_size=(2, 2), input_shape=(400, 400, 1)))  # (200, 200, 1)
model_1.add(Flatten())                                                      # (40000,)
model_1.add(RepeatVector(3))                                                # (3, 40000)
model_1.add(Reshape((200, 200, 3)))                                         # reshape to (200, 200, 3)
model_1.add(ZeroPadding2D(padding=(12, 12)))                                # (224, 224, 3)
model_1.add(pretrained)
model_1.add(Dense(1))

# Functional API method
inputs = Input(shape=(400, 400, 1))
x = AveragePooling2D(pool_size=(2, 2))(inputs)  # (200, 200, 1)
x = Flatten()(x)
x = RepeatVector(3)(x)                          # (3, 40000)
x = Reshape((200, 200, 3))(x)
x = ZeroPadding2D(padding=(12, 12))(x)          # (224, 224, 3)
x = pretrained(x)
preds = Dense(1)(x)

model_2 = Model(inputs, preds)

model_1.summary()
model_2.summary()

Summaries (the Xception shown here stands in for the ResNet):

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
average_pooling2d_1 (Average (None, 200, 200, 1)       0
_________________________________________________________________
flatten_1 (Flatten)          (None, 40000)             0
_________________________________________________________________
repeat_vector_1 (RepeatVecto (None, 3, 40000)          0
_________________________________________________________________
reshape_1 (Reshape)          (None, 200, 200, 3)       0
_________________________________________________________________
zero_padding2d_1 (ZeroPaddin (None, 224, 224, 3)       0
_________________________________________________________________
xception (Model)             (None, 7, 7, 2048)        20861480
_________________________________________________________________
dense_1 (Dense)              (None, 7, 7, 1)           2049
=================================================================
Total params: 20,863,529
Trainable params: 20,809,001
Non-trainable params: 54,528
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_2 (InputLayer)         (None, 400, 400, 1)       0
_________________________________________________________________
average_pooling2d_2 (Average (None, 200, 200, 1)       0
_________________________________________________________________
flatten_2 (Flatten)          (None, 40000)             0
_________________________________________________________________
repeat_vector_2 (RepeatVecto (None, 3, 40000)          0
_________________________________________________________________
reshape_2 (Reshape)          (None, 200, 200, 3)       0
_________________________________________________________________
zero_padding2d_2 (ZeroPaddin (None, 224, 224, 3)       0
_________________________________________________________________
xception (Model)             (None, 7, 7, 2048)        20861480
_________________________________________________________________
dense_2 (Dense)              (None, 7, 7, 1)           2049
=================================================================
Total params: 20,863,529
Trainable params: 20,809,001
Non-trainable params: 54,528
_________________________________________________________________
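As a quick sanity check (not part of the original answer), either model can be run directly on a dummy 400x400 grayscale batch; the output shape matches the final Dense row in the summaries above:

import numpy as np

# dummy batch of two 400x400 grayscale images
batch = np.random.rand(2, 400, 400, 1).astype('float32')
print(model_1.predict(batch).shape)  # expected: (2, 7, 7, 1)
print(model_2.predict(batch).shape)  # expected: (2, 7, 7, 1)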

Both approaches work fine. If you intend to freeze the pretrained model and let the layers before/after it learn first, and then fine-tune the whole model, the approach I have found to work is the following:

# given the same resnet model as before...
from keras.models import load_model
model = load_model('modelname.h5')

# pull out the nested model
nested_model = model.layers[5] # assuming the nested model sits at index 5

# loop over the nested model to allow training
for l in nested_model.layers:
  l.trainable=True

# insert the trainable pretrained model back into the original
model.layers[5] = nested_model
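
Note that changes to the trainable flags only take effect once the model is recompiled, so a fine-tuning run would typically continue along these lines (the optimizer, loss, and data names below are placeholders, not from the original answer):

from keras.optimizers import Adam

# recompile so the updated trainable flags are picked up;
# a small learning rate is common when fine-tuning a pretrained network
model.compile(optimizer=Adam(lr=1e-5), loss='mse')

# model.fit(x_train, y_train, ...)  # x_train / y_train are placeholder names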