Does upsampling in Keras copy the data?

Time: 2018-09-20 11:05:17

Tags: machine-learning neural-network keras conv-neural-network

I am trying to understand UpSampling2D in Keras. I wrote the simple piece of code below:

from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Conv2D, UpSampling2D

# MNIST is loaded only to derive the 28x28x1 input shape; the model is never trained here
(trainX, trainY), (testX, testY) = mnist.load_data()
input_shape = (trainX.shape[1], trainX.shape[2], 1)

model = Sequential()
layer1 = Conv2D(64, 3, input_shape=input_shape, padding='same', activation='relu')
layer2 = UpSampling2D()
layer3 = Conv2D(128, 3, padding='same', activation='relu')
model.add(layer1)
model.add(layer2)
model.add(layer3)
model.compile(loss='binary_crossentropy', optimizer='adam')
print(model.summary())

My output is:

Layer (type)                 Output Shape              Param #
=================================================================
conv2d_1 (Conv2D)            (None, 28, 28, 64)        640
_________________________________________________________________
up_sampling2d_1 (UpSampling2 (None, 56, 56, 64)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 56, 56, 128)       73856
=================================================================
Total params: 74,496
Trainable params: 74,496
Non-trainable params: 0
_________________________________________________________________
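For reference, the two convolution parameter counts in the summary can be reproduced by hand:

# Quick check of the parameter counts shown in the summary above
# conv2d_1: 3x3 kernel * 1 input channel * 64 filters, plus 64 biases
print(3 * 3 * 1 * 64 + 64)       # 640
# conv2d_2: 3x3 kernel * 64 input channels * 128 filters, plus 128 biases
print(3 * 3 * 64 * 128 + 128)    # 73856
# up_sampling2d_1 has no weights at all, hence 0 parameters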

Question:

I understand how the 640 and 73,856 parameters are computed, but when I look at the upsampling layer I see zero learnable parameters. In this case it enlarges 28 × 28 to 56 × 56. How is the 56 × 56 output filled in relative to the 28 × 28 input?
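A minimal sketch, assuming the plain keras package and the layer's defaults, of the behaviour in question: UpSampling2D with its default size=(2, 2) simply repeats each pixel into a 2 × 2 block (nearest-neighbour copying), which is consistent with it having no trainable parameters.

import numpy as np
from keras.models import Sequential
from keras.layers import UpSampling2D

# A tiny 2x2 single-channel "image" with a batch dimension: shape (1, 2, 2, 1)
x = np.array([[1., 2.],
              [3., 4.]], dtype='float32').reshape(1, 2, 2, 1)

# Default size=(2, 2): every input value is copied into a 2x2 block,
# so there is nothing for the layer to learn.
demo = Sequential([UpSampling2D(input_shape=(2, 2, 1))])
y = demo.predict(x)

print(y.reshape(4, 4))
# [[1. 1. 2. 2.]
#  [1. 1. 2. 2.]
#  [3. 3. 4. 4.]
#  [3. 3. 4. 4.]]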

0 Answers:

There are no answers yet.