I have been experimenting with convolutional layers in Keras. To that end, I built a convolutional autoencoder.
The autoencoder works correctly and as expected. However, I noticed that after adding an UpSampling2D layer, it became non-deterministic.
I set the numpy seed and the TensorFlow random seed before importing Keras. With that in place, all of my models became deterministic and I could reproduce my results — except the model mentioned above, the one with UpSampling2D. Does this layer get some special handling?
Below is my autoencoder code.
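For reference, the seeding I do before importing Keras looks roughly like the sketch below (the seed value 42 is arbitrary, and the TensorFlow call is shown as a comment so the snippet runs on its own; with TF 1.x it would be `tf.set_random_seed`):

```python
import os
import random
import numpy as np

# Fix all Python-level sources of randomness BEFORE importing Keras/TensorFlow,
# since graph-level seeding only affects ops created afterwards.
os.environ['PYTHONHASHSEED'] = '0'
random.seed(42)
np.random.seed(42)

# With TensorFlow 1.x, the graph-level seed would be set next:
#   import tensorflow as tf
#   tf.set_random_seed(42)

# Sanity check: re-seeding numpy reproduces the same draws.
a = np.random.rand(3)
np.random.seed(42)
b = np.random.rand(3)
assert np.allclose(a, b)
```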
from keras.layers import Input, Conv2D, Conv2DTranspose, MaxPooling2D, UpSampling2D
from keras.models import Model

input_img = Input(shape=(8, 8, 1))  # adapt this if using `channels_first` image data format

x = Conv2D(1, (3, 3), activation='tanh', padding='same')(input_img)
# x = BatchNormalization()(x)
x = MaxPooling2D((2, 2), padding='same')(x)
# x = Dropout(.1)(x)
x = Conv2D(10, (3, 3), activation='tanh', padding='same')(x)
encoded1 = MaxPooling2D((2, 2), padding='same')(x)

x = Conv2DTranspose(10, (3, 3), activation='tanh', padding='same')(encoded1)
x = UpSampling2D((2, 2))(x)
# x = Dropout(.1)(x)
x = Conv2DTranspose(1, (3, 3), activation='tanh', padding='same')(x)
# x = BatchNormalization()(x)
x = UpSampling2D((2, 2))(x)
# x = Dropout(.1)(x)
decoded = Conv2DTranspose(1, (3, 3), activation='sigmoid', padding='same')(x)

autoencoder = Model(input_img, decoded)
Here are the results without UpSampling2D:
Run 1:
Epoch 1/3
67343/67343 [==============================] - 5s 79us/step - loss: 0.6744
Epoch 2/3
67343/67343 [==============================] - 4s 58us/step - loss: 0.5914
Epoch 3/3
67343/67343 [==============================] - 4s 62us/step - loss: 0.5706
Run 2:
Epoch 1/3
67343/67343 [==============================] - 5s 79us/step - loss: 0.6744
Epoch 2/3
67343/67343 [==============================] - 4s 58us/step - loss: 0.5914
Epoch 3/3
67343/67343 [==============================] - 4s 62us/step - loss: 0.5706
And here are the results with UpSampling2D:
Run 1:
Epoch 1/3
67343/67343 [==============================] - 6s 88us/step - loss: 0.7458
Epoch 2/3
67343/67343 [==============================] - 4s 61us/step - loss: 0.6745
Epoch 3/3
67343/67343 [==============================] - 5s 75us/step - loss: 0.6281
Run 2:
Epoch 1/3
67343/67343 [==============================] - 6s 93us/step - loss: 0.7344
Epoch 2/3
67343/67343 [==============================] - 5s 76us/step - loss: 0.6833
Epoch 3/3
67343/67343 [==============================] - 5s 77us/step - loss: 0.6078
Is there a solution for this?
Thanks