How to use TimeDistributed before a GRU in Keras?

Time: 2018-12-07 09:09:04

Tags: python tensorflow keras

I am building a CNN model whose input shape is (None, 100, 100, 1) and whose output size is (400 * 1), but when I run the model an error occurs. Here is my model:

from keras.layers import (Input, Conv2D, BatchNormalization, LeakyReLU, MaxPool2D,
                          Add, TimeDistributed, GRU, Flatten, Dense)

# Input: single-channel 100x100 images
visible_image1 = Input(shape=(100, 100, 1))

conv_1 = Conv2D(filters=64, kernel_size=(5, 5), padding='same')(visible_image1)
BatchNor_1 = BatchNormalization()(conv_1)
relu_1 = LeakyReLU(0.2)(BatchNor_1)
pool_1 = MaxPool2D(pool_size=(3, 3), strides=(3, 3))(relu_1)   # 100x100 -> 33x33

conv_2 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(pool_1)
BatchNor_2 = BatchNormalization()(conv_2)
relu_2 = LeakyReLU(0.2)(BatchNor_2)

conv_3 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_2)
BatchNor_3 = BatchNormalization()(conv_3)
relu_3 = LeakyReLU(0.2)(BatchNor_3)

# First residual block: two 256-filter branches summed together
conv_4 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_3)
BatchNor_4 = BatchNormalization()(conv_4)

conv_5 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(BatchNor_3)
BatchNor_5 = BatchNormalization()(conv_5)

add_1 = Add()([BatchNor_4, BatchNor_5])
relu_4 = LeakyReLU(0.2)(add_1)

conv_6 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_4)
BatchNor_6 = BatchNormalization()(conv_6)
relu_5 = LeakyReLU(0.2)(BatchNor_6)

conv_7 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_5)
BatchNor_7 = BatchNormalization()(conv_7)
relu_6 = LeakyReLU(0.2)(BatchNor_7)

conv_8 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_6)
BatchNor_8 = BatchNormalization()(conv_8)

# Second residual block: skip connection from relu_4
add_2 = Add()([BatchNor_8, relu_4])
relu_7 = LeakyReLU(0.2)(add_2)

conv_9 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_7)
BatchNor_9 = BatchNormalization()(conv_9)
relu_8 = LeakyReLU(0.2)(BatchNor_9)

conv_10 = Conv2D(filters=128, kernel_size=(5, 5), padding='same')(relu_8)
BatchNor_10 = BatchNormalization()(conv_10)
relu_9 = LeakyReLU(0.2)(BatchNor_10)

conv_11 = Conv2D(filters=256, kernel_size=(5, 5), padding='same')(relu_9)
BatchNor_11 = BatchNormalization()(conv_11)

# Third residual block: skip connection from relu_7
add_3 = Add()([BatchNor_11, relu_7])
relu_10 = LeakyReLU(0.2)(add_3)                 # shape: (None, 33, 33, 256)

# Recurrent part: this is where the error is raised
time_1 = TimeDistributed(Dense(256))(relu_10)   # still 4D: (None, 33, 33, 256)
gru_1 = GRU(256, return_sequences=True)(time_1) # GRU expects a 3D input

flatten_1 = Flatten()(gru_1)
fc_1 = Dense(3000, activation="relu")(flatten_1)
fc_2 = Dense(1000, activation="relu")(fc_1)
fc_3 = Dense(401, activation="softmax")(fc_2)

Error:

Input 0 is incompatible with layer gru_3: expected ndim=3, found ndim=4

As far as I know, the output of relu_10 has shape (None, 33, 33, 256), and I expected that after TimeDistributed the tensor would be 3D, since the GRU layer requires a 3D input. My question is: how should I arrange the dimensions so that the layer going into the GRU is 3D?
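Would something like the following be a correct way to get a 3D tensor for the GRU? This is only a sketch of my idea (the Reshape target and the choice of treating the 33 rows as timesteps are my own assumptions, not taken from any paper):

from keras.layers import Reshape   # extra import for this sketch

# relu_10 has shape (None, 33, 33, 256); GRU wants (batch, timesteps, features).
# Assumption: treat each of the 33 rows as one timestep and flatten the
# remaining 33 * 256 values into that timestep's feature vector.
seq = Reshape((33, 33 * 256))(relu_10)           # -> (None, 33, 8448)
gru_1 = GRU(256, return_sequences=True)(seq)     # -> (None, 33, 256)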

What does TimeDistributed actually do?
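For context, here is a tiny sketch of what I think TimeDistributed does (the layer sizes are made up for illustration): it applies the same wrapped layer to every temporal slice of its input, so it does not reduce the rank of the tensor, which would explain why time_1 above is still 4D.

from keras.layers import Input, Dense, TimeDistributed
from keras.models import Model

x = Input(shape=(10, 16))            # (batch, 10 timesteps, 16 features)
y = TimeDistributed(Dense(8))(x)     # same Dense applied to each timestep
print(Model(x, y).output_shape)      # (None, 10, 8) -- still 3D, one Dense per step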

1 Answer:

Answer 0 (score: 0):

I just want to reproduce the results of the paper "enter link description here", but I cannot get past the TimeDistributed problem, and I would like to know which other layers the paper uses. The GRU is only there to capture the relationships within the input data. I really want to know what is wrong with my code; I am new to Keras and have only been coding with it for a little over two weeks.