Keras - CNN input shape incompatibility

Date: 2018-11-02 13:25:33

Tags: python tensorflow keras

I am working on binary classification. My code runs fine with a Keras LSTM, but when I use a CNN I get an input shape incompatibility error.

This is the ValueError I get:

ValueError: Error when checking target: expected dense_61 to have 3 dimensions, but got array with shape (24, 1)

This is my CNN code using Keras:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Dense

model = Sequential()
inputBatch = inputBatch.reshape(24, 30, 1)
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(Conv1D(64, 3, activation='relu'))
model.add(MaxPooling1D(pool_size=4, strides=None, padding='valid'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(inputBatch, ponlabel, batch_size=24, epochs=20,
          validation_data=(inputBatch, ponlabel))

I am working on binary classification; the output will be either positive or negative.

This is my LSTM code, for reference:

from keras.models import Sequential
from keras.layers import LSTM, Dense

inputBatch = inputBatch.reshape(24, 30, 1)
model = Sequential()
model.add(LSTM(50, input_shape=(30, 1)))
model.add(Dense(1, activation="relu"))
model.compile(loss='mean_absolute_error', optimizer='adam')
model.fit(inputBatch, ponlabel, batch_size=24, epochs=100, verbose=1)

inputBatch looks like the following (a (24, 30) array: 24 samples of 30 values each). It works with the LSTM code but not with the CNN. This is the input I used to train both models:

[[    0.  1288.  1288.  2214. 11266.  6923.   420.     0.     0.  8123.
      0.  7619.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.     0.     0.     0.     0.     0. 11516.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  9929. 11501.  6573. 11266.  7566.  9963.  4420. 10936.  3657.
   7050.     0.   408. 11501.  9988.  9963.  8455.  2879.  9322.  2047.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 11956.  5222.     0.     0. 12106.  6481.     0.  7093. 13756.
  12152.     0.     0.     0.     0. 10173.     0.  5173. 13756.  9371.
      0.  9956.     0.     0.  9716.     0.     0.     0.     0.     0.]
 [    0.     0.   420.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 11501.  1916.  2073. 10936.  6312.     0. 10193. 10322.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  2879.  7852. 11501.  1934.   286. 11483.     0. 12004. 11118.
      0. 12007.  9917. 12111.  1520. 10364.     0.  8840.  4195.  2910.
  10773. 11386. 12117.  9321.     0.     0.     0.     0.     0.     0.]
 [    0.  7885.  7171.  1034. 11501.  3103.  5842.  4395. 11871.  3328.
   6719.  5407.  1087.  8935.  2937.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  8894.   450. 11516.  7353. 11501. 11502. 11499.     0.  1319.
  11693. 11501.  5735. 12111.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  1087.  9565.    23.     0.  3045.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  5015. 11501.  3306. 12111.  9307.  5050. 11501.  3306.     0.
   3306. 12111.  1981. 11516.   615. 11516.     0.  3925. 11956.  9371.
   9013.  4395. 12111.  5048.     0.  3925.     0.     0.     0.     0.]
 [    0.  1287.   420.  4070. 11087.  7410. 12186.  2387. 12111.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.   128.  2073. 10936.  6312.     0. 10193. 10322.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 10173.  9435.  1320.  9322. 12018.  1055.  8840.  6684. 12051.
   2879.     0. 12018.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  1570.  5466.  9322.    34. 11480.  1356. 11270.   420.  2153.
  12006.  5157.  8840.  1055. 11516.  7387.  2356.  2163.  2879.  5541.
   9443.  7441.  1295.  5473.     0.     0.     0.     0.     0.     0.]
 [    0.  5014.     0.     0.  3651.  1087.    63.  6153.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0. 10608. 10855.  9562.     0.     0.     0.  4202.     0.     0.
      0. 10818. 10818.  5842.     0.  9963.     0. 11516. 10464.  7491.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  5952.  6133.   450.  7520.  5842.  3412. 10400.  3412.  2149.
   4891.  2979.  3456.   505.  9929. 11501.  9322.  1836. 11501. 12111.
   3435. 11105. 11266.   420.  9322.    34.     0.     0.     0.     0.]
 [    0.  1570.  5466.  9322.    34. 11480.  1356. 11270.   420.  2153.
  12006.  5157.  8840.  1055. 11516.  7387.  2356.  2163.  2879.  5541.
   9443.  7441.  1295.  5473.     0.     0.     0.     0.     0.     0.]
 [    0.  7544.     0.  1709.   420. 10936.  5222.  5842. 10407.  6937.
  11329.  2937.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  7785.  8840.     0.   420.  8603. 12003.  2879.  1087.  2356.
   2390. 12111.     0.     0.     0.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  8695.  8744.   420.  8840.  6697.  9267. 11516. 11203.  2260.
   8840.  7309.     0. 11100.  6041.     0.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.  9307. 12003.  2879.  6398.  9372.  4614.  5222.     0.     0.
   2879. 10364.  6923.  4709.  4860. 11871.     0.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]
 [    0.     0.  2844.  1287.   420. 11501.   610. 11501.   596.     0.
  12111.  3690.  6343.  9963.     0.     0.  8840.     0.     0.     0.
      0.     0.     0.     0.     0.     0.     0.     0.     0.     0.]]

4 Answers:

Answer 0 (score: 2)

The problem is the output shape. Since you are using a CNN, the output is 3D (samples, width, channels), and the Dense layer operates on the last dimension, which gives you a 3D output. But you need a 2D output, so you need to add a Flatten layer:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

model = Sequential()
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(Conv1D(64, 3, activation='relu'))
model.add(MaxPooling1D(pool_size=4, strides=None, padding='valid'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Conv1D(128, 3, activation='relu'))
model.add(Dropout(0.5))
model.add(Flatten())  # collapses the 3D conv output to 2D before the Dense head
model.add(Dense(1, activation='sigmoid'))

You can compare the output shapes of this model and the original one by running model.summary() on each.
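
For reference, here is a minimal sketch of that comparison, reusing the asker's layer stack; the shape comments show what summary() would report for a (30, 1) input:

from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

def build_trunk():
    # The convolutional stack shared by both models.
    model = Sequential()
    model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))     # (None, 28, 64)
    model.add(Conv1D(64, 3, activation='relu'))                          # (None, 26, 64)
    model.add(MaxPooling1D(pool_size=4, strides=None, padding='valid'))  # (None, 6, 64)
    model.add(Conv1D(128, 3, activation='relu'))                         # (None, 4, 128)
    model.add(Conv1D(128, 3, activation='relu'))                         # (None, 2, 128)
    model.add(Dropout(0.5))                                              # (None, 2, 128)
    return model

# Original model: Dense acts on the last axis, so the output stays 3D.
original = build_trunk()
original.add(Dense(1, activation='sigmoid'))   # (None, 2, 1) <- 3D, mismatch with (24, 1) targets
original.summary()

# Fixed model: Flatten collapses (2, 128) into a 256-long vector first.
fixed = build_trunk()
fixed.add(Flatten())                           # (None, 256)
fixed.add(Dense(1, activation='sigmoid'))      # (None, 1) <- matches the (24, 1) targets
fixed.summary()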

Answer 1 (score: 0)

This kind of error can also occur when the input images are of different sizes.

To add more information (I can't comment since I don't have enough reputation): it might help to include the full stack trace.

Answer 2 (score: 0)

The incompatible shape comes from ponlabel. For the LSTM, its shape is (24, 1). But the CNN uses binary_crossentropy for the loss, so it will have two target classes. That means that for the CNN, ponlabel would have to have shape (24, 2, 1).
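
For what it's worth, a sketch of what two-class targets could look like using keras.utils.to_categorical; note that it produces shape (24, 2), not (24, 2, 1), and would pair with a 2-unit softmax head and categorical_crossentropy rather than the sigmoid setup in the question. The ponlabel values here are hypothetical 0/1 labels:

import numpy as np
from keras.utils import to_categorical

# Hypothetical stand-in for ponlabel: 24 binary class labels.
ponlabel = np.random.randint(0, 2, size=(24, 1))

# One-hot encode: (24, 1) -> (24, 2), one column per class.
onehot = to_categorical(ponlabel, num_classes=2)
print(onehot.shape)  # (24, 2)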

Answer 3 (score: 0)

For the CNN loss, you need to use MSE or categorical cross-entropy.
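
If one did switch to categorical cross-entropy, the output layer would need two softmax units and one-hot targets. A minimal sketch (a shortened conv stack for illustration, not the asker's full model, and assuming the Flatten fix from answer 0):

from keras.models import Sequential
from keras.layers import Conv1D, Flatten, Dense

model = Sequential()
model.add(Conv1D(64, 3, activation='relu', input_shape=(30, 1)))
model.add(Flatten())
model.add(Dense(2, activation='softmax'))  # two units, one per class
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
# Targets would then need one-hot shape (num_samples, 2).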