Keras negative dimension size error with Conv1D

Date: 2020-10-21 14:04:08

Tags: python tensorflow machine-learning keras

I am trying to improve my model and read an article about improving it with a CNN and a Bi-LSTM. Now I am trying to get the code working, but I am stuck. The error I get is

ValueError: Negative dimension size caused by subtracting 2 from 1 for '{{node time_distributed/conv1d/conv1d}} = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="VALID", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true](time_distributed/conv1d/conv1d/ExpandDims, time_distributed/conv1d/conv1d/ExpandDims_1)' with input shapes: [?,1,1,20], [1,2,20,24].
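The arithmetic behind this error can be checked without TensorFlow: with `padding="VALID"` (Keras's default `padding='valid'`), a Conv1D layer's output length is `in_len - kernel_size + 1`, and here the layer receives a sequence of length 1. A minimal sketch in plain Python:

```python
def conv1d_valid_len(in_len, kernel_size):
    """Output length of a stride-1 Conv1D with 'valid' padding."""
    return in_len - kernel_size + 1

# TimeDistributed(Conv1D(kernel_size=2)) here sees a length-1 sequence:
print(conv1d_valid_len(1, 2))   # 0 -> invalid, triggers the ValueError
# A sequence of length 20 would be fine:
print(conv1d_valid_len(20, 2))  # 19
```

Any output length below 1 is rejected, which is exactly the "subtracting 2 from 1" in the error message.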

My input shapes are as follows:

X (13800, 1, 20)
y (13800, 1)

The model code is as follows:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv1D, MaxPooling1D, TimeDistributed,
                                     Flatten, Bidirectional, LSTM, Dense,
                                     Dropout)

def ml_model(train_x, train_y):
    # define parameters
    verbose, epochs, batch_size = 0, 250, 24  # 250 epochs
    n_timesteps, n_features, n_outputs = train_x.shape[1], train_x.shape[2], train_y.shape[1]
    # reshape output to (samples, timesteps, 1)
    train_y = train_y.reshape((train_y.shape[0], train_y.shape[1], 1))
    model = Sequential()
    model.add(TimeDistributed(Conv1D(filters=24, kernel_size=2, activation='relu'),
                              input_shape=(None, n_timesteps, n_features)))
    model.add(TimeDistributed(MaxPooling1D(pool_size=2)))
    model.add(TimeDistributed(Conv1D(filters=64, kernel_size=2, activation='relu')))
    model.add(TimeDistributed(MaxPooling1D(pool_size=2)))
    model.add(TimeDistributed(Flatten()))
    model.add(Bidirectional(LSTM(64, activation='relu', return_sequences=True)))
    model.add(Bidirectional(LSTM(64, activation='relu')))
    model.add(Dense(128))
    model.add(Dropout(0.5))
    model.add(Dense(1))
    model.compile(optimizer='adam', loss='mse')

    model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size, verbose=verbose)
    return model
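Since `n_timesteps` is 1, `TimeDistributed` hands each Conv1D call a `(1, 20)` slice: a length-1 sequence with 20 channels, which is too short for `kernel_size=2`. One possible workaround (an assumption about the intended data layout, not a confirmed fix) is to move the 20 values onto the sequence axis before training, so the convolution sees a length-20 sequence:

```python
import numpy as np

# Placeholder array with the question's shape: (samples, timesteps, features).
X = np.zeros((13800, 1, 20), dtype=np.float32)

# Reshape so each timestep carries a length-20 sequence with 1 channel;
# Conv1D(kernel_size=2) and both MaxPooling1D(pool_size=2) stages then
# have enough length to operate on.
X_seq = X.reshape((13800, 1, 20, 1))
print(X_seq.shape)  # (13800, 1, 20, 1)
# With this layout, the first layer's input_shape would be (None, 20, 1).
```

Whether this matches the article's intent depends on what the 20 features represent; if they are independent features rather than an ordered sequence, convolving over them may not be meaningful.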
