For a stateful LSTM, is there a way in Keras to make the size of y different from the batch size of X?

Time: 2020-04-21 07:27:41

Tags: keras lstm lstm-stateful

I only want to update the weights at the end of each batch, and I know that is the default behavior. What I don't understand is why X and y need to have the same number of samples. If I have X.shape (12, 32, 64), where the batch size is 12 and there is exactly one batch, why isn't a single target of y.shape (1, N) enough?

I only want to backpropagate after the network has seen the entire batch. Why does every item in the batch need its own label?
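As far as I can tell, Keras computes the loss of a batch as the mean of the per-sample losses,

loss = (1/B) * sum_{i=1..B} l(y_i, y_hat_i)

so it insists on one target y_i for each of the B input samples, even though the weight update itself only happens once per batch.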

Sample code:

import numpy as np
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.utils import plot_model

def create_model(batch, timesteps, features):
    # A stateful LSTM requires a fixed batch size in the input shape.
    inputTensor1 = Input(batch_shape=(batch, timesteps, features))
    lstm1 = LSTM(32, stateful=True, dropout=0.2)(inputTensor1)
    x = Dense(4, activation='linear')(lstm1)
    model = Model(inputs=inputTensor1, outputs=x)
    model.compile(loss='mse', optimizer='rmsprop', metrics=['mse'])
    print(model.summary())
    plot_model(model, show_shapes=True, show_layer_names=True)

    return model

X = np.load("").reshape(1280, 12, 640, 32)   # 1280 batches of 12 samples, 640 timesteps, 32 features
y = np.load("").reshape(1280, 1, 4)          # one 4-dimensional target per batch

prop_train = 0.8
ntrain = int(X.shape[0] * prop_train)        # 80/20 train/validation split

X_train, X_val = X[:ntrain], X[ntrain:]
y_train, y_val = y[:ntrain], y[ntrain:]

model = create_model(12, 640, 32)

for j in np.arange(1):                       # epochs (just one here)
    for i in np.arange(X_train.shape[0]):
        print(i)
        model.reset_states()                 # reset LSTM state before each batch
        history = model.train_on_batch(X_train[i], y_train[i])

This is where I get the error:

ValueError: Input arrays should have the same number of samples as target arrays. Found 12 input samples and 1 target samples
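A possible workaround (just a sketch, assuming every item in a batch shares the same label) is to tile the single target across the batch so that the per-sample check passes:

for i in np.arange(X_train.shape[0]):
    model.reset_states()
    # Repeat the (1, 4) target 12 times -> (12, 4); every sample in the
    # batch then contributes the same label to the averaged loss.
    y_batch = np.repeat(y_train[i], X_train[i].shape[0], axis=0)
    history = model.train_on_batch(X_train[i], y_batch)

Alternatively, if the 12 items in each batch are really contiguous chunks of one long sequence, they could be folded into the time axis so that each call has exactly one sample and one target (again only a sketch under that assumption):

# Fold the 12 chunks of 640 steps into one 7680-step sequence per batch,
# so the batch size becomes 1 and a single (1, 4) target is exactly
# what Keras expects. Only valid if the chunks are contiguous in time.
X_long = X.reshape(1280, 1, 12 * 640, 32)
model_long = create_model(1, 12 * 640, 32)

for i in np.arange(X_long.shape[0]):
    model_long.reset_states()
    history = model_long.train_on_batch(X_long[i], y[i])   # y[i] has shape (1, 4)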

0 Answers

There are no answers yet.