How to feed multiple layers as timesteps to an LSTM in Keras

Date: 2019-07-27 05:09:01

Tags: tensorflow keras lstm recurrent-neural-network

I want to feed two separate neural networks into an LSTM as 2 timesteps. Here is my code:

input1 = Input(shape=(self.state_size,1))
input2 = Input(shape=(self.state_size,1))

out1 = Conv1D(12, 5, padding="SAME", activation="relu")(input1)
out1 = Flatten()(out1)
out1 = Dense(12, activation="relu")(out1)

out2 = Conv1D(12, 5, padding="SAME", activation="relu")(input2)
out2 = Flatten()(out2)
out2 = Dense(12, activation="relu")(out2)

out = CuDNNLSTM(1)([out1,out2])

The error is:

ValueError: Input 0 is incompatible with layer cu_dnnlstm_1: expected ndim=3, found ndim=2

I also tried:

out = CuDNNLSTM(1)([out1,out2])

My input shape is (None, 4, 1) and I need the output shape to be (None, 1). Apparently the input to CuDNNLSTM must have shape (None, 2, 12), but I am having trouble joining out1 and out2 into that shape.

1 answer:

Answer 0 (score: 1):

You can stack the tensors along the middle (timestep) dimension:

steps = Lambda(lambda x: K.stack(x, axis=1))([out1, out2])  # (None, 2, 12)
out = CuDNNLSTM(1)(steps)
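
Below is a minimal end-to-end sketch of how this stacking fits into the model from the question. It assumes state_size = 4 (so each input matches the (None, 4, 1) shape mentioned above) and uses the plain LSTM layer so it also runs without a GPU; swap in CuDNNLSTM if the CuDNN backend is available.

import numpy as np
from keras import backend as K
from keras.models import Model
from keras.layers import Input, Conv1D, Flatten, Dense, Lambda, LSTM

state_size = 4  # assumed value for illustration

input1 = Input(shape=(state_size, 1))
input2 = Input(shape=(state_size, 1))

out1 = Conv1D(12, 5, padding="same", activation="relu")(input1)
out1 = Flatten()(out1)
out1 = Dense(12, activation="relu")(out1)   # shape: (None, 12)

out2 = Conv1D(12, 5, padding="same", activation="relu")(input2)
out2 = Flatten()(out2)
out2 = Dense(12, activation="relu")(out2)   # shape: (None, 12)

# Stack the two (None, 12) tensors along a new time axis -> (None, 2, 12)
steps = Lambda(lambda x: K.stack(x, axis=1))([out1, out2])

out = LSTM(1)(steps)                        # shape: (None, 1)

model = Model([input1, input2], out)
model.summary()

# Quick shape check with dummy data
x1 = np.random.rand(3, state_size, 1)
x2 = np.random.rand(3, state_size, 1)
print(model.predict([x1, x2]).shape)        # (3, 1)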

But I am not sure that a sequence with only two steps will give you results that regular layers could not achieve.