Using a native TensorFlow RNN layer with dropout inside a Keras model

Date: 2018-11-11 17:35:11

Tags: python tensorflow keras rnn

I have implemented a model in Keras, but now I need the same model in TensorFlow. I would like to re-implement only the RNN layers of the model in TensorFlow and keep everything else the same, i.e. the predict method, fitting the model, and so on, all stay in Keras. Here is the code:

Keras model:

# Imports assumed to come from standalone Keras (the question is tagged keras):
from keras.layers import Input, LSTM, Dense, TimeDistributed
from keras.models import Model
from keras.optimizers import RMSprop

def emotion_model(max_seq_len, num_features, learning_rate, num_units_1, num_units_2, bidirectional, dropout, num_targets):
    # Input layer
    inputs = Input(shape=(max_seq_len, num_features))

    # 1st layer
    net = LSTM(num_units_1, return_sequences=True, dropout=dropout, recurrent_dropout=dropout)(inputs)

    # 2nd layer
    net = LSTM(num_units_2, return_sequences=True, dropout=dropout, recurrent_dropout=dropout)(net)

    # Output layer
    outputs = []
    out1 = TimeDistributed(Dense(1))(net)  # linear activation
    outputs.append(out1)
    if num_targets >= 2:
        out2 = TimeDistributed(Dense(1))(net)  # linear activation
        outputs.append(out2)
    if num_targets == 3:
        out3 = TimeDistributed(Dense(1))(net)  # linear activation
        outputs.append(out3)

    # Create and compile model
    rmsprop = RMSprop(lr=learning_rate)
    model   = Model(inputs=inputs, outputs=outputs)
    model.compile(optimizer=rmsprop, loss=ccc_loss)  # CCC-based loss function
    return model
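
For context, a minimal sketch of how this builder might be called; the shapes and hyperparameter values below are invented purely for illustration, and ccc_loss has to be defined in the same module:

import numpy as np

# Hypothetical hyperparameters, only to illustrate the call.
model = emotion_model(max_seq_len=100, num_features=40, learning_rate=0.001,
                      num_units_1=64, num_units_2=32, bidirectional=False,
                      dropout=0.5, num_targets=3)
model.summary()

# One dummy sequence of 100 frames with 40 features and three per-frame targets.
x = np.zeros((1, 100, 40), dtype=np.float32)
y = [np.zeros((1, 100, 1), dtype=np.float32) for _ in range(3)]
model.fit(x, y, epochs=1, verbose=0)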

Now I want to replace the LSTM layers above with the equivalent TensorFlow code. So, in another module, I implemented the following:

import tensorflow as tf

def baseline_model(inputs, cell_Size1, cell_Size2, dropout):
    with tf.variable_scope('model', reuse=tf.AUTO_REUSE):
        cell1 = tf.nn.rnn_cell.LSTMCell(cell_Size1)
        cell1 = tf.nn.rnn_cell.DropoutWrapper(cell1, input_keep_prob=1.0 - dropout, state_keep_prob=1.0 - dropout)

        cell2 = tf.nn.rnn_cell.LSTMCell(cell_Size2)
        cell2 = tf.nn.rnn_cell.DropoutWrapper(cell2, input_keep_prob=1.0 - dropout, state_keep_prob=1.0 - dropout)

        cell = tf.nn.rnn_cell.MultiRNNCell([cell1, cell2], state_is_tuple=True)

        # output: shape=[batch_size, time_steps, cell_Size2] (e.g. [1, time_steps, 32])
        output, new_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

        return output
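
Continuing in the same module (so tensorflow is already imported as tf), this graph function can be smoke-tested on its own roughly as follows; the placeholder shape and cell sizes are assumptions chosen just for illustration:

# Hypothetical input: batches of variable-length sequences with 40 features each.
tf_inputs = tf.placeholder(tf.float32, shape=[None, None, 40])
rnn_out = baseline_model(tf_inputs, cell_Size1=64, cell_Size2=32, dropout=0.5)
# rnn_out: [batch_size, time_steps, 32], the per-step outputs of the top cell.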

I tried net = Lambda(partial(baseline_model, dropout))(net), after removing cell_Size1 and cell_Size2 from the parameters of baseline_model, but it did not work.
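
One thing worth checking in that attempt: partial(baseline_model, dropout) binds dropout to the first positional parameter of baseline_model, which is inputs, so the Keras tensor that Lambda passes in ends up in the wrong slot. Below is a minimal sketch of how the wrapping might look instead, binding the hyperparameters by keyword; this is my assumption of the intended wiring inside emotion_model (where inputs, num_units_1, num_units_2, dropout and max_seq_len are in scope), not a verified fix:

from functools import partial
from keras.layers import Lambda

# Bind the cell sizes and dropout rate by keyword so the Keras tensor that
# Lambda passes in lands on the remaining `inputs` parameter.
rnn_fn = partial(baseline_model, cell_Size1=num_units_1,
                 cell_Size2=num_units_2, dropout=dropout)

# Replace both Keras LSTM layers with a single Lambda layer; output_shape
# omits the batch dimension.
net = Lambda(rnn_fn, output_shape=(max_seq_len, num_units_2))(inputs)

Even with the call wired up this way, note that weights created inside a Lambda layer (here the LSTM kernels built by tf.nn.dynamic_rnn) are, as far as I know, not registered as Keras trainable weights, so the Keras optimizer may silently skip them during training.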

Second, I tried to drop the TensorFlow LSTM implementation directly in place of the Keras LSTM layers above, but that did not solve my problem either.

Any help would be greatly appreciated!

0 Answers:

There are no answers yet.