Do we need to declare control_dependencies in tf.keras?

Time: 2020-01-02 09:32:58

Tags: tensorflow keras deep-learning

I have noticed that control_dependencies matters in TensorFlow and that we need to pay attention to it.
However, when I implement a model with tf.keras, everything seems to work even if I don't add control_dependencies.
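
(For context, here is a minimal sketch of the classic graph-mode case where control_dependencies actually matters; the variable and op names below are purely illustrative, and it assumes the TF 1.x-style API, e.g. via tensorflow.compat.v1 in TF 2.x.)

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Two ops with no data dependency on each other: an in-place update
# of a variable and a read of that same variable.
counter = tf.Variable(0, name="counter")
increment = tf.assign_add(counter, 1)

# Without this block, nothing forces `increment` to run when the read
# is evaluated; with it, the increment is guaranteed to run first.
with tf.control_dependencies([increment]):
    read_after_increment = tf.identity(counter)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(read_after_increment))  # prints 1, not 0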

For example, suppose I have a model with 4 LSTM layers, as follows:

import numpy as np
import tensorflow as tf

sequence_input = tf.keras.layers.Input(dtype='int32', shape=(3,))
embedding_output = tf.keras.layers.Embedding(input_dim=100, output_dim=5, input_length=3)(sequence_input)
lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(embedding_output)
lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(lstm_output)
lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(lstm_output)
lstm_output = tf.keras.layers.LSTM(10, return_sequences=False)(lstm_output)
output = tf.keras.layers.Activation('sigmoid')(lstm_output)
model = tf.keras.models.Model(inputs=[sequence_input], outputs=output)
print(model.summary())
sequence_input = np.random.randint(100, size=(5, 3))
print(model.predict([sequence_input]))

Here I have 4 LSTM layers on top of the embedding layer, and the output is an ordinary sigmoid activation, so I believe this implementation of the model is correct.

However, I thought I was supposed to implement it with control_dependencies, like this:

import numpy as np
import tensorflow as tf

sequence_input = tf.keras.layers.Input(dtype='int32', shape=(3,))
embedding_output = tf.keras.layers.Embedding(input_dim=100, output_dim=5, input_length=3)(sequence_input)
lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(embedding_output)
with tf.control_dependencies([lstm_output]):
    lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(lstm_output)
    with tf.control_dependencies([lstm_output]):
        lstm_output = tf.keras.layers.LSTM(10, return_sequences=True)(lstm_output)
        with tf.control_dependencies([lstm_output]):
            lstm_output = tf.keras.layers.LSTM(10, return_sequences=False)(lstm_output)
            with tf.control_dependencies([lstm_output]):
                output = tf.keras.layers.Activation('sigmoid')(lstm_output)

model = tf.keras.models.Model(inputs=[sequence_input], outputs=output)
print(model.summary())
sequence_input = np.random.randint(100, size=(5, 3))
print(model.predict([sequence_input]))

Nevertheless, the code runs fine in both cases, and I believe the implementation is correct in both cases as well.
So my question is: do we need to declare control_dependencies in tf.keras? And if not, why don't we need to use control_dependencies in tf.keras?

Thanks

1 answer:

Answer 0: (score: 0)

Specifically, tf.control_dependencies guarantees that whatever you list in its argument is evaluated before anything you define inside the with block. In your model, each layer already consumes the previous layer's output tensor, so that ordering is enforced by the data dependencies alone and the explicit control_dependencies blocks are redundant; control_dependencies only matters when ops have no data dependency on each other.
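
As a quick sanity check (my own sketch; model_plain and model_with_deps are hypothetical names for the two models built in the question), you can copy the weights from one model to the other and confirm that both graphs produce identical predictions, which shows the extra control_dependencies blocks change nothing:

import numpy as np

x = np.random.randint(100, size=(5, 3))
# Give both models identical weights, then compare their predictions.
model_with_deps.set_weights(model_plain.get_weights())
print(np.allclose(model_plain.predict(x), model_with_deps.predict(x)))  # True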

See This for more information.